[Mon Dec 9 06:17:01 2019][ 0.000000] Initializing cgroup subsys cpu [Mon Dec 9 06:17:01 2019][ 0.000000] Initializing cgroup subsys cpuacct [Mon Dec 9 06:17:01 2019][ 0.000000] Linux version 3.10.0-957.27.2.el7_lustre.pl2.x86_64 (sthiell@oak-rbh01) (gcc version 4.8.5 20150623 (Red Hat 4.8.5-39) (GCC) ) #1 SMP Thu Nov 7 15:26:16 PST 2019 [Mon Dec 9 06:17:01 2019][ 0.000000] Command line: BOOT_IMAGE=/boot/vmlinuz-3.10.0-957.27.2.el7_lustre.pl2.x86_64 root=UUID=c4f754c4-e7db-49b7-baed-d6c7905c5cdc ro crashkernel=auto nomodeset console=ttyS0,115200 LANG=en_US.UTF-8 [Mon Dec 9 06:17:01 2019][ 0.000000] e820: BIOS-provided physical RAM map: [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x0000000000000000-0x000000000008efff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000000008f000-0x000000000008ffff] ACPI NVS [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x0000000000090000-0x000000000009ffff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x0000000000100000-0x000000004f773fff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000004f774000-0x000000005777cfff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000005777d000-0x000000006cacefff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000006cacf000-0x000000006efcefff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000006efcf000-0x000000006fdfefff] ACPI NVS [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000006fdff000-0x000000006fffefff] ACPI data [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000006ffff000-0x000000006fffffff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x0000000070000000-0x000000008fffffff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x00000000fec10000-0x00000000fec10fff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x00000000fed80000-0x00000000fed80fff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x0000000100000000-0x000000107f37ffff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000107f380000-0x000000107fffffff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x0000001080000000-0x000000207ff7ffff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000207ff80000-0x000000207fffffff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x0000002080000000-0x000000307ff7ffff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000307ff80000-0x000000307fffffff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x0000003080000000-0x000000407ff7ffff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000407ff80000-0x000000407fffffff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] NX (Execute Disable) protection: active [Mon Dec 9 06:17:01 2019][ 0.000000] extended physical RAM map: [Mon Dec 9 06:17:01 2019][ 0.000000] reserve setup_data: [mem 0x0000000000000000-0x000000000008efff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] reserve setup_data: [mem 0x000000000008f000-0x000000000008ffff] ACPI NVS [Mon Dec 9 06:17:01 2019][ 0.000000] reserve setup_data: [mem 0x0000000000090000-0x000000000009ffff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] reserve setup_data: [mem 0x0000000000100000-0x0000000037ac001f] usable [Mon Dec 9 06:17:01 2019][ 0.000000] reserve setup_data: [mem 0x0000000037ac0020-0x0000000037ad865f] usable [Mon Dec 9 06:17:01 2019][ 0.000000] reserve setup_data: [mem 0x0000000037ad8660-0x0000000037ad901f] usable [Mon Dec 9 
06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037ad9020-0x0000000037b0265f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b02660-0x0000000037b0301f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b03020-0x0000000037b0b05f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b0b060-0x0000000037b0c01f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b0c020-0x0000000037b3dc5f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b3dc60-0x0000000037b3e01f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b3e020-0x0000000037b6fc5f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b6fc60-0x0000000037b7001f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b70020-0x0000000037c11c5f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037c11c60-0x000000004f773fff] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000004f774000-0x000000005777cfff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000005777d000-0x000000006cacefff] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000006cacf000-0x000000006efcefff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000006efcf000-0x000000006fdfefff] ACPI NVS [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000006fdff000-0x000000006fffefff] ACPI data [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000006ffff000-0x000000006fffffff] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000070000000-0x000000008fffffff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x00000000fec10000-0x00000000fec10fff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x00000000fed80000-0x00000000fed80fff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000100000000-0x000000107f37ffff] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000107f380000-0x000000107fffffff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000001080000000-0x000000207ff7ffff] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000207ff80000-0x000000207fffffff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000002080000000-0x000000307ff7ffff] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000307ff80000-0x000000307fffffff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000003080000000-0x000000407ff7ffff] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000407ff80000-0x000000407fffffff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] efi: EFI v2.50 by Dell Inc. 
[Mon Dec 9 06:17:02 2019][ 0.000000] efi: ACPI=0x6fffe000 ACPI 2.0=0x6fffe014 SMBIOS=0x6eab5000 SMBIOS 3.0=0x6eab3000 [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem00: type=3, attr=0xf, range=[0x0000000000000000-0x0000000000001000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem01: type=2, attr=0xf, range=[0x0000000000001000-0x0000000000002000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem02: type=7, attr=0xf, range=[0x0000000000002000-0x0000000000010000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem03: type=3, attr=0xf, range=[0x0000000000010000-0x0000000000014000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem04: type=7, attr=0xf, range=[0x0000000000014000-0x0000000000063000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem05: type=3, attr=0xf, range=[0x0000000000063000-0x000000000008f000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem06: type=10, attr=0xf, range=[0x000000000008f000-0x0000000000090000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem07: type=3, attr=0xf, range=[0x0000000000090000-0x00000000000a0000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem08: type=4, attr=0xf, range=[0x0000000000100000-0x0000000000120000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem09: type=7, attr=0xf, range=[0x0000000000120000-0x0000000000c00000) (10MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem10: type=3, attr=0xf, range=[0x0000000000c00000-0x0000000001000000) (4MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem11: type=2, attr=0xf, range=[0x0000000001000000-0x000000000267b000) (22MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem12: type=7, attr=0xf, range=[0x000000000267b000-0x0000000004000000) (25MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem13: type=4, attr=0xf, range=[0x0000000004000000-0x000000000403b000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem14: type=7, attr=0xf, range=[0x000000000403b000-0x0000000037ac0000) (826MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem15: type=2, attr=0xf, range=[0x0000000037ac0000-0x000000004edd7000) (371MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem16: type=7, attr=0xf, range=[0x000000004edd7000-0x000000004eddb000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem17: type=2, attr=0xf, range=[0x000000004eddb000-0x000000004eddd000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem18: type=1, attr=0xf, range=[0x000000004eddd000-0x000000004eefa000) (1MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem19: type=2, attr=0xf, range=[0x000000004eefa000-0x000000004f019000) (1MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem20: type=1, attr=0xf, range=[0x000000004f019000-0x000000004f128000) (1MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem21: type=3, attr=0xf, range=[0x000000004f128000-0x000000004f774000) (6MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem22: type=0, attr=0xf, range=[0x000000004f774000-0x000000005777d000) (128MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem23: type=3, attr=0xf, range=[0x000000005777d000-0x000000005796e000) (1MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem24: type=4, attr=0xf, range=[0x000000005796e000-0x000000005b4cf000) (59MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem25: type=3, attr=0xf, range=[0x000000005b4cf000-0x000000005b8cf000) (4MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem26: type=7, attr=0xf, range=[0x000000005b8cf000-0x0000000064a36000) (145MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem27: type=4, attr=0xf, range=[0x0000000064a36000-0x0000000064a43000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem28: type=7, attr=0xf, 
range=[0x0000000064a43000-0x0000000064a47000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem29: type=4, attr=0xf, range=[0x0000000064a47000-0x0000000065061000) (6MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem30: type=7, attr=0xf, range=[0x0000000065061000-0x0000000065062000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem31: type=4, attr=0xf, range=[0x0000000065062000-0x0000000065069000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem32: type=7, attr=0xf, range=[0x0000000065069000-0x000000006506a000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem33: type=4, attr=0xf, range=[0x000000006506a000-0x000000006506b000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem34: type=7, attr=0xf, range=[0x000000006506b000-0x000000006506c000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem35: type=4, attr=0xf, range=[0x000000006506c000-0x0000000065076000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem36: type=7, attr=0xf, range=[0x0000000065076000-0x0000000065077000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem37: type=4, attr=0xf, range=[0x0000000065077000-0x000000006507d000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem38: type=7, attr=0xf, range=[0x000000006507d000-0x000000006507e000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem39: type=4, attr=0xf, range=[0x000000006507e000-0x0000000065083000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem40: type=7, attr=0xf, range=[0x0000000065083000-0x0000000065086000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem41: type=4, attr=0xf, range=[0x0000000065086000-0x0000000065093000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem42: type=7, attr=0xf, range=[0x0000000065093000-0x0000000065094000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem43: type=4, attr=0xf, range=[0x0000000065094000-0x000000006509f000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem44: type=7, attr=0xf, range=[0x000000006509f000-0x00000000650a0000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem45: type=4, attr=0xf, range=[0x00000000650a0000-0x00000000650a1000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem46: type=7, attr=0xf, range=[0x00000000650a1000-0x00000000650a2000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem47: type=4, attr=0xf, range=[0x00000000650a2000-0x00000000650aa000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem48: type=7, attr=0xf, range=[0x00000000650aa000-0x00000000650ab000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem49: type=4, attr=0xf, range=[0x00000000650ab000-0x00000000650ad000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem50: type=7, attr=0xf, range=[0x00000000650ad000-0x00000000650ae000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem51: type=4, attr=0xf, range=[0x00000000650ae000-0x00000000650b4000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem52: type=7, attr=0xf, range=[0x00000000650b4000-0x00000000650b5000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem53: type=4, attr=0xf, range=[0x00000000650b5000-0x00000000650d5000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem54: type=7, attr=0xf, range=[0x00000000650d5000-0x00000000650d6000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem55: type=4, attr=0xf, range=[0x00000000650d6000-0x0000000065432000) (3MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem56: type=7, attr=0xf, range=[0x0000000065432000-0x0000000065433000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem57: type=4, attr=0xf, range=[0x0000000065433000-0x000000006543b000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem58: 
type=7, attr=0xf, range=[0x000000006543b000-0x000000006543c000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem59: type=4, attr=0xf, range=[0x000000006543c000-0x000000006544e000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem60: type=7, attr=0xf, range=[0x000000006544e000-0x000000006544f000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem61: type=4, attr=0xf, range=[0x000000006544f000-0x0000000065463000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem62: type=7, attr=0xf, range=[0x0000000065463000-0x0000000065464000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem63: type=4, attr=0xf, range=[0x0000000065464000-0x0000000065473000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem64: type=7, attr=0xf, range=[0x0000000065473000-0x0000000065474000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem65: type=4, attr=0xf, range=[0x0000000065474000-0x00000000654c5000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem66: type=7, attr=0xf, range=[0x00000000654c5000-0x00000000654c6000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem67: type=4, attr=0xf, range=[0x00000000654c6000-0x00000000654d9000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem68: type=7, attr=0xf, range=[0x00000000654d9000-0x00000000654db000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem69: type=4, attr=0xf, range=[0x00000000654db000-0x00000000654e0000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem70: type=7, attr=0xf, range=[0x00000000654e0000-0x00000000654e1000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem71: type=4, attr=0xf, range=[0x00000000654e1000-0x00000000654fa000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem72: type=7, attr=0xf, range=[0x00000000654fa000-0x00000000654fb000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem73: type=4, attr=0xf, range=[0x00000000654fb000-0x0000000065508000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem74: type=7, attr=0xf, range=[0x0000000065508000-0x0000000065509000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem75: type=4, attr=0xf, range=[0x0000000065509000-0x000000006550b000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem76: type=7, attr=0xf, range=[0x000000006550b000-0x000000006550c000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem77: type=4, attr=0xf, range=[0x000000006550c000-0x000000006550e000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem78: type=7, attr=0xf, range=[0x000000006550e000-0x000000006550f000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem79: type=4, attr=0xf, range=[0x000000006550f000-0x0000000065513000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem80: type=7, attr=0xf, range=[0x0000000065513000-0x0000000065514000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem81: type=4, attr=0xf, range=[0x0000000065514000-0x0000000065515000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem82: type=7, attr=0xf, range=[0x0000000065515000-0x0000000065516000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem83: type=4, attr=0xf, range=[0x0000000065516000-0x0000000065522000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem84: type=7, attr=0xf, range=[0x0000000065522000-0x0000000065523000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem85: type=4, attr=0xf, range=[0x0000000065523000-0x0000000065593000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem86: type=7, attr=0xf, range=[0x0000000065593000-0x0000000065594000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem87: type=4, attr=0xf, range=[0x0000000065594000-0x000000006559c000) (0MB) [Mon Dec 9 06:17:02 2019][ 
0.000000] efi: mem88: type=7, attr=0xf, range=[0x000000006559c000-0x000000006559d000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem89: type=4, attr=0xf, range=[0x000000006559d000-0x00000000655c4000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem90: type=7, attr=0xf, range=[0x00000000655c4000-0x00000000655c5000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem91: type=4, attr=0xf, range=[0x00000000655c5000-0x00000000655ea000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem92: type=7, attr=0xf, range=[0x00000000655ea000-0x00000000655eb000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem93: type=4, attr=0xf, range=[0x00000000655eb000-0x00000000655f1000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem94: type=7, attr=0xf, range=[0x00000000655f1000-0x00000000655f2000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem95: type=4, attr=0xf, range=[0x00000000655f2000-0x000000006b8cf000) (98MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem96: type=7, attr=0xf, range=[0x000000006b8cf000-0x000000006b8d0000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem97: type=3, attr=0xf, range=[0x000000006b8d0000-0x000000006cacf000) (17MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem98: type=6, attr=0x800000000000000f, range=[0x000000006cacf000-0x000000006cbcf000) (1MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem99: type=5, attr=0x800000000000000f, range=[0x000000006cbcf000-0x000000006cdcf000) (2MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem100: type=0, attr=0xf, range=[0x000000006cdcf000-0x000000006efcf000) (34MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem101: type=10, attr=0xf, range=[0x000000006efcf000-0x000000006fdff000) (14MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem102: type=9, attr=0xf, range=[0x000000006fdff000-0x000000006ffff000) (2MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem103: type=4, attr=0xf, range=[0x000000006ffff000-0x0000000070000000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem104: type=7, attr=0xf, range=[0x0000000100000000-0x000000107f380000) (63475MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem105: type=7, attr=0xf, range=[0x0000001080000000-0x000000207ff80000) (65535MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem106: type=7, attr=0xf, range=[0x0000002080000000-0x000000307ff80000) (65535MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem107: type=7, attr=0xf, range=[0x0000003080000000-0x000000407ff80000) (65535MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem108: type=0, attr=0x9, range=[0x0000000070000000-0x0000000080000000) (256MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem109: type=11, attr=0x800000000000000f, range=[0x0000000080000000-0x0000000090000000) (256MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem110: type=11, attr=0x800000000000000f, range=[0x00000000fec10000-0x00000000fec11000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem111: type=11, attr=0x800000000000000f, range=[0x00000000fed80000-0x00000000fed81000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem112: type=0, attr=0x0, range=[0x000000107f380000-0x0000001080000000) (12MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem113: type=0, attr=0x0, range=[0x000000207ff80000-0x0000002080000000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem114: type=0, attr=0x0, range=[0x000000307ff80000-0x0000003080000000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem115: type=0, attr=0x0, range=[0x000000407ff80000-0x0000004080000000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] SMBIOS 3.2.0 present. [Mon Dec 9 06:17:03 2019][ 0.000000] DMI: Dell Inc. 
PowerEdge R6415/07YXFK, BIOS 1.10.6 08/15/2019 [Mon Dec 9 06:17:03 2019][ 0.000000] e820: last_pfn = 0x407ff80 max_arch_pfn = 0x400000000 [Mon Dec 9 06:17:03 2019][ 0.000000] PAT configuration [0-7]: WB WC UC- UC WB WP UC- UC [Mon Dec 9 06:17:03 2019][ 0.000000] e820: last_pfn = 0x70000 max_arch_pfn = 0x400000000 [Mon Dec 9 06:17:03 2019][ 0.000000] Using GB pages for direct mapping [Mon Dec 9 06:17:03 2019][ 0.000000] RAMDISK: [mem 0x37c12000-0x38f51fff] [Mon Dec 9 06:17:03 2019][ 0.000000] Early table checksum verification disabled [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: RSDP 000000006fffe014 00024 (v02 DELL ) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: XSDT 000000006fffd0e8 000AC (v01 DELL PE_SC3 00000002 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: FACP 000000006fff0000 00114 (v06 DELL PE_SC3 00000002 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: DSDT 000000006ffdc000 1038C (v02 DELL PE_SC3 00000002 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: FACS 000000006fdd3000 00040 [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: SSDT 000000006fffc000 000D2 (v02 DELL PE_SC3 00000002 MSFT 04000000) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: BERT 000000006fffb000 00030 (v01 DELL BERT 00000001 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: HEST 000000006fffa000 006DC (v01 DELL HEST 00000001 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: SSDT 000000006fff9000 00294 (v01 DELL PE_SC3 00000001 AMD 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: SRAT 000000006fff8000 00420 (v03 DELL PE_SC3 00000001 AMD 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: MSCT 000000006fff7000 0004E (v01 DELL PE_SC3 00000000 AMD 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: SLIT 000000006fff6000 0003C (v01 DELL PE_SC3 00000001 AMD 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: CRAT 000000006fff3000 02DC0 (v01 DELL PE_SC3 00000001 AMD 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: EINJ 000000006fff2000 00150 (v01 DELL PE_SC3 00000001 AMD 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: SLIC 000000006fff1000 00024 (v01 DELL PE_SC3 00000002 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: HPET 000000006ffef000 00038 (v01 DELL PE_SC3 00000002 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: APIC 000000006ffee000 004B2 (v03 DELL PE_SC3 00000002 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: MCFG 000000006ffed000 0003C (v01 DELL PE_SC3 00000002 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: SSDT 000000006ffdb000 00629 (v02 DELL xhc_port 00000001 INTL 20170119) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: IVRS 000000006ffda000 00210 (v02 DELL PE_SC3 00000001 AMD 00000000) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: SSDT 000000006ffd8000 01658 (v01 AMD CPMCMN 00000001 INTL 20170119) [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x00 -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x01 -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x02 -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x03 -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x04 -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x05 -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x08 -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x09 -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x0a -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x0b -> Node 0 [Mon Dec 9 06:17:03 
2019][ 0.000000] SRAT: PXM 0 -> APIC 0x0c -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x0d -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x10 -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x11 -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x12 -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x13 -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x14 -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x15 -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x18 -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x19 -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x1a -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x1b -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x1c -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x1d -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x20 -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x21 -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x22 -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x23 -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x24 -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x25 -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x28 -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x29 -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x2a -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x2b -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x2c -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x2d -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x30 -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x31 -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x32 -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x33 -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x34 -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x35 -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x38 -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x39 -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x3a -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x3b -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x3c -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x3d -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: Node 0 PXM 0 [mem 0x100000000-0x107fffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: Node 1 PXM 1 [mem 0x1080000000-0x207fffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: Node 2 PXM 2 [mem 0x2080000000-0x307fffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: Node 3 PXM 3 [mem 0x3080000000-0x407fffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] NUMA: Node 0 [mem 0x00000000-0x7fffffff] + [mem 0x100000000-0x107fffffff] -> [mem 0x00000000-0x107fffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] 
NODE_DATA(0) allocated [mem 0x107f359000-0x107f37ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] NODE_DATA(1) allocated [mem 0x207ff59000-0x207ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] NODE_DATA(2) allocated [mem 0x307ff59000-0x307ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] NODE_DATA(3) allocated [mem 0x407ff58000-0x407ff7efff] [Mon Dec 9 06:17:03 2019][ 0.000000] Reserving 176MB of memory at 704MB for crashkernel (System RAM: 261692MB) [Mon Dec 9 06:17:03 2019][ 0.000000] Zone ranges: [Mon Dec 9 06:17:03 2019][ 0.000000] DMA [mem 0x00001000-0x00ffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] DMA32 [mem 0x01000000-0xffffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] Normal [mem 0x100000000-0x407ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] Movable zone start for each node [Mon Dec 9 06:17:03 2019][ 0.000000] Early memory node ranges [Mon Dec 9 06:17:03 2019][ 0.000000] node 0: [mem 0x00001000-0x0008efff] [Mon Dec 9 06:17:03 2019][ 0.000000] node 0: [mem 0x00090000-0x0009ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] node 0: [mem 0x00100000-0x4f773fff] [Mon Dec 9 06:17:03 2019][ 0.000000] node 0: [mem 0x5777d000-0x6cacefff] [Mon Dec 9 06:17:03 2019][ 0.000000] node 0: [mem 0x6ffff000-0x6fffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] node 0: [mem 0x100000000-0x107f37ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] node 1: [mem 0x1080000000-0x207ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] node 2: [mem 0x2080000000-0x307ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] node 3: [mem 0x3080000000-0x407ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] Initmem setup node 0 [mem 0x00001000-0x107f37ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] Initmem setup node 1 [mem 0x1080000000-0x207ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] Initmem setup node 2 [mem 0x2080000000-0x307ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] Initmem setup node 3 [mem 0x3080000000-0x407ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: PM-Timer IO Port: 0x408 [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x00] lapic_id[0x00] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x01] lapic_id[0x10] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x02] lapic_id[0x20] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x03] lapic_id[0x30] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x04] lapic_id[0x08] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x05] lapic_id[0x18] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x06] lapic_id[0x28] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x07] lapic_id[0x38] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x08] lapic_id[0x02] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x09] lapic_id[0x12] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0a] lapic_id[0x22] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0b] lapic_id[0x32] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0c] lapic_id[0x0a] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0d] lapic_id[0x1a] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0e] lapic_id[0x2a] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0f] lapic_id[0x3a] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x10] lapic_id[0x04] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x11] lapic_id[0x14] enabled) [Mon Dec 9 06:17:03 2019][ 
0.000000] ACPI: LAPIC (acpi_id[0x12] lapic_id[0x24] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x13] lapic_id[0x34] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x14] lapic_id[0x0c] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x15] lapic_id[0x1c] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x16] lapic_id[0x2c] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x17] lapic_id[0x3c] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x18] lapic_id[0x01] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x19] lapic_id[0x11] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1a] lapic_id[0x21] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1b] lapic_id[0x31] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1c] lapic_id[0x09] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1d] lapic_id[0x19] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1e] lapic_id[0x29] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1f] lapic_id[0x39] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x20] lapic_id[0x03] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x21] lapic_id[0x13] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x22] lapic_id[0x23] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x23] lapic_id[0x33] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x24] lapic_id[0x0b] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x25] lapic_id[0x1b] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x26] lapic_id[0x2b] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x27] lapic_id[0x3b] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x28] lapic_id[0x05] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x29] lapic_id[0x15] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2a] lapic_id[0x25] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2b] lapic_id[0x35] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2c] lapic_id[0x0d] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2d] lapic_id[0x1d] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2e] lapic_id[0x2d] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2f] lapic_id[0x3d] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x30] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x31] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x32] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x33] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x34] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x35] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x36] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x37] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x38] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x39] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC 
(acpi_id[0x3a] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3b] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3c] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3d] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3e] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3f] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x40] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x41] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x42] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x43] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x44] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x45] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x46] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x47] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x48] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x49] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4a] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4b] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4c] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4d] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4e] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4f] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x50] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x51] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x52] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x53] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x54] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x55] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x56] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x57] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x58] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x59] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5a] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5b] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5c] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5d] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5e] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5f] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x60] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x61] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: 
LAPIC (acpi_id[0x62] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x63] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x64] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x65] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x66] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x67] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x68] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x69] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6a] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6b] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6c] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6d] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6e] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6f] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x70] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x71] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x72] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x73] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x74] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x75] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x76] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x77] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x78] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x79] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7a] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7b] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7c] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7d] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7e] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7f] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC_NMI (acpi_id[0xff] high edge lint[0x1]) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: IOAPIC (id[0x80] address[0xfec00000] gsi_base[0]) [Mon Dec 9 06:17:04 2019][ 0.000000] IOAPIC[0]: apic_id 128, version 33, address 0xfec00000, GSI 0-23 [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: IOAPIC (id[0x81] address[0xfd880000] gsi_base[24]) [Mon Dec 9 06:17:04 2019][ 0.000000] IOAPIC[1]: apic_id 129, version 33, address 0xfd880000, GSI 24-55 [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: IOAPIC (id[0x82] address[0xe0900000] gsi_base[56]) [Mon Dec 9 06:17:04 2019][ 0.000000] IOAPIC[2]: apic_id 130, version 33, address 0xe0900000, GSI 56-87 [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: IOAPIC (id[0x83] address[0xc5900000] gsi_base[88]) [Mon Dec 9 06:17:04 2019][ 0.000000] IOAPIC[3]: apic_id 131, version 33, address 0xc5900000, GSI 88-119 [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: IOAPIC 
(id[0x84] address[0xaa900000] gsi_base[120]) [Mon Dec 9 06:17:04 2019][ 0.000000] IOAPIC[4]: apic_id 132, version 33, address 0xaa900000, GSI 120-151 [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 low level) [Mon Dec 9 06:17:04 2019][ 0.000000] Using ACPI (MADT) for SMP configuration information [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: HPET id: 0x10228201 base: 0xfed00000 [Mon Dec 9 06:17:04 2019][ 0.000000] smpboot: Allowing 128 CPUs, 80 hotplug CPUs [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x0008f000-0x0008ffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x000a0000-0x000fffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37ac0000-0x37ac0fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37ad8000-0x37ad8fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37ad9000-0x37ad9fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b02000-0x37b02fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b03000-0x37b03fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b0b000-0x37b0bfff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b0c000-0x37b0cfff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b3d000-0x37b3dfff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b3e000-0x37b3efff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b6f000-0x37b6ffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b70000-0x37b70fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37c11000-0x37c11fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x4f774000-0x5777cfff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x6cacf000-0x6efcefff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x6efcf000-0x6fdfefff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x6fdff000-0x6fffefff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x70000000-0x8fffffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x90000000-0xfec0ffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0xfec10000-0xfec10fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0xfec11000-0xfed7ffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0xfed80000-0xfed80fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0xfed81000-0xffffffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x107f380000-0x107fffffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x207ff80000-0x207fffffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x307ff80000-0x307fffffff] [Mon Dec 9 06:17:04 2019][ 0.000000] e820: [mem 0x90000000-0xfec0ffff] available for PCI devices [Mon Dec 9 06:17:04 2019][ 0.000000] Booting paravirtualized kernel on bare hardware [Mon Dec 9 06:17:04 2019][ 0.000000] setup_percpu: NR_CPUS:5120 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:4 [Mon Dec 9 06:17:04 2019][ 0.000000] PERCPU: Embedded 38 pages/cpu @ffff88f2fee00000 s118784 
r8192 d28672 u262144 [Mon Dec 9 06:17:04 2019][ 0.000000] Built 4 zonelists in Zone order, mobility grouping on. Total pages: 65945355 [Mon Dec 9 06:17:04 2019][ 0.000000] Policy zone: Normal [Mon Dec 9 06:17:04 2019][ 0.000000] Kernel command line: BOOT_IMAGE=/boot/vmlinuz-3.10.0-957.27.2.el7_lustre.pl2.x86_64 root=UUID=c4f754c4-e7db-49b7-baed-d6c7905c5cdc ro crashkernel=auto nomodeset console=ttyS0,115200 LANG=en_US.UTF-8 [Mon Dec 9 06:17:04 2019][ 0.000000] PID hash table entries: 4096 (order: 3, 32768 bytes) [Mon Dec 9 06:17:04 2019][ 0.000000] x86/fpu: xstate_offset[2]: 0240, xstate_sizes[2]: 0100 [Mon Dec 9 06:17:04 2019][ 0.000000] xsave: enabled xstate_bv 0x7, cntxt size 0x340 using standard form [Mon Dec 9 06:17:04 2019][ 0.000000] Memory: 9561188k/270532096k available (7676k kernel code, 2559084k absent, 4706776k reserved, 6045k data, 1876k init) [Mon Dec 9 06:17:04 2019][ 0.000000] SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=4 [Mon Dec 9 06:17:04 2019][ 0.000000] Hierarchical RCU implementation. [Mon Dec 9 06:17:04 2019][ 0.000000] RCU restricting CPUs from NR_CPUS=5120 to nr_cpu_ids=128. [Mon Dec 9 06:17:04 2019][ 0.000000] NR_IRQS:327936 nr_irqs:3624 0 [Mon Dec 9 06:17:05 2019][ 0.000000] Console: colour dummy device 80x25 [Mon Dec 9 06:17:05 2019][ 0.000000] console [ttyS0] enabled [Mon Dec 9 06:17:05 2019][ 0.000000] allocated 1072693248 bytes of page_cgroup [Mon Dec 9 06:17:05 2019][ 0.000000] please try 'cgroup_disable=memory' option if you don't want memory cgroups [Mon Dec 9 06:17:05 2019][ 0.000000] Enabling automatic NUMA balancing. Configure with numa_balancing= or the kernel.numa_balancing sysctl [Mon Dec 9 06:17:05 2019][ 0.000000] tsc: Fast TSC calibration using PIT [Mon Dec 9 06:17:05 2019][ 0.000000] tsc: Detected 1996.203 MHz processor [Mon Dec 9 06:17:05 2019][ 0.000054] Calibrating delay loop (skipped), value calculated using timer frequency.. 3992.40 BogoMIPS (lpj=1996203) [Mon Dec 9 06:17:05 2019][ 0.010696] pid_max: default: 131072 minimum: 1024 [Mon Dec 9 06:17:05 2019][ 0.016302] Security Framework initialized [Mon Dec 9 06:17:05 2019][ 0.020434] SELinux: Initializing. [Mon Dec 9 06:17:05 2019][ 0.023996] Yama: becoming mindful. 
[Mon Dec 9 06:17:05 2019][ 0.044183] Dentry cache hash table entries: 33554432 (order: 16, 268435456 bytes) [Mon Dec 9 06:17:05 2019][ 0.100110] Inode-cache hash table entries: 16777216 (order: 15, 134217728 bytes) [Mon Dec 9 06:17:05 2019][ 0.127915] Mount-cache hash table entries: 524288 (order: 10, 4194304 bytes) [Mon Dec 9 06:17:05 2019][ 0.135314] Mountpoint-cache hash table entries: 524288 (order: 10, 4194304 bytes) [Mon Dec 9 06:17:05 2019][ 0.144434] Initializing cgroup subsys memory [Mon Dec 9 06:17:05 2019][ 0.148828] Initializing cgroup subsys devices [Mon Dec 9 06:17:05 2019][ 0.153284] Initializing cgroup subsys freezer [Mon Dec 9 06:17:05 2019][ 0.157739] Initializing cgroup subsys net_cls [Mon Dec 9 06:17:05 2019][ 0.162192] Initializing cgroup subsys blkio [Mon Dec 9 06:17:05 2019][ 0.166473] Initializing cgroup subsys perf_event [Mon Dec 9 06:17:05 2019][ 0.171199] Initializing cgroup subsys hugetlb [Mon Dec 9 06:17:05 2019][ 0.175651] Initializing cgroup subsys pids [Mon Dec 9 06:17:05 2019][ 0.179848] Initializing cgroup subsys net_prio [Mon Dec 9 06:17:05 2019][ 0.190074] LVT offset 2 assigned for vector 0xf4 [Mon Dec 9 06:17:05 2019][ 0.194799] Last level iTLB entries: 4KB 1024, 2MB 1024, 4MB 512 [Mon Dec 9 06:17:05 2019][ 0.200811] Last level dTLB entries: 4KB 1536, 2MB 1536, 4MB 768 [Mon Dec 9 06:17:05 2019][ 0.206826] tlb_flushall_shift: 6 [Mon Dec 9 06:17:05 2019][ 0.210173] Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl and seccomp [Mon Dec 9 06:17:05 2019][ 0.219739] FEATURE SPEC_CTRL Not Present [Mon Dec 9 06:17:05 2019][ 0.223761] FEATURE IBPB_SUPPORT Present [Mon Dec 9 06:17:05 2019][ 0.227698] Spectre V2 : Enabling Indirect Branch Prediction Barrier [Mon Dec 9 06:17:05 2019][ 0.234130] Spectre V2 : Mitigation: Full retpoline [Mon Dec 9 06:17:05 2019][ 0.239995] Freeing SMP alternatives: 28k freed [Mon Dec 9 06:17:05 2019][ 0.246175] ACPI: Core revision 20130517 [Mon Dec 9 06:17:05 2019][ 0.254932] ACPI: All ACPI Tables successfully acquired [Mon Dec 9 06:17:05 2019][ 0.266870] ftrace: allocating 29216 entries in 115 pages [Mon Dec 9 06:17:05 2019][ 0.606033] Switched APIC routing to physical flat. [Mon Dec 9 06:17:05 2019][ 0.612952] ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 [Mon Dec 9 06:17:05 2019][ 0.628964] smpboot: CPU0: AMD EPYC 7401P 24-Core Processor (fam: 17, model: 01, stepping: 02) [Mon Dec 9 06:17:05 2019][ 0.713416] random: fast init done [Mon Dec 9 06:17:05 2019][ 0.741417] APIC calibration not consistent with PM-Timer: 101ms instead of 100ms [Mon Dec 9 06:17:05 2019][ 0.748893] APIC delta adjusted to PM-Timer: 623826 (636296) [Mon Dec 9 06:17:05 2019][ 0.754582] Performance Events: Fam17h core perfctr, AMD PMU driver. [Mon Dec 9 06:17:05 2019][ 0.761013] ... version: 0 [Mon Dec 9 06:17:05 2019][ 0.765022] ... bit width: 48 [Mon Dec 9 06:17:05 2019][ 0.769123] ... generic registers: 6 [Mon Dec 9 06:17:05 2019][ 0.773136] ... value mask: 0000ffffffffffff [Mon Dec 9 06:17:05 2019][ 0.778448] ... max period: 00007fffffffffff [Mon Dec 9 06:17:05 2019][ 0.783761] ... fixed-purpose events: 0 [Mon Dec 9 06:17:05 2019][ 0.787774] ... event mask: 000000000000003f [Mon Dec 9 06:17:06 2019][ 0.795988] NMI watchdog: enabled on all CPUs, permanently consumes one hw-PMU counter. 
[Mon Dec 9 06:17:06 2019][ 0.804070] smpboot: Booting Node 1, Processors #1 OK [Mon Dec 9 06:17:06 2019][ 0.817266] smpboot: Booting Node 2, Processors #2 OK [Mon Dec 9 06:17:06 2019][ 0.830467] smpboot: Booting Node 3, Processors #3 OK [Mon Dec 9 06:17:06 2019][ 0.843661] smpboot: Booting Node 0, Processors #4 OK [Mon Dec 9 06:17:06 2019][ 0.856842] smpboot: Booting Node 1, Processors #5 OK [Mon Dec 9 06:17:06 2019][ 0.870024] smpboot: Booting Node 2, Processors #6 OK [Mon Dec 9 06:17:06 2019][ 0.883205] smpboot: Booting Node 3, Processors #7 OK [Mon Dec 9 06:17:06 2019][ 0.896387] smpboot: Booting Node 0, Processors #8 OK [Mon Dec 9 06:17:06 2019][ 0.909790] smpboot: Booting Node 1, Processors #9 OK [Mon Dec 9 06:17:06 2019][ 0.922979] smpboot: Booting Node 2, Processors #10 OK [Mon Dec 9 06:17:06 2019][ 0.936263] smpboot: Booting Node 3, Processors #11 OK [Mon Dec 9 06:17:06 2019][ 0.949533] smpboot: Booting Node 0, Processors #12 OK [Mon Dec 9 06:17:06 2019][ 0.962805] smpboot: Booting Node 1, Processors #13 OK [Mon Dec 9 06:17:06 2019][ 0.976086] smpboot: Booting Node 2, Processors #14 OK [Mon Dec 9 06:17:06 2019][ 0.989356] smpboot: Booting Node 3, Processors #15 OK [Mon Dec 9 06:17:06 2019][ 1.002629] smpboot: Booting Node 0, Processors #16 OK [Mon Dec 9 06:17:06 2019][ 1.016009] smpboot: Booting Node 1, Processors #17 OK [Mon Dec 9 06:17:06 2019][ 1.029283] smpboot: Booting Node 2, Processors #18 OK [Mon Dec 9 06:17:06 2019][ 1.042562] smpboot: Booting Node 3, Processors #19 OK [Mon Dec 9 06:17:06 2019][ 1.055835] smpboot: Booting Node 0, Processors #20 OK [Mon Dec 9 06:17:06 2019][ 1.069101] smpboot: Booting Node 1, Processors #21 OK [Mon Dec 9 06:17:06 2019][ 1.082370] smpboot: Booting Node 2, Processors #22 OK [Mon Dec 9 06:17:06 2019][ 1.095651] smpboot: Booting Node 3, Processors #23 OK [Mon Dec 9 06:17:06 2019][ 1.108919] smpboot: Booting Node 0, Processors #24 OK [Mon Dec 9 06:17:06 2019][ 1.122639] smpboot: Booting Node 1, Processors #25 OK [Mon Dec 9 06:17:06 2019][ 1.135889] smpboot: Booting Node 2, Processors #26 OK [Mon Dec 9 06:17:06 2019][ 1.149138] smpboot: Booting Node 3, Processors #27 OK [Mon Dec 9 06:17:06 2019][ 1.162374] smpboot: Booting Node 0, Processors #28 OK [Mon Dec 9 06:17:06 2019][ 1.175603] smpboot: Booting Node 1, Processors #29 OK [Mon Dec 9 06:17:06 2019][ 1.188826] smpboot: Booting Node 2, Processors #30 OK [Mon Dec 9 06:17:06 2019][ 1.202060] smpboot: Booting Node 3, Processors #31 OK [Mon Dec 9 06:17:06 2019][ 1.215284] smpboot: Booting Node 0, Processors #32 OK [Mon Dec 9 06:17:06 2019][ 1.228617] smpboot: Booting Node 1, Processors #33 OK [Mon Dec 9 06:17:06 2019][ 1.241860] smpboot: Booting Node 2, Processors #34 OK [Mon Dec 9 06:17:06 2019][ 1.255109] smpboot: Booting Node 3, Processors #35 OK [Mon Dec 9 06:17:06 2019][ 1.268334] smpboot: Booting Node 0, Processors #36 OK [Mon Dec 9 06:17:06 2019][ 1.281561] smpboot: Booting Node 1, Processors #37 OK [Mon Dec 9 06:17:06 2019][ 1.294892] smpboot: Booting Node 2, Processors #38 OK [Mon Dec 9 06:17:06 2019][ 1.308133] smpboot: Booting Node 3, Processors #39 OK [Mon Dec 9 06:17:06 2019][ 1.321359] smpboot: Booting Node 0, Processors #40 OK [Mon Dec 9 06:17:06 2019][ 1.334690] smpboot: Booting Node 1, Processors #41 OK [Mon Dec 9 06:17:06 2019][ 1.348038] smpboot: Booting Node 2, Processors #42 OK [Mon Dec 9 06:17:06 2019][ 1.361269] smpboot: Booting Node 3, Processors #43 OK [Mon Dec 9 06:17:06 2019][ 1.374495] smpboot: Booting Node 0, Processors #44 OK [Mon Dec 9 06:17:06 2019][ 1.387731] 
smpboot: Booting Node 1, Processors #45 OK [Mon Dec 9 06:17:06 2019][ 1.400965] smpboot: Booting Node 2, Processors #46 OK [Mon Dec 9 06:17:06 2019][ 1.414198] smpboot: Booting Node 3, Processors #47 [Mon Dec 9 06:17:06 2019][ 1.426902] Brought up 48 CPUs [Mon Dec 9 06:17:06 2019][ 1.430165] smpboot: Max logical packages: 3 [Mon Dec 9 06:17:06 2019][ 1.434442] smpboot: Total of 48 processors activated (191635.48 BogoMIPS) [Mon Dec 9 06:17:06 2019][ 1.725902] node 0 initialised, 15458277 pages in 278ms [Mon Dec 9 06:17:06 2019][ 1.731707] node 2 initialised, 15989367 pages in 280ms [Mon Dec 9 06:17:06 2019][ 1.731964] node 1 initialised, 15989367 pages in 280ms [Mon Dec 9 06:17:06 2019][ 1.731979] node 3 initialised, 15989247 pages in 279ms [Mon Dec 9 06:17:06 2019][ 1.747929] devtmpfs: initialized [Mon Dec 9 06:17:06 2019][ 1.773779] EVM: security.selinux [Mon Dec 9 06:17:06 2019][ 1.777101] EVM: security.ima [Mon Dec 9 06:17:06 2019][ 1.780073] EVM: security.capability [Mon Dec 9 06:17:07 2019][ 1.783749] PM: Registering ACPI NVS region [mem 0x0008f000-0x0008ffff] (4096 bytes) [Mon Dec 9 06:17:07 2019][ 1.791491] PM: Registering ACPI NVS region [mem 0x6efcf000-0x6fdfefff] (14876672 bytes) [Mon Dec 9 06:17:07 2019][ 1.801105] atomic64 test passed for x86-64 platform with CX8 and with SSE [Mon Dec 9 06:17:07 2019][ 1.807981] pinctrl core: initialized pinctrl subsystem [Mon Dec 9 06:17:07 2019][ 1.813308] RTC time: 14:17:06, date: 12/09/19 [Mon Dec 9 06:17:07 2019][ 1.817906] NET: Registered protocol family 16 [Mon Dec 9 06:17:07 2019][ 1.822697] ACPI FADT declares the system doesn't support PCIe ASPM, so disable it [Mon Dec 9 06:17:07 2019][ 1.830263] ACPI: bus type PCI registered [Mon Dec 9 06:17:07 2019][ 1.834276] acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 [Mon Dec 9 06:17:07 2019][ 1.840854] PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0x80000000-0x8fffffff] (base 0x80000000) [Mon Dec 9 06:17:07 2019][ 1.850155] PCI: MMCONFIG at [mem 0x80000000-0x8fffffff] reserved in E820 [Mon Dec 9 06:17:07 2019][ 1.856947] PCI: Using configuration type 1 for base access [Mon Dec 9 06:17:07 2019][ 1.862529] PCI: Dell System detected, enabling pci=bfsort. [Mon Dec 9 06:17:07 2019][ 1.877653] ACPI: Added _OSI(Module Device) [Mon Dec 9 06:17:07 2019][ 1.881842] ACPI: Added _OSI(Processor Device) [Mon Dec 9 06:17:07 2019][ 1.886284] ACPI: Added _OSI(3.0 _SCP Extensions) [Mon Dec 9 06:17:07 2019][ 1.890992] ACPI: Added _OSI(Processor Aggregator Device) [Mon Dec 9 06:17:07 2019][ 1.896391] ACPI: Added _OSI(Linux-Dell-Video) [Mon Dec 9 06:17:07 2019][ 1.902621] ACPI: Executed 2 blocks of module-level executable AML code [Mon Dec 9 06:17:07 2019][ 1.914664] ACPI: Interpreter enabled [Mon Dec 9 06:17:07 2019][ 1.918342] ACPI: (supports S0 S5) [Mon Dec 9 06:17:07 2019][ 1.921750] ACPI: Using IOAPIC for interrupt routing [Mon Dec 9 06:17:07 2019][ 1.926928] HEST: Table parsing has been initialized. 
[Mon Dec 9 06:17:07 2019][ 1.931988] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug [Mon Dec 9 06:17:07 2019][ 1.941136] ACPI: Enabled 1 GPEs in block 00 to 1F [Mon Dec 9 06:17:07 2019][ 1.952803] ACPI: PCI Interrupt Link [LNKA] (IRQs 4 5 7 10 11 14 15) *0 [Mon Dec 9 06:17:07 2019][ 1.959713] ACPI: PCI Interrupt Link [LNKB] (IRQs 4 5 7 10 11 14 15) *0 [Mon Dec 9 06:17:07 2019][ 1.966620] ACPI: PCI Interrupt Link [LNKC] (IRQs 4 5 7 10 11 14 15) *0 [Mon Dec 9 06:17:07 2019][ 1.973524] ACPI: PCI Interrupt Link [LNKD] (IRQs 4 5 7 10 11 14 15) *0 [Mon Dec 9 06:17:07 2019][ 1.980433] ACPI: PCI Interrupt Link [LNKE] (IRQs 4 5 7 10 11 14 15) *0 [Mon Dec 9 06:17:07 2019][ 1.987340] ACPI: PCI Interrupt Link [LNKF] (IRQs 4 5 7 10 11 14 15) *0 [Mon Dec 9 06:17:07 2019][ 1.994247] ACPI: PCI Interrupt Link [LNKG] (IRQs 4 5 7 10 11 14 15) *0 [Mon Dec 9 06:17:07 2019][ 2.001153] ACPI: PCI Interrupt Link [LNKH] (IRQs 4 5 7 10 11 14 15) *0 [Mon Dec 9 06:17:07 2019][ 2.008204] ACPI: PCI Root Bridge [PC00] (domain 0000 [bus 00-3f]) [Mon Dec 9 06:17:07 2019][ 2.014386] acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI] [Mon Dec 9 06:17:07 2019][ 2.022603] acpi PNP0A08:00: PCIe AER handled by firmware [Mon Dec 9 06:17:07 2019][ 2.028047] acpi PNP0A08:00: _OSC: platform does not support [SHPCHotplug] [Mon Dec 9 06:17:07 2019][ 2.034993] acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] [Mon Dec 9 06:17:07 2019][ 2.042646] acpi PNP0A08:00: FADT indicates ASPM is unsupported, using BIOS configuration [Mon Dec 9 06:17:07 2019][ 2.051101] PCI host bridge to bus 0000:00 [Mon Dec 9 06:17:07 2019][ 2.055205] pci_bus 0000:00: root bus resource [io 0x0000-0x03af window] [Mon Dec 9 06:17:07 2019][ 2.061990] pci_bus 0000:00: root bus resource [io 0x03e0-0x0cf7 window] [Mon Dec 9 06:17:07 2019][ 2.068777] pci_bus 0000:00: root bus resource [mem 0x000c0000-0x000c3fff window] [Mon Dec 9 06:17:07 2019][ 2.076255] pci_bus 0000:00: root bus resource [mem 0x000c4000-0x000c7fff window] [Mon Dec 9 06:17:07 2019][ 2.083734] pci_bus 0000:00: root bus resource [mem 0x000c8000-0x000cbfff window] [Mon Dec 9 06:17:07 2019][ 2.091213] pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000cffff window] [Mon Dec 9 06:17:07 2019][ 2.098694] pci_bus 0000:00: root bus resource [mem 0x000d0000-0x000d3fff window] [Mon Dec 9 06:17:07 2019][ 2.106171] pci_bus 0000:00: root bus resource [mem 0x000d4000-0x000d7fff window] [Mon Dec 9 06:17:07 2019][ 2.113653] pci_bus 0000:00: root bus resource [mem 0x000d8000-0x000dbfff window] [Mon Dec 9 06:17:07 2019][ 2.121131] pci_bus 0000:00: root bus resource [mem 0x000dc000-0x000dffff window] [Mon Dec 9 06:17:07 2019][ 2.128611] pci_bus 0000:00: root bus resource [mem 0x000e0000-0x000e3fff window] [Mon Dec 9 06:17:07 2019][ 2.136091] pci_bus 0000:00: root bus resource [mem 0x000e4000-0x000e7fff window] [Mon Dec 9 06:17:07 2019][ 2.143571] pci_bus 0000:00: root bus resource [mem 0x000e8000-0x000ebfff window] [Mon Dec 9 06:17:07 2019][ 2.151051] pci_bus 0000:00: root bus resource [mem 0x000ec000-0x000effff window] [Mon Dec 9 06:17:07 2019][ 2.158529] pci_bus 0000:00: root bus resource [mem 0x000f0000-0x000fffff window] [Mon Dec 9 06:17:07 2019][ 2.166009] pci_bus 0000:00: root bus resource [io 0x0d00-0x3fff window] [Mon Dec 9 06:17:07 2019][ 2.172796] pci_bus 0000:00: root bus resource [mem 0xe1000000-0xfebfffff window] [Mon Dec 9 06:17:07 2019][ 2.180274] pci_bus 0000:00: root bus resource [mem 
0x10000000000-0x2bf3fffffff window] [Mon Dec 9 06:17:07 2019][ 2.188276] pci_bus 0000:00: root bus resource [bus 00-3f] [Mon Dec 9 06:17:07 2019][ 2.200937] pci 0000:00:03.1: PCI bridge to [bus 01] [Mon Dec 9 06:17:07 2019][ 2.206321] pci 0000:00:07.1: PCI bridge to [bus 02] [Mon Dec 9 06:17:07 2019][ 2.212092] pci 0000:00:08.1: PCI bridge to [bus 03] [Mon Dec 9 06:17:07 2019][ 2.217459] ACPI: PCI Root Bridge [PC01] (domain 0000 [bus 40-7f]) [Mon Dec 9 06:17:07 2019][ 2.223637] acpi PNP0A08:01: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI] [Mon Dec 9 06:17:07 2019][ 2.231847] acpi PNP0A08:01: PCIe AER handled by firmware [Mon Dec 9 06:17:07 2019][ 2.237292] acpi PNP0A08:01: _OSC: platform does not support [SHPCHotplug] [Mon Dec 9 06:17:07 2019][ 2.244238] acpi PNP0A08:01: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] [Mon Dec 9 06:17:07 2019][ 2.251889] acpi PNP0A08:01: FADT indicates ASPM is unsupported, using BIOS configuration [Mon Dec 9 06:17:07 2019][ 2.260311] PCI host bridge to bus 0000:40 [Mon Dec 9 06:17:07 2019][ 2.264413] pci_bus 0000:40: root bus resource [io 0x4000-0x7fff window] [Mon Dec 9 06:17:07 2019][ 2.271198] pci_bus 0000:40: root bus resource [mem 0xc6000000-0xe0ffffff window] [Mon Dec 9 06:17:07 2019][ 2.278679] pci_bus 0000:40: root bus resource [mem 0x2bf40000000-0x47e7fffffff window] [Mon Dec 9 06:17:07 2019][ 2.286679] pci_bus 0000:40: root bus resource [bus 40-7f] [Mon Dec 9 06:17:07 2019][ 2.294569] pci 0000:40:07.1: PCI bridge to [bus 41] [Mon Dec 9 06:17:07 2019][ 2.299887] pci 0000:40:08.1: PCI bridge to [bus 42] [Mon Dec 9 06:17:07 2019][ 2.305046] ACPI: PCI Root Bridge [PC02] (domain 0000 [bus 80-bf]) [Mon Dec 9 06:17:07 2019][ 2.311227] acpi PNP0A08:02: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI] [Mon Dec 9 06:17:07 2019][ 2.319434] acpi PNP0A08:02: PCIe AER handled by firmware [Mon Dec 9 06:17:07 2019][ 2.324878] acpi PNP0A08:02: _OSC: platform does not support [SHPCHotplug] [Mon Dec 9 06:17:07 2019][ 2.331826] acpi PNP0A08:02: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] [Mon Dec 9 06:17:07 2019][ 2.339475] acpi PNP0A08:02: FADT indicates ASPM is unsupported, using BIOS configuration [Mon Dec 9 06:17:07 2019][ 2.347918] PCI host bridge to bus 0000:80 [Mon Dec 9 06:17:07 2019][ 2.352018] pci_bus 0000:80: root bus resource [io 0x03b0-0x03df window] [Mon Dec 9 06:17:07 2019][ 2.358804] pci_bus 0000:80: root bus resource [mem 0x000a0000-0x000bffff window] [Mon Dec 9 06:17:07 2019][ 2.366284] pci_bus 0000:80: root bus resource [io 0x8000-0xbfff window] [Mon Dec 9 06:17:07 2019][ 2.373070] pci_bus 0000:80: root bus resource [mem 0xab000000-0xc5ffffff window] [Mon Dec 9 06:17:07 2019][ 2.380549] pci_bus 0000:80: root bus resource [mem 0x47e80000000-0x63dbfffffff window] [Mon Dec 9 06:17:07 2019][ 2.388549] pci_bus 0000:80: root bus resource [bus 80-bf] [Mon Dec 9 06:17:07 2019][ 2.397569] pci 0000:80:01.1: PCI bridge to [bus 81] [Mon Dec 9 06:17:07 2019][ 2.405307] pci 0000:80:01.2: PCI bridge to [bus 82-83] [Mon Dec 9 06:17:07 2019][ 2.410776] pci 0000:82:00.0: PCI bridge to [bus 83] [Mon Dec 9 06:17:07 2019][ 2.418306] pci 0000:80:03.1: PCI bridge to [bus 84] [Mon Dec 9 06:17:07 2019][ 2.423595] pci 0000:80:07.1: PCI bridge to [bus 85] [Mon Dec 9 06:17:07 2019][ 2.429315] pci 0000:80:08.1: PCI bridge to [bus 86] [Mon Dec 9 06:17:07 2019][ 2.434488] ACPI: PCI Root Bridge [PC03] (domain 0000 [bus c0-ff]) [Mon Dec 9 06:17:07 2019][ 2.440674] acpi PNP0A08:03: _OSC: OS supports [ExtendedConfig ASPM ClockPM 
Segments MSI] [Mon Dec 9 06:17:07 2019][ 2.448881] acpi PNP0A08:03: PCIe AER handled by firmware [Mon Dec 9 06:17:07 2019][ 2.454318] acpi PNP0A08:03: _OSC: platform does not support [SHPCHotplug] [Mon Dec 9 06:17:07 2019][ 2.461265] acpi PNP0A08:03: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] [Mon Dec 9 06:17:07 2019][ 2.468918] acpi PNP0A08:03: FADT indicates ASPM is unsupported, using BIOS configuration [Mon Dec 9 06:17:07 2019][ 2.477248] acpi PNP0A08:03: host bridge window [mem 0x63dc0000000-0xffffffffffff window] ([0x80000000000-0xffffffffffff] ignored, not CPU addressable) [Mon Dec 9 06:17:07 2019][ 2.490882] PCI host bridge to bus 0000:c0 [Mon Dec 9 06:17:07 2019][ 2.494988] pci_bus 0000:c0: root bus resource [io 0xc000-0xffff window] [Mon Dec 9 06:17:07 2019][ 2.501772] pci_bus 0000:c0: root bus resource [mem 0x90000000-0xaaffffff window] [Mon Dec 9 06:17:07 2019][ 2.509254] pci_bus 0000:c0: root bus resource [mem 0x63dc0000000-0x7ffffffffff window] [Mon Dec 9 06:17:07 2019][ 2.517253] pci_bus 0000:c0: root bus resource [bus c0-ff] [Mon Dec 9 06:17:07 2019][ 2.524746] pci 0000:c0:01.1: PCI bridge to [bus c1] [Mon Dec 9 06:17:07 2019][ 2.530403] pci 0000:c0:07.1: PCI bridge to [bus c2] [Mon Dec 9 06:17:07 2019][ 2.535711] pci 0000:c0:08.1: PCI bridge to [bus c3] [Mon Dec 9 06:17:07 2019][ 2.542817] vgaarb: device added: PCI:0000:83:00.0,decodes=io+mem,owns=io+mem,locks=none [Mon Dec 9 06:17:07 2019][ 2.550909] vgaarb: loaded [Mon Dec 9 06:17:07 2019][ 2.553616] vgaarb: bridge control possible 0000:83:00.0 [Mon Dec 9 06:17:07 2019][ 2.559039] SCSI subsystem initialized [Mon Dec 9 06:17:07 2019][ 2.562823] ACPI: bus type USB registered [Mon Dec 9 06:17:07 2019][ 2.566850] usbcore: registered new interface driver usbfs [Mon Dec 9 06:17:07 2019][ 2.572343] usbcore: registered new interface driver hub [Mon Dec 9 06:17:07 2019][ 2.577857] usbcore: registered new device driver usb [Mon Dec 9 06:17:07 2019][ 2.583224] EDAC MC: Ver: 3.0.0 [Mon Dec 9 06:17:07 2019][ 2.586619] PCI: Using ACPI for IRQ routing [Mon Dec 9 06:17:07 2019][ 2.610186] NetLabel: Initializing [Mon Dec 9 06:17:07 2019][ 2.613594] NetLabel: domain hash size = 128 [Mon Dec 9 06:17:07 2019][ 2.617952] NetLabel: protocols = UNLABELED CIPSOv4 [Mon Dec 9 06:17:07 2019][ 2.622933] NetLabel: unlabeled traffic allowed by default [Mon Dec 9 06:17:07 2019][ 2.628701] hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 [Mon Dec 9 06:17:07 2019][ 2.633681] hpet0: 3 comparators, 32-bit 14.318180 MHz counter [Mon Dec 9 06:17:07 2019][ 2.641701] Switched to clocksource hpet [Mon Dec 9 06:17:07 2019][ 2.650499] pnp: PnP ACPI init [Mon Dec 9 06:17:07 2019][ 2.653591] ACPI: bus type PNP registered [Mon Dec 9 06:17:07 2019][ 2.657839] system 00:00: [mem 0x80000000-0x8fffffff] has been reserved [Mon Dec 9 06:17:07 2019][ 2.665065] pnp: PnP ACPI: found 4 devices [Mon Dec 9 06:17:07 2019][ 2.669174] ACPI: bus type PNP unregistered [Mon Dec 9 06:17:07 2019][ 2.680691] pci 0000:01:00.0: can't claim BAR 6 [mem 0xfff00000-0xffffffff pref]: no compatible bridge window [Mon Dec 9 06:17:07 2019][ 2.690611] pci 0000:81:00.0: can't claim BAR 6 [mem 0xfffc0000-0xffffffff pref]: no compatible bridge window [Mon Dec 9 06:17:07 2019][ 2.700530] pci 0000:81:00.1: can't claim BAR 6 [mem 0xfffc0000-0xffffffff pref]: no compatible bridge window [Mon Dec 9 06:17:07 2019][ 2.710447] pci 0000:84:00.0: can't claim BAR 6 [mem 0xfffc0000-0xffffffff pref]: no compatible bridge window [Mon Dec 9 06:17:07 2019][ 2.720362] pci 0000:c1:00.0: can't claim BAR 6 [mem 
0xfff00000-0xffffffff pref]: no compatible bridge window [Mon Dec 9 06:17:07 2019][ 2.730301] pci 0000:00:03.1: BAR 14: assigned [mem 0xe1000000-0xe10fffff] [Mon Dec 9 06:17:07 2019][ 2.737184] pci 0000:01:00.0: BAR 6: assigned [mem 0xe1000000-0xe10fffff pref] [Mon Dec 9 06:17:07 2019][ 2.744411] pci 0000:00:03.1: PCI bridge to [bus 01] [Mon Dec 9 06:17:07 2019][ 2.749389] pci 0000:00:03.1: bridge window [mem 0xe1000000-0xe10fffff] [Mon Dec 9 06:17:07 2019][ 2.756182] pci 0000:00:03.1: bridge window [mem 0xe2000000-0xe3ffffff 64bit pref] [Mon Dec 9 06:17:07 2019][ 2.763931] pci 0000:00:07.1: PCI bridge to [bus 02] [Mon Dec 9 06:17:07 2019][ 2.768908] pci 0000:00:07.1: bridge window [mem 0xf7200000-0xf74fffff] [Mon Dec 9 06:17:07 2019][ 2.775711] pci 0000:00:08.1: PCI bridge to [bus 03] [Mon Dec 9 06:17:07 2019][ 2.780691] pci 0000:00:08.1: bridge window [mem 0xf7000000-0xf71fffff] [Mon Dec 9 06:17:08 2019][ 2.787541] pci 0000:40:07.1: PCI bridge to [bus 41] [Mon Dec 9 06:17:08 2019][ 2.792514] pci 0000:40:07.1: bridge window [mem 0xdb200000-0xdb4fffff] [Mon Dec 9 06:17:08 2019][ 2.799310] pci 0000:40:08.1: PCI bridge to [bus 42] [Mon Dec 9 06:17:08 2019][ 2.804282] pci 0000:40:08.1: bridge window [mem 0xdb000000-0xdb1fffff] [Mon Dec 9 06:17:08 2019][ 2.811119] pci 0000:80:01.1: BAR 14: assigned [mem 0xac300000-0xac3fffff] [Mon Dec 9 06:17:08 2019][ 2.818002] pci 0000:81:00.0: BAR 6: assigned [mem 0xac300000-0xac33ffff pref] [Mon Dec 9 06:17:08 2019][ 2.825230] pci 0000:81:00.1: BAR 6: assigned [mem 0xac340000-0xac37ffff pref] [Mon Dec 9 06:17:08 2019][ 2.832458] pci 0000:80:01.1: PCI bridge to [bus 81] [Mon Dec 9 06:17:08 2019][ 2.837433] pci 0000:80:01.1: bridge window [mem 0xac300000-0xac3fffff] [Mon Dec 9 06:17:08 2019][ 2.844229] pci 0000:80:01.1: bridge window [mem 0xac200000-0xac2fffff 64bit pref] [Mon Dec 9 06:17:08 2019][ 2.851977] pci 0000:82:00.0: PCI bridge to [bus 83] [Mon Dec 9 06:17:08 2019][ 2.856954] pci 0000:82:00.0: bridge window [mem 0xc0000000-0xc08fffff] [Mon Dec 9 06:17:08 2019][ 2.863748] pci 0000:82:00.0: bridge window [mem 0xab000000-0xabffffff 64bit pref] [Mon Dec 9 06:17:08 2019][ 2.871498] pci 0000:80:01.2: PCI bridge to [bus 82-83] [Mon Dec 9 06:17:08 2019][ 2.876737] pci 0000:80:01.2: bridge window [mem 0xc0000000-0xc08fffff] [Mon Dec 9 06:17:08 2019][ 2.883533] pci 0000:80:01.2: bridge window [mem 0xab000000-0xabffffff 64bit pref] [Mon Dec 9 06:17:08 2019][ 2.891282] pci 0000:84:00.0: BAR 6: no space for [mem size 0x00040000 pref] [Mon Dec 9 06:17:08 2019][ 2.898337] pci 0000:84:00.0: BAR 6: failed to assign [mem size 0x00040000 pref] [Mon Dec 9 06:17:08 2019][ 2.905736] pci 0000:80:03.1: PCI bridge to [bus 84] [Mon Dec 9 06:17:08 2019][ 2.910710] pci 0000:80:03.1: bridge window [io 0x8000-0x8fff] [Mon Dec 9 06:17:08 2019][ 2.916813] pci 0000:80:03.1: bridge window [mem 0xc0d00000-0xc0dfffff] [Mon Dec 9 06:17:08 2019][ 2.923609] pci 0000:80:03.1: bridge window [mem 0xac000000-0xac1fffff 64bit pref] [Mon Dec 9 06:17:08 2019][ 2.931357] pci 0000:80:07.1: PCI bridge to [bus 85] [Mon Dec 9 06:17:08 2019][ 2.936332] pci 0000:80:07.1: bridge window [mem 0xc0b00000-0xc0cfffff] [Mon Dec 9 06:17:08 2019][ 2.943129] pci 0000:80:08.1: PCI bridge to [bus 86] [Mon Dec 9 06:17:08 2019][ 2.948109] pci 0000:80:08.1: bridge window [mem 0xc0900000-0xc0afffff] [Mon Dec 9 06:17:08 2019][ 2.954948] pci 0000:c1:00.0: BAR 6: no space for [mem size 0x00100000 pref] [Mon Dec 9 06:17:08 2019][ 2.962001] pci 0000:c1:00.0: BAR 6: failed to assign [mem size 0x00100000 pref] [Mon Dec 9 
06:17:08 2019][ 2.969404] pci 0000:c0:01.1: PCI bridge to [bus c1] [Mon Dec 9 06:17:08 2019][ 2.974378] pci 0000:c0:01.1: bridge window [io 0xc000-0xcfff] [Mon Dec 9 06:17:08 2019][ 2.980481] pci 0000:c0:01.1: bridge window [mem 0xa5400000-0xa55fffff] [Mon Dec 9 06:17:08 2019][ 2.987277] pci 0000:c0:07.1: PCI bridge to [bus c2] [Mon Dec 9 06:17:08 2019][ 2.992249] pci 0000:c0:07.1: bridge window [mem 0xa5200000-0xa53fffff] [Mon Dec 9 06:17:08 2019][ 2.999047] pci 0000:c0:08.1: PCI bridge to [bus c3] [Mon Dec 9 06:17:08 2019][ 3.004019] pci 0000:c0:08.1: bridge window [mem 0xa5000000-0xa51fffff] [Mon Dec 9 06:17:08 2019][ 3.010915] NET: Registered protocol family 2 [Mon Dec 9 06:17:08 2019][ 3.016003] TCP established hash table entries: 524288 (order: 10, 4194304 bytes) [Mon Dec 9 06:17:08 2019][ 3.024149] TCP bind hash table entries: 65536 (order: 8, 1048576 bytes) [Mon Dec 9 06:17:08 2019][ 3.030977] TCP: Hash tables configured (established 524288 bind 65536) [Mon Dec 9 06:17:08 2019][ 3.037640] TCP: reno registered [Mon Dec 9 06:17:08 2019][ 3.040983] UDP hash table entries: 65536 (order: 9, 2097152 bytes) [Mon Dec 9 06:17:08 2019][ 3.047582] UDP-Lite hash table entries: 65536 (order: 9, 2097152 bytes) [Mon Dec 9 06:17:08 2019][ 3.054784] NET: Registered protocol family 1 [Mon Dec 9 06:17:08 2019][ 3.059718] Unpacking initramfs... [Mon Dec 9 06:17:08 2019][ 3.329591] Freeing initrd memory: 19712k freed [Mon Dec 9 06:17:08 2019][ 3.336944] AMD-Vi: IOMMU performance counters supported [Mon Dec 9 06:17:08 2019][ 3.342348] AMD-Vi: IOMMU performance counters supported [Mon Dec 9 06:17:08 2019][ 3.347709] AMD-Vi: IOMMU performance counters supported [Mon Dec 9 06:17:08 2019][ 3.353070] AMD-Vi: IOMMU performance counters supported [Mon Dec 9 06:17:08 2019][ 3.359733] iommu: Adding device 0000:00:01.0 to group 0 [Mon Dec 9 06:17:08 2019][ 3.365769] iommu: Adding device 0000:00:02.0 to group 1 [Mon Dec 9 06:17:08 2019][ 3.371784] iommu: Adding device 0000:00:03.0 to group 2 [Mon Dec 9 06:17:08 2019][ 3.377877] iommu: Adding device 0000:00:03.1 to group 3 [Mon Dec 9 06:17:08 2019][ 3.383905] iommu: Adding device 0000:00:04.0 to group 4 [Mon Dec 9 06:17:08 2019][ 3.389926] iommu: Adding device 0000:00:07.0 to group 5 [Mon Dec 9 06:17:08 2019][ 3.395929] iommu: Adding device 0000:00:07.1 to group 6 [Mon Dec 9 06:17:08 2019][ 3.401936] iommu: Adding device 0000:00:08.0 to group 7 [Mon Dec 9 06:17:08 2019][ 3.407929] iommu: Adding device 0000:00:08.1 to group 8 [Mon Dec 9 06:17:08 2019][ 3.413919] iommu: Adding device 0000:00:14.0 to group 9 [Mon Dec 9 06:17:08 2019][ 3.419257] iommu: Adding device 0000:00:14.3 to group 9 [Mon Dec 9 06:17:08 2019][ 3.425367] iommu: Adding device 0000:00:18.0 to group 10 [Mon Dec 9 06:17:08 2019][ 3.430794] iommu: Adding device 0000:00:18.1 to group 10 [Mon Dec 9 06:17:08 2019][ 3.436218] iommu: Adding device 0000:00:18.2 to group 10 [Mon Dec 9 06:17:08 2019][ 3.441643] iommu: Adding device 0000:00:18.3 to group 10 [Mon Dec 9 06:17:08 2019][ 3.447067] iommu: Adding device 0000:00:18.4 to group 10 [Mon Dec 9 06:17:08 2019][ 3.452494] iommu: Adding device 0000:00:18.5 to group 10 [Mon Dec 9 06:17:08 2019][ 3.457920] iommu: Adding device 0000:00:18.6 to group 10 [Mon Dec 9 06:17:08 2019][ 3.463350] iommu: Adding device 0000:00:18.7 to group 10 [Mon Dec 9 06:17:08 2019][ 3.469514] iommu: Adding device 0000:00:19.0 to group 11 [Mon Dec 9 06:17:08 2019][ 3.474946] iommu: Adding device 0000:00:19.1 to group 11 [Mon Dec 9 06:17:08 2019][ 3.480366] iommu: Adding device 
0000:00:19.2 to group 11 [Mon Dec 9 06:17:08 2019][ 3.485791] iommu: Adding device 0000:00:19.3 to group 11 [Mon Dec 9 06:17:08 2019][ 3.491220] iommu: Adding device 0000:00:19.4 to group 11 [Mon Dec 9 06:17:08 2019][ 3.496644] iommu: Adding device 0000:00:19.5 to group 11 [Mon Dec 9 06:17:08 2019][ 3.502076] iommu: Adding device 0000:00:19.6 to group 11 [Mon Dec 9 06:17:08 2019][ 3.507502] iommu: Adding device 0000:00:19.7 to group 11 [Mon Dec 9 06:17:08 2019][ 3.513666] iommu: Adding device 0000:00:1a.0 to group 12 [Mon Dec 9 06:17:08 2019][ 3.519089] iommu: Adding device 0000:00:1a.1 to group 12 [Mon Dec 9 06:17:08 2019][ 3.524517] iommu: Adding device 0000:00:1a.2 to group 12 [Mon Dec 9 06:17:08 2019][ 3.529944] iommu: Adding device 0000:00:1a.3 to group 12 [Mon Dec 9 06:17:08 2019][ 3.535369] iommu: Adding device 0000:00:1a.4 to group 12 [Mon Dec 9 06:17:08 2019][ 3.540795] iommu: Adding device 0000:00:1a.5 to group 12 [Mon Dec 9 06:17:08 2019][ 3.546219] iommu: Adding device 0000:00:1a.6 to group 12 [Mon Dec 9 06:17:08 2019][ 3.551647] iommu: Adding device 0000:00:1a.7 to group 12 [Mon Dec 9 06:17:08 2019][ 3.557837] iommu: Adding device 0000:00:1b.0 to group 13 [Mon Dec 9 06:17:08 2019][ 3.563271] iommu: Adding device 0000:00:1b.1 to group 13 [Mon Dec 9 06:17:08 2019][ 3.568706] iommu: Adding device 0000:00:1b.2 to group 13 [Mon Dec 9 06:17:08 2019][ 3.574138] iommu: Adding device 0000:00:1b.3 to group 13 [Mon Dec 9 06:17:08 2019][ 3.579560] iommu: Adding device 0000:00:1b.4 to group 13 [Mon Dec 9 06:17:08 2019][ 3.584986] iommu: Adding device 0000:00:1b.5 to group 13 [Mon Dec 9 06:17:08 2019][ 3.590415] iommu: Adding device 0000:00:1b.6 to group 13 [Mon Dec 9 06:17:08 2019][ 3.595841] iommu: Adding device 0000:00:1b.7 to group 13 [Mon Dec 9 06:17:08 2019][ 3.601985] iommu: Adding device 0000:01:00.0 to group 14 [Mon Dec 9 06:17:08 2019][ 3.608074] iommu: Adding device 0000:02:00.0 to group 15 [Mon Dec 9 06:17:08 2019][ 3.614213] iommu: Adding device 0000:02:00.2 to group 16 [Mon Dec 9 06:17:08 2019][ 3.620289] iommu: Adding device 0000:02:00.3 to group 17 [Mon Dec 9 06:17:08 2019][ 3.626367] iommu: Adding device 0000:03:00.0 to group 18 [Mon Dec 9 06:17:08 2019][ 3.632463] iommu: Adding device 0000:03:00.1 to group 19 [Mon Dec 9 06:17:08 2019][ 3.638555] iommu: Adding device 0000:40:01.0 to group 20 [Mon Dec 9 06:17:08 2019][ 3.644605] iommu: Adding device 0000:40:02.0 to group 21 [Mon Dec 9 06:17:08 2019][ 3.650721] iommu: Adding device 0000:40:03.0 to group 22 [Mon Dec 9 06:17:08 2019][ 3.656830] iommu: Adding device 0000:40:04.0 to group 23 [Mon Dec 9 06:17:08 2019][ 3.662957] iommu: Adding device 0000:40:07.0 to group 24 [Mon Dec 9 06:17:08 2019][ 3.668968] iommu: Adding device 0000:40:07.1 to group 25 [Mon Dec 9 06:17:08 2019][ 3.675007] iommu: Adding device 0000:40:08.0 to group 26 [Mon Dec 9 06:17:08 2019][ 3.681028] iommu: Adding device 0000:40:08.1 to group 27 [Mon Dec 9 06:17:08 2019][ 3.687086] iommu: Adding device 0000:41:00.0 to group 28 [Mon Dec 9 06:17:08 2019][ 3.693146] iommu: Adding device 0000:41:00.2 to group 29 [Mon Dec 9 06:17:08 2019][ 3.699180] iommu: Adding device 0000:41:00.3 to group 30 [Mon Dec 9 06:17:08 2019][ 3.705263] iommu: Adding device 0000:42:00.0 to group 31 [Mon Dec 9 06:17:08 2019][ 3.711315] iommu: Adding device 0000:42:00.1 to group 32 [Mon Dec 9 06:17:08 2019][ 3.717362] iommu: Adding device 0000:80:01.0 to group 33 [Mon Dec 9 06:17:08 2019][ 3.723370] iommu: Adding device 0000:80:01.1 to group 34 [Mon Dec 9 06:17:08 2019][ 3.729543] iommu: 
Adding device 0000:80:01.2 to group 35 [Mon Dec 9 06:17:08 2019][ 3.735555] iommu: Adding device 0000:80:02.0 to group 36 [Mon Dec 9 06:17:08 2019][ 3.741617] iommu: Adding device 0000:80:03.0 to group 37 [Mon Dec 9 06:17:08 2019][ 3.747635] iommu: Adding device 0000:80:03.1 to group 38 [Mon Dec 9 06:17:08 2019][ 3.753682] iommu: Adding device 0000:80:04.0 to group 39 [Mon Dec 9 06:17:08 2019][ 3.759718] iommu: Adding device 0000:80:07.0 to group 40 [Mon Dec 9 06:17:08 2019][ 3.765734] iommu: Adding device 0000:80:07.1 to group 41 [Mon Dec 9 06:17:08 2019][ 3.771799] iommu: Adding device 0000:80:08.0 to group 42 [Mon Dec 9 06:17:08 2019][ 3.777819] iommu: Adding device 0000:80:08.1 to group 43 [Mon Dec 9 06:17:08 2019][ 3.783868] iommu: Adding device 0000:81:00.0 to group 44 [Mon Dec 9 06:17:09 2019][ 3.789313] iommu: Adding device 0000:81:00.1 to group 44 [Mon Dec 9 06:17:09 2019][ 3.795370] iommu: Adding device 0000:82:00.0 to group 45 [Mon Dec 9 06:17:09 2019][ 3.800792] iommu: Adding device 0000:83:00.0 to group 45 [Mon Dec 9 06:17:09 2019][ 3.806798] iommu: Adding device 0000:84:00.0 to group 46 [Mon Dec 9 06:17:09 2019][ 3.812810] iommu: Adding device 0000:85:00.0 to group 47 [Mon Dec 9 06:17:09 2019][ 3.818840] iommu: Adding device 0000:85:00.2 to group 48 [Mon Dec 9 06:17:09 2019][ 3.824857] iommu: Adding device 0000:86:00.0 to group 49 [Mon Dec 9 06:17:09 2019][ 3.830881] iommu: Adding device 0000:86:00.1 to group 50 [Mon Dec 9 06:17:09 2019][ 3.836908] iommu: Adding device 0000:86:00.2 to group 51 [Mon Dec 9 06:17:09 2019][ 3.842960] iommu: Adding device 0000:c0:01.0 to group 52 [Mon Dec 9 06:17:09 2019][ 3.849012] iommu: Adding device 0000:c0:01.1 to group 53 [Mon Dec 9 06:17:09 2019][ 3.855075] iommu: Adding device 0000:c0:02.0 to group 54 [Mon Dec 9 06:17:09 2019][ 3.861148] iommu: Adding device 0000:c0:03.0 to group 55 [Mon Dec 9 06:17:09 2019][ 3.867227] iommu: Adding device 0000:c0:04.0 to group 56 [Mon Dec 9 06:17:09 2019][ 3.873264] iommu: Adding device 0000:c0:07.0 to group 57 [Mon Dec 9 06:17:09 2019][ 3.879291] iommu: Adding device 0000:c0:07.1 to group 58 [Mon Dec 9 06:17:09 2019][ 3.885338] iommu: Adding device 0000:c0:08.0 to group 59 [Mon Dec 9 06:17:09 2019][ 3.891360] iommu: Adding device 0000:c0:08.1 to group 60 [Mon Dec 9 06:17:09 2019][ 3.899761] iommu: Adding device 0000:c1:00.0 to group 61 [Mon Dec 9 06:17:09 2019][ 3.905789] iommu: Adding device 0000:c2:00.0 to group 62 [Mon Dec 9 06:17:09 2019][ 3.911848] iommu: Adding device 0000:c2:00.2 to group 63 [Mon Dec 9 06:17:09 2019][ 3.917927] iommu: Adding device 0000:c3:00.0 to group 64 [Mon Dec 9 06:17:09 2019][ 3.924009] iommu: Adding device 0000:c3:00.1 to group 65 [Mon Dec 9 06:17:09 2019][ 3.929651] AMD-Vi: Found IOMMU at 0000:00:00.2 cap 0x40 [Mon Dec 9 06:17:09 2019][ 3.934971] AMD-Vi: Extended features (0xf77ef22294ada): [Mon Dec 9 06:17:09 2019][ 3.940292] PPR NX GT IA GA PC GA_vAPIC [Mon Dec 9 06:17:09 2019][ 3.944425] AMD-Vi: Found IOMMU at 0000:40:00.2 cap 0x40 [Mon Dec 9 06:17:09 2019][ 3.949749] AMD-Vi: Extended features (0xf77ef22294ada): [Mon Dec 9 06:17:09 2019][ 3.955071] PPR NX GT IA GA PC GA_vAPIC [Mon Dec 9 06:17:09 2019][ 3.959204] AMD-Vi: Found IOMMU at 0000:80:00.2 cap 0x40 [Mon Dec 9 06:17:09 2019][ 3.964524] AMD-Vi: Extended features (0xf77ef22294ada): [Mon Dec 9 06:17:09 2019][ 3.969847] PPR NX GT IA GA PC GA_vAPIC [Mon Dec 9 06:17:09 2019][ 3.973988] AMD-Vi: Found IOMMU at 0000:c0:00.2 cap 0x40 [Mon Dec 9 06:17:09 2019][ 3.979312] AMD-Vi: Extended features (0xf77ef22294ada): [Mon Dec 
9 06:17:09 2019][ 3.984632] PPR NX GT IA GA PC GA_vAPIC [Mon Dec 9 06:17:09 2019][ 3.988774] AMD-Vi: Interrupt remapping enabled [Mon Dec 9 06:17:09 2019][ 3.993315] AMD-Vi: virtual APIC enabled [Mon Dec 9 06:17:09 2019][ 3.997653] AMD-Vi: Lazy IO/TLB flushing enabled [Mon Dec 9 06:17:09 2019][ 4.003999] perf: AMD NB counters detected [Mon Dec 9 06:17:09 2019][ 4.008151] perf: AMD LLC counters detected [Mon Dec 9 06:17:09 2019][ 4.018402] sha1_ssse3: Using SHA-NI optimized SHA-1 implementation [Mon Dec 9 06:17:09 2019][ 4.024780] sha256_ssse3: Using SHA-256-NI optimized SHA-256 implementation [Mon Dec 9 06:17:09 2019][ 4.033366] futex hash table entries: 32768 (order: 9, 2097152 bytes) [Mon Dec 9 06:17:09 2019][ 4.040000] Initialise system trusted keyring [Mon Dec 9 06:17:09 2019][ 4.044403] audit: initializing netlink socket (disabled) [Mon Dec 9 06:17:09 2019][ 4.049824] type=2000 audit(1575901024.202:1): initialized [Mon Dec 9 06:17:09 2019][ 4.080727] HugeTLB registered 1 GB page size, pre-allocated 0 pages [Mon Dec 9 06:17:09 2019][ 4.087091] HugeTLB registered 2 MB page size, pre-allocated 0 pages [Mon Dec 9 06:17:09 2019][ 4.094745] zpool: loaded [Mon Dec 9 06:17:09 2019][ 4.097380] zbud: loaded [Mon Dec 9 06:17:09 2019][ 4.100291] VFS: Disk quotas dquot_6.6.0 [Mon Dec 9 06:17:09 2019][ 4.104320] Dquot-cache hash table entries: 512 (order 0, 4096 bytes) [Mon Dec 9 06:17:09 2019][ 4.111124] msgmni has been set to 32768 [Mon Dec 9 06:17:09 2019][ 4.115148] Key type big_key registered [Mon Dec 9 06:17:09 2019][ 4.121386] NET: Registered protocol family 38 [Mon Dec 9 06:17:09 2019][ 4.125848] Key type asymmetric registered [Mon Dec 9 06:17:09 2019][ 4.129960] Asymmetric key parser 'x509' registered [Mon Dec 9 06:17:09 2019][ 4.134897] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 248) [Mon Dec 9 06:17:09 2019][ 4.142452] io scheduler noop registered [Mon Dec 9 06:17:09 2019][ 4.146390] io scheduler deadline registered (default) [Mon Dec 9 06:17:09 2019][ 4.151575] io scheduler cfq registered [Mon Dec 9 06:17:09 2019][ 4.155421] io scheduler mq-deadline registered [Mon Dec 9 06:17:09 2019][ 4.159963] io scheduler kyber registered [Mon Dec 9 06:17:09 2019][ 4.170680] pcieport 0000:00:03.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.177657] pci 0000:01:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.184205] pcieport 0000:00:07.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.191175] pci 0000:02:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.197718] pci 0000:02:00.2: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.204251] pci 0000:02:00.3: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.210798] pcieport 0000:00:08.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.217765] pci 0000:03:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.224300] pci 0000:03:00.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.230855] pcieport 0000:40:07.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.237818] pci 0000:41:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.244351] pci 0000:41:00.2: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.250888] pci 0000:41:00.3: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.257442] pcieport 0000:40:08.1: Signaling PME through PCIe PME 
interrupt [Mon Dec 9 06:17:09 2019][ 4.264409] pci 0000:42:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.270941] pci 0000:42:00.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.277492] pcieport 0000:80:01.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.284464] pci 0000:81:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.290999] pci 0000:81:00.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.297549] pcieport 0000:80:01.2: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.304516] pci 0000:82:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.311054] pci 0000:83:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.317602] pcieport 0000:80:03.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.324573] pci 0000:84:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.331119] pcieport 0000:80:07.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.338087] pci 0000:85:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.344622] pci 0000:85:00.2: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.351169] pcieport 0000:80:08.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.358130] pci 0000:86:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.364664] pci 0000:86:00.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.371201] pci 0000:86:00.2: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.377756] pcieport 0000:c0:01.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.384721] pci 0000:c1:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.391272] pcieport 0000:c0:07.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.398242] pci 0000:c2:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.404777] pci 0000:c2:00.2: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.411330] pcieport 0000:c0:08.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.418297] pci 0000:c3:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.424835] pci 0000:c3:00.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.431391] pci_hotplug: PCI Hot Plug PCI Core version: 0.5 [Mon Dec 9 06:17:09 2019][ 4.436981] pciehp: PCI Express Hot Plug Controller Driver version: 0.4 [Mon Dec 9 06:17:09 2019][ 4.443631] shpchp: Standard Hot Plug PCI Controller Driver version: 0.4 [Mon Dec 9 06:17:09 2019][ 4.450438] efifb: probing for efifb [Mon Dec 9 06:17:09 2019][ 4.454032] efifb: framebuffer at 0xab000000, mapped to 0xffffa48f59800000, using 3072k, total 3072k [Mon Dec 9 06:17:09 2019][ 4.463163] efifb: mode is 1024x768x32, linelength=4096, pages=1 [Mon Dec 9 06:17:09 2019][ 4.469178] efifb: scrolling: redraw [Mon Dec 9 06:17:09 2019][ 4.472771] efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 [Mon Dec 9 06:17:09 2019][ 4.494050] Console: switching to colour frame buffer device 128x48 [Mon Dec 9 06:17:09 2019][ 4.515784] fb0: EFI VGA frame buffer device [Mon Dec 9 06:17:09 2019][ 4.520168] input: Power Button as /devices/LNXSYSTM:00/device:00/PNP0C0C:00/input/input0 [Mon Dec 9 06:17:09 2019][ 4.528352] ACPI: Power Button [PWRB] [Mon Dec 9 06:17:09 2019][ 4.532073] input: Power Button 
as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input1 [Mon Dec 9 06:17:09 2019][ 4.539478] ACPI: Power Button [PWRF] [Mon Dec 9 06:17:09 2019][ 4.544365] GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC. [Mon Dec 9 06:17:09 2019][ 4.551850] Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled [Mon Dec 9 06:17:09 2019][ 4.579045] 00:02: ttyS1 at I/O 0x2f8 (irq = 3) is a 16550A [Mon Dec 9 06:17:09 2019][ 4.605587] 00:03: ttyS0 at I/O 0x3f8 (irq = 4) is a 16550A [Mon Dec 9 06:17:09 2019][ 4.611666] Non-volatile memory driver v1.3 [Mon Dec 9 06:17:09 2019][ 4.615896] Linux agpgart interface v0.103 [Mon Dec 9 06:17:09 2019][ 4.622439] crash memory driver: version 1.1 [Mon Dec 9 06:17:09 2019][ 4.626950] rdac: device handler registered [Mon Dec 9 06:17:09 2019][ 4.631203] hp_sw: device handler registered [Mon Dec 9 06:17:09 2019][ 4.635483] emc: device handler registered [Mon Dec 9 06:17:09 2019][ 4.639743] alua: device handler registered [Mon Dec 9 06:17:09 2019][ 4.643975] libphy: Fixed MDIO Bus: probed [Mon Dec 9 06:17:09 2019][ 4.648134] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [Mon Dec 9 06:17:09 2019][ 4.654672] ehci-pci: EHCI PCI platform driver [Mon Dec 9 06:17:09 2019][ 4.659140] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [Mon Dec 9 06:17:09 2019][ 4.665331] ohci-pci: OHCI PCI platform driver [Mon Dec 9 06:17:09 2019][ 4.669798] uhci_hcd: USB Universal Host Controller Interface driver [Mon Dec 9 06:17:09 2019][ 4.676292] xhci_hcd 0000:02:00.3: xHCI Host Controller [Mon Dec 9 06:17:09 2019][ 4.681595] xhci_hcd 0000:02:00.3: new USB bus registered, assigned bus number 1 [Mon Dec 9 06:17:09 2019][ 4.689101] xhci_hcd 0000:02:00.3: hcc params 0x0270f665 hci version 0x100 quirks 0x00000410 [Mon Dec 9 06:17:09 2019][ 4.697869] usb usb1: New USB device found, idVendor=1d6b, idProduct=0002 [Mon Dec 9 06:17:09 2019][ 4.704662] usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [Mon Dec 9 06:17:09 2019][ 4.711889] usb usb1: Product: xHCI Host Controller [Mon Dec 9 06:17:09 2019][ 4.716776] usb usb1: Manufacturer: Linux 3.10.0-957.27.2.el7_lustre.pl2.x86_64 xhci-hcd [Mon Dec 9 06:17:09 2019][ 4.724871] usb usb1: SerialNumber: 0000:02:00.3 [Mon Dec 9 06:17:09 2019][ 4.729607] hub 1-0:1.0: USB hub found [Mon Dec 9 06:17:09 2019][ 4.733368] hub 1-0:1.0: 2 ports detected [Mon Dec 9 06:17:09 2019][ 4.737616] xhci_hcd 0000:02:00.3: xHCI Host Controller [Mon Dec 9 06:17:09 2019][ 4.742908] xhci_hcd 0000:02:00.3: new USB bus registered, assigned bus number 2 [Mon Dec 9 06:17:09 2019][ 4.750334] usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
[Mon Dec 9 06:17:09 2019][ 4.758443] usb usb2: New USB device found, idVendor=1d6b, idProduct=0003 [Mon Dec 9 06:17:09 2019][ 4.765234] usb usb2: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [Mon Dec 9 06:17:09 2019][ 4.772461] usb usb2: Product: xHCI Host Controller [Mon Dec 9 06:17:09 2019][ 4.777350] usb usb2: Manufacturer: Linux 3.10.0-957.27.2.el7_lustre.pl2.x86_64 xhci-hcd [Mon Dec 9 06:17:09 2019][ 4.785444] usb usb2: SerialNumber: 0000:02:00.3 [Mon Dec 9 06:17:10 2019][ 4.790160] hub 2-0:1.0: USB hub found [Mon Dec 9 06:17:10 2019][ 4.793926] hub 2-0:1.0: 2 ports detected [Mon Dec 9 06:17:10 2019][ 4.798267] xhci_hcd 0000:41:00.3: xHCI Host Controller [Mon Dec 9 06:17:10 2019][ 4.803578] xhci_hcd 0000:41:00.3: new USB bus registered, assigned bus number 3 [Mon Dec 9 06:17:10 2019][ 4.811082] xhci_hcd 0000:41:00.3: hcc params 0x0270f665 hci version 0x100 quirks 0x00000410 [Mon Dec 9 06:17:10 2019][ 4.819869] usb usb3: New USB device found, idVendor=1d6b, idProduct=0002 [Mon Dec 9 06:17:10 2019][ 4.826664] usb usb3: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [Mon Dec 9 06:17:10 2019][ 4.833893] usb usb3: Product: xHCI Host Controller [Mon Dec 9 06:17:10 2019][ 4.838780] usb usb3: Manufacturer: Linux 3.10.0-957.27.2.el7_lustre.pl2.x86_64 xhci-hcd [Mon Dec 9 06:17:10 2019][ 4.846874] usb usb3: SerialNumber: 0000:41:00.3 [Mon Dec 9 06:17:10 2019][ 4.851606] hub 3-0:1.0: USB hub found [Mon Dec 9 06:17:10 2019][ 4.855374] hub 3-0:1.0: 2 ports detected [Mon Dec 9 06:17:10 2019][ 4.859641] xhci_hcd 0000:41:00.3: xHCI Host Controller [Mon Dec 9 06:17:10 2019][ 4.864916] xhci_hcd 0000:41:00.3: new USB bus registered, assigned bus number 4 [Mon Dec 9 06:17:10 2019][ 4.872357] usb usb4: We don't know the algorithms for LPM for this host, disabling LPM. [Mon Dec 9 06:17:10 2019][ 4.880469] usb usb4: New USB device found, idVendor=1d6b, idProduct=0003 [Mon Dec 9 06:17:10 2019][ 4.887264] usb usb4: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [Mon Dec 9 06:17:10 2019][ 4.894491] usb usb4: Product: xHCI Host Controller [Mon Dec 9 06:17:10 2019][ 4.899381] usb usb4: Manufacturer: Linux 3.10.0-957.27.2.el7_lustre.pl2.x86_64 xhci-hcd [Mon Dec 9 06:17:10 2019][ 4.907475] usb usb4: SerialNumber: 0000:41:00.3 [Mon Dec 9 06:17:10 2019][ 4.912191] hub 4-0:1.0: USB hub found [Mon Dec 9 06:17:10 2019][ 4.915953] hub 4-0:1.0: 2 ports detected [Mon Dec 9 06:17:10 2019][ 4.920215] usbcore: registered new interface driver usbserial_generic [Mon Dec 9 06:17:10 2019][ 4.926756] usbserial: USB Serial support registered for generic [Mon Dec 9 06:17:10 2019][ 4.932808] i8042: PNP: No PS/2 controller found. Probing ports directly. 
[Mon Dec 9 06:17:10 2019][ 5.048749] usb 1-1: new high-speed USB device number 2 using xhci_hcd [Mon Dec 9 06:17:10 2019][ 5.170748] usb 3-1: new high-speed USB device number 2 using xhci_hcd [Mon Dec 9 06:17:10 2019][ 5.180593] usb 1-1: New USB device found, idVendor=0424, idProduct=2744 [Mon Dec 9 06:17:10 2019][ 5.187307] usb 1-1: New USB device strings: Mfr=1, Product=2, SerialNumber=0 [Mon Dec 9 06:17:10 2019][ 5.194453] usb 1-1: Product: USB2734 [Mon Dec 9 06:17:10 2019][ 5.198123] usb 1-1: Manufacturer: Microchip Tech [Mon Dec 9 06:17:10 2019][ 5.230491] hub 1-1:1.0: USB hub found [Mon Dec 9 06:17:10 2019][ 5.234468] hub 1-1:1.0: 4 ports detected [Mon Dec 9 06:17:10 2019][ 5.291842] usb 2-1: new SuperSpeed USB device number 2 using xhci_hcd [Mon Dec 9 06:17:10 2019][ 5.300747] usb 3-1: New USB device found, idVendor=1604, idProduct=10c0 [Mon Dec 9 06:17:10 2019][ 5.307459] usb 3-1: New USB device strings: Mfr=0, Product=0, SerialNumber=0 [Mon Dec 9 06:17:10 2019][ 5.312969] usb 2-1: New USB device found, idVendor=0424, idProduct=5744 [Mon Dec 9 06:17:10 2019][ 5.312970] usb 2-1: New USB device strings: Mfr=2, Product=3, SerialNumber=0 [Mon Dec 9 06:17:10 2019][ 5.312971] usb 2-1: Product: USB5734 [Mon Dec 9 06:17:10 2019][ 5.312972] usb 2-1: Manufacturer: Microchip Tech [Mon Dec 9 06:17:10 2019][ 5.326487] hub 2-1:1.0: USB hub found [Mon Dec 9 06:17:10 2019][ 5.326843] hub 2-1:1.0: 4 ports detected [Mon Dec 9 06:17:10 2019][ 5.327898] usb: port power management may be unreliable [Mon Dec 9 06:17:10 2019][ 5.352513] hub 3-1:1.0: USB hub found [Mon Dec 9 06:17:10 2019][ 5.356496] hub 3-1:1.0: 4 ports detected [Mon Dec 9 06:17:11 2019][ 5.973232] i8042: No controller found [Mon Dec 9 06:17:11 2019][ 5.977008] tsc: Refined TSC clocksource calibration: 1996.249 MHz [Mon Dec 9 06:17:11 2019][ 5.977072] mousedev: PS/2 mouse device common for all mice [Mon Dec 9 06:17:11 2019][ 5.977261] rtc_cmos 00:01: RTC can wake from S4 [Mon Dec 9 06:17:11 2019][ 5.977606] rtc_cmos 00:01: rtc core: registered rtc_cmos as rtc0 [Mon Dec 9 06:17:11 2019][ 5.977706] rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram, hpet irqs [Mon Dec 9 06:17:11 2019][ 5.977768] cpuidle: using governor menu [Mon Dec 9 06:17:11 2019][ 5.978018] EFI Variables Facility v0.08 2004-May-17 [Mon Dec 9 06:17:11 2019][ 5.998627] hidraw: raw HID events driver (C) Jiri Kosina [Mon Dec 9 06:17:11 2019][ 5.998723] usbcore: registered new interface driver usbhid [Mon Dec 9 06:17:11 2019][ 5.998723] usbhid: USB HID core driver [Mon Dec 9 06:17:11 2019][ 5.998840] drop_monitor: Initializing network drop monitor service [Mon Dec 9 06:17:11 2019][ 5.998985] TCP: cubic registered [Mon Dec 9 06:17:11 2019][ 5.998990] Initializing XFRM netlink socket [Mon Dec 9 06:17:11 2019][ 5.999196] NET: Registered protocol family 10 [Mon Dec 9 06:17:11 2019][ 5.999731] NET: Registered protocol family 17 [Mon Dec 9 06:17:11 2019][ 5.999734] mpls_gso: MPLS GSO support [Mon Dec 9 06:17:11 2019][ 6.000786] mce: Using 23 MCE banks [Mon Dec 9 06:17:11 2019][ 6.000832] microcode: CPU0: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000847] microcode: CPU1: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000861] microcode: CPU2: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000873] microcode: CPU3: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000885] microcode: CPU4: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000900] microcode: CPU5: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000912] microcode: CPU6: 
patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000928] microcode: CPU7: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000937] microcode: CPU8: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000946] microcode: CPU9: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000954] microcode: CPU10: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000964] microcode: CPU11: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000975] microcode: CPU12: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000986] microcode: CPU13: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000997] microcode: CPU14: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001008] microcode: CPU15: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001019] microcode: CPU16: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001030] microcode: CPU17: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001041] microcode: CPU18: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001051] microcode: CPU19: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001061] microcode: CPU20: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001073] microcode: CPU21: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001083] microcode: CPU22: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001094] microcode: CPU23: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001104] microcode: CPU24: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001115] microcode: CPU25: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001123] microcode: CPU26: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001134] microcode: CPU27: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001144] microcode: CPU28: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001152] microcode: CPU29: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001308] microcode: CPU30: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001318] microcode: CPU31: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001329] microcode: CPU32: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001337] microcode: CPU33: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001348] microcode: CPU34: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001358] microcode: CPU35: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001369] microcode: CPU36: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001380] microcode: CPU37: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001391] microcode: CPU38: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001402] microcode: CPU39: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001410] microcode: CPU40: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001419] microcode: CPU41: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001427] microcode: CPU42: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001438] microcode: CPU43: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001445] microcode: CPU44: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001454] microcode: CPU45: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001462] microcode: CPU46: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001473] microcode: CPU47: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001518] microcode: Microcode Update Driver: v2.01 , Peter Oruba [Mon Dec 9 06:17:11 2019][ 6.001659] Loading compiled-in X.509 certificates [Mon Dec 9 06:17:11 2019][ 6.001683] Loaded X.509 cert 'CentOS Linux kpatch signing key: ea0413152cde1d98ebdca3fe6f0230904c9ef717' [Mon Dec 9 06:17:11 2019][ 6.001697] 
Loaded X.509 cert 'CentOS Linux Driver update signing key: 7f421ee0ab69461574bb358861dbe77762a4201b' [Mon Dec 9 06:17:11 2019][ 6.002084] Loaded X.509 cert 'CentOS Linux kernel signing key: 468656045a39b52ff2152c315f6198c3e658f24d' [Mon Dec 9 06:17:11 2019][ 6.002097] registered taskstats version 1 [Mon Dec 9 06:17:11 2019][ 6.004213] Key type trusted registered [Mon Dec 9 06:17:11 2019][ 6.005787] Key type encrypted registered [Mon Dec 9 06:17:11 2019][ 6.005834] IMA: No TPM chip found, activating TPM-bypass! (rc=-19) [Mon Dec 9 06:17:11 2019][ 6.007298] Magic number: 15:915:282 [Mon Dec 9 06:17:11 2019][ 6.007380] tty tty19: hash matches [Mon Dec 9 06:17:11 2019][ 6.007519] memory memory148: hash matches [Mon Dec 9 06:17:11 2019][ 6.015092] rtc_cmos 00:01: setting system clock to 2019-12-09 14:17:10 UTC (1575901030) [Mon Dec 9 06:17:11 2019] usb 3-1.1: new high-speed USB device number 3 using xhci_hcd [Mon Dec 9 06:17:11 2019][ 6.387500] Switched to clocksource tsc [Mon Dec 9 06:17:11 2019][ 6.399037] Freeing unused kernel memory: 1876k freed [Mon Dec 9 06:17:11 2019][ 6.404317] Write protecting the kernel read-only data: 12288k [Mon Dec 9 06:17:11 2019][ 6.411541] Freeing unused kernel memory: 504k freed [Mon Dec 9 06:17:11 2019][ 6.417887] Freeing unused kernel memory: 596k freed [Mon Dec 9 06:17:11 2019][ 6.470116] systemd[1]: systemd 219 running in system mode. (+PAM +AUDIT +SELINUX +IMA -APPARMOR +SMACK +SYSVINIT +UTMP +LIBCRYPTSETUP +GCRYPT +GNUTLS +ACL +XZ +LZ4 -SECCOMP +BLKID +ELFUTILS +KMOD +IDN) [Mon Dec 9 06:17:11 2019][ 6.473772] usb 3-1.1: New USB device found, idVendor=1604, idProduct=10c0 [Mon Dec 9 06:17:11 2019][ 6.473773] usb 3-1.1: New USB device strings: Mfr=0, Product=0, SerialNumber=0 [Mon Dec 9 06:17:11 2019][ 6.503414] systemd[1]: Detected architecture x86-64. [Mon Dec 9 06:17:11 2019][ 6.504537] hub 3-1.1:1.0: USB hub found [Mon Dec 9 06:17:11 2019][ 6.504895] hub 3-1.1:1.0: 4 ports detected [Mon Dec 9 06:17:11 2019][ 6.516595] systemd[1]: Running in initial RAM disk. [Mon Dec 9 06:17:11 2019] [Mon Dec 9 06:17:11 2019]Welcome to CentOS Linux 7 (Core) dracut-033-554.el7 (Initramfs)! [Mon Dec 9 06:17:11 2019] [Mon Dec 9 06:17:11 2019][ 6.529901] systemd[1]: Set hostname to . [Mon Dec 9 06:17:11 2019][ 6.565168] systemd[1]: Reached target Local File Systems. [Mon Dec 9 06:17:11 2019][ 6.569767] usb 3-1.4: new high-speed USB device number 4 using xhci_hcd [Mon Dec 9 06:17:11 2019][ OK ] Reached target Local File Systems. [Mon Dec 9 06:17:11 2019][ 6.582855] systemd[1]: Reached target Swap. [Mon Dec 9 06:17:11 2019][ OK ] Reached target Swap. [Mon Dec 9 06:17:11 2019][ 6.591821] systemd[1]: Reached target Timers. [Mon Dec 9 06:17:11 2019][ OK ] Reached target Timers. [Mon Dec 9 06:17:11 2019][ 6.601036] systemd[1]: Created slice Root Slice. [Mon Dec 9 06:17:11 2019][ OK ] Created slice Root Slice. [Mon Dec 9 06:17:11 2019][ 6.611865] systemd[1]: Listening on udev Control Socket. [Mon Dec 9 06:17:11 2019][ OK ] Listening on udev Control Socket. [Mon Dec 9 06:17:11 2019][ 6.622848] systemd[1]: Listening on udev Kernel Socket. [Mon Dec 9 06:17:11 2019][ OK ] Listening on udev Kernel Socket. [Mon Dec 9 06:17:11 2019][ 6.633883] systemd[1]: Created slice System Slice. [Mon Dec 9 06:17:11 2019][ OK ] Created slice System Slice. 
[Mon Dec 9 06:17:11 2019][ 6.643778] usb 3-1.4: New USB device found, idVendor=1604, idProduct=10c0 [Mon Dec 9 06:17:11 2019][ 6.650657] usb 3-1.4: New USB device strings: Mfr=0, Product=0, SerialNumber=0 [Mon Dec 9 06:17:11 2019][ 6.658073] systemd[1]: Listening on Journal Socket. [Mon Dec 9 06:17:11 2019][ OK ] Listening on Journal Socket. [Mon Dec 9 06:17:11 2019][ 6.664547] hub 3-1.4:1.0: USB hub found [Mon Dec 9 06:17:11 2019][ 6.670149] hub 3-1.4:1.0: 4 ports detected [Mon Dec 9 06:17:11 2019][ 6.675994] systemd[1]: Starting Setup Virtual Console... [Mon Dec 9 06:17:11 2019] Starting Setup Virtual Console... [Mon Dec 9 06:17:11 2019][ 6.686254] systemd[1]: Starting Create list of required static device nodes for the current kernel... [Mon Dec 9 06:17:11 2019] Starting Create list of required st... nodes for the current kernel... [Mon Dec 9 06:17:11 2019][ 6.717827] systemd[1]: Reached target Sockets. [Mon Dec 9 06:17:11 2019][ OK ] Reached target Sockets. [Mon Dec 9 06:17:11 2019][ 6.727301] systemd[1]: Starting Apply Kernel Variables... [Mon Dec 9 06:17:11 2019] Starting Apply Kernel Variables... [Mon Dec 9 06:17:11 2019][ 6.738860] systemd[1]: Reached target Slices. [Mon Dec 9 06:17:11 2019][ OK ] Reached target Slices. [Mon Dec 9 06:17:11 2019][ 6.748191] systemd[1]: Starting Journal Service... [Mon Dec 9 06:17:11 2019] Starting Journal Service... [Mon Dec 9 06:17:11 2019][ 6.758310] systemd[1]: Starting dracut cmdline hook... [Mon Dec 9 06:17:11 2019] Starting dracut cmdline hook... [Mon Dec 9 06:17:11 2019][ 6.768194] systemd[1]: Started Setup Virtual Console. [Mon Dec 9 06:17:11 2019][ OK ] Started Setup Virtual Console. [Mon Dec 9 06:17:12 2019][ 6.779123] systemd[1]: Started Create list of required static device nodes for the current kernel. [Mon Dec 9 06:17:12 2019][ OK ] Started Create list of required sta...ce nodes for the current kernel. [Mon Dec 9 06:17:12 2019][ 6.797008] systemd[1]: Started Apply Kernel Variables. [Mon Dec 9 06:17:12 2019][ OK ] Started Apply Kernel Variables. [Mon Dec 9 06:17:12 2019][ 6.808097] systemd[1]: Started Journal Service. [Mon Dec 9 06:17:12 2019][ OK ] Started Journal Service. [Mon Dec 9 06:17:12 2019] Starting Create Static Device Nodes in /dev... [Mon Dec 9 06:17:12 2019][ OK ] Started Create Static Device Nodes in /dev. [Mon Dec 9 06:17:12 2019][ OK ] Started dracut cmdline hook. [Mon Dec 9 06:17:12 2019] Starting dracut pre-udev hook... [Mon Dec 9 06:17:12 2019][ OK ] Started dracut pre-udev hook. [Mon Dec 9 06:17:12 2019] Starting udev Kernel Device Manager... [Mon Dec 9 06:17:12 2019][ OK ] Started udev Kernel Device Manager. [Mon Dec 9 06:17:12 2019] Starting udev Coldplug all Devices... [Mon Dec 9 06:17:12 2019] Mounting Configuration File System... [Mon Dec 9 06:17:12 2019][ OK ] Mounted Configuration File System. [Mon Dec 9 06:17:12 2019][ 6.943776] pps_core: LinuxPPS API ver. 1 registered [Mon Dec 9 06:17:12 2019][ 6.948745] pps_core: Software ver. 
5.3.6 - Copyright 2005-2007 Rodolfo Giometti [Mon Dec 9 06:17:12 2019][ 6.961430] megasas: 07.705.02.00-rh1 [Mon Dec 9 06:17:12 2019][ 6.965430] megaraid_sas 0000:c1:00.0: FW now in Ready state [Mon Dec 9 06:17:12 2019][ 6.971107] megaraid_sas 0000:c1:00.0: 64 bit DMA mask and 32 bit consistent mask [Mon Dec 9 06:17:12 2019][ 6.972575] megaraid_sas 0000:c1:00.0: firmware supports msix : (96) [Mon Dec 9 06:17:12 2019][ 6.972576] megaraid_sas 0000:c1:00.0: current msix/online cpus : (48/48) [Mon Dec 9 06:17:12 2019][ 6.972577] megaraid_sas 0000:c1:00.0: RDPQ mode : (disabled) [Mon Dec 9 06:17:12 2019][ 6.972580] megaraid_sas 0000:c1:00.0: Current firmware supports maximum commands: 928 LDIO threshold: 237 [Mon Dec 9 06:17:12 2019][ 6.972847] megaraid_sas 0000:c1:00.0: Configured max firmware commands: 927 [Mon Dec 9 06:17:12 2019][ 6.974885] megaraid_sas 0000:c1:00.0: FW supports sync cache : No [Mon Dec 9 06:17:12 2019][ 6.978810] PTP clock support registered [Mon Dec 9 06:17:12 2019][ OK ] Started udev Coldplug all Devices. [Mon Dec 9 06:17:12 2019][ 7.028321] mpt3sas: loading out-of-tree module taints kernel. [Mon Dec 9 06:17:12 2019][ 7.039156] tg3.c:v3.137 (May 11, 2014) [Mon Dec 9 06:17:12 2019][ 7.039203] mlx_compat: module verification failed: signature and/or required key missing - tainting kernel [Mon Dec 9 06:17:12 2019] Starting Show Plymouth Boot Screen... [Mon Dec 9 06:17:12 2019][ 7.056141] Compat-mlnx-ofed backport release: 1c4bf42 [Mon Dec 9 06:17:12 2019][ 7.062229] Backport based on mlnx_ofed/mlnx-ofa_kernel-4.0.git 1c4bf42 [Mon Dec 9 06:17:12 2019][ 7.062230] compat.git: mlnx_ofed/mlnx-ofa_kernel-4.0.git [Mon Dec 9 06:17:12 2019][ 7.066699] tg3 0000:81:00.0 eth0: Tigon3 [partno(BCM95720) rev 5720000] (PCI Express) MAC address 4c:d9:8f:7d:ad:d7 [Mon Dec 9 06:17:12 2019][ 7.066703] tg3 0000:81:00.0 eth0: attached PHY is 5720C (10/100/1000Base-T Ethernet) (WireSpeed[1], EEE[1]) [Mon Dec 9 06:17:12 2019][ 7.066705] tg3 0000:81:00.0 eth0: RXcsums[1] LinkChgREG[0] MIirq[0] ASF[1] TSOcap[1] [Mon Dec 9 06:17:12 2019][ 7.066707] tg3 0000:81:00.0 eth0: dma_rwctrl[00000001] dma_mask[64-bit] [Mon Dec 9 06:17:12 2019][ 7.096872] tg3 0000:81:00.1 eth1: Tigon3 [partno(BCM95720) rev 5720000] (PCI Express) MAC address 4c:d9:8f:7d:ad:d8 [Mon Dec 9 06:17:12 2019][ 7.096875] tg3 0000:81:00.1 eth1: attached PHY is 5720C (10/100/1000Base-T Ethernet) (WireSpeed[1], EEE[1]) [Mon Dec 9 06:17:12 2019][ 7.096877] tg3 0000:81:00.1 eth1: RXcsums[1] LinkChgREG[0] MIirq[0] ASF[1] TSOcap[1] [Mon Dec 9 06:17:12 2019][ 7.096879] tg3 0000:81:00.1 eth1: dma_rwctrl[00000001] dma_mask[64-bit] [Mon Dec 9 06:17:12 2019][ 7.144503] mpt3sas version 31.00.00.00 loaded [Mon Dec 9 06:17:12 2019][ 7.150357] mpt3sas_cm0: 63 BIT PCI BUS DMA ADDRESSING SUPPORTED, total mem (263564416 kB) [Mon Dec 9 06:17:12 2019][ OK ] Reached target System Initialization. [Mon Dec 9 06:17:12 2019][ 7.168033] ahci 0000:86:00.2: AHCI 0001.0301 32 slots 1 ports 6 Gbps 0x1 impl SATA mode [Mon Dec 9 06:17:12 2019][ 7.177094] ahci 0000:86:00.2: flags: 64bit ncq sntf ilck pm led clo only pmp fbs pio slum part [Mon Dec 9 06:17:12 2019] Starting dracut initqueue hook... [Mon Dec 9 06:17:12 2019][ 7.187545] scsi host2: ahci [Mon Dec 9 06:17:12 2019][ 7.192196] ata1: SATA max UDMA/133 abar m4096@0xc0a02000 port 0xc0a02100 irq 120 
[Mon Dec 9 06:17:12 2019][ OK ] Started Show Plymouth Boot Screen. [Mon Dec 9 06:17:12 2019][ 7.202263] mlx5_core 0000:01:00.0: firmware version: 20.26.1040 [Mon Dec 9 06:17:12 2019][ 7.209296] mlx5_core 0000:01:00.0: 126.016 Gb/s available PCIe bandwidth, limited by 8 GT/s x16 link at 0000:00:03.1 (capable of 252.048 Gb/s with 16 GT/s x16 link) [Mon Dec 9 06:17:12 2019][ OK ] Started Forward Password Requests to Plymouth Directory Watch. [Mon Dec 9 06:17:12 2019][ OK ] Reached target Paths. [Mon Dec 9 06:17:12 2019][ 7.235790] mpt3sas_cm0: IOC Number : 0 [Mon Dec 9 06:17:12 2019][ OK ] Reached target Basic System. [Mon Dec 9 06:17:12 2019][ 7.242790] mpt3sas0-msix0: PCI-MSI-X enabled: IRQ 137 [Mon Dec 9 06:17:12 2019][ 7.248115] mpt3sas0-msix1: PCI-MSI-X enabled: IRQ 138 [Mon Dec 9 06:17:12 2019][ 7.254588] mpt3sas0-msix2: PCI-MSI-X enabled: IRQ 139 [Mon Dec 9 06:17:12 2019][ 7.260763] mpt3sas0-msix3: PCI-MSI-X enabled: IRQ 140 [Mon Dec 9 06:17:12 2019][ 7.265911] mpt3sas0-msix4: PCI-MSI-X enabled: IRQ 141 [Mon Dec 9 06:17:12 2019][ 7.271062] mpt3sas0-msix5: PCI-MSI-X enabled: IRQ 142 [Mon Dec 9 06:17:12 2019][ 7.271064] mpt3sas0-msix6: PCI-MSI-X enabled: IRQ 143 [Mon Dec 9 06:17:12 2019][ 7.271065] mpt3sas0-msix7: PCI-MSI-X enabled: IRQ 144 [Mon Dec 9 06:17:12 2019][ 7.271068] mpt3sas0-msix8: PCI-MSI-X enabled: IRQ 145 [Mon Dec 9 06:17:12 2019][ 7.271071] mpt3sas0-msix9: PCI-MSI-X enabled: IRQ 146 [Mon Dec 9 06:17:12 2019][ 7.271072] mpt3sas0-msix10: PCI-MSI-X enabled: IRQ 147 [Mon Dec 9 06:17:12 2019][ 7.271072] mpt3sas0-msix11: PCI-MSI-X enabled: IRQ 148 [Mon Dec 9 06:17:12 2019][ 7.271073] mpt3sas0-msix12: PCI-MSI-X enabled: IRQ 149 [Mon Dec 9 06:17:12 2019][ 7.271074] mpt3sas0-msix13: PCI-MSI-X enabled: IRQ 150 [Mon Dec 9 06:17:12 2019][ 7.271074] mpt3sas0-msix14: PCI-MSI-X enabled: IRQ 151 [Mon Dec 9 06:17:12 2019][ 7.271075] mpt3sas0-msix15: PCI-MSI-X enabled: IRQ 152 [Mon Dec 9 06:17:12 2019][ 7.271075] mpt3sas0-msix16: PCI-MSI-X enabled: IRQ 153 [Mon Dec 9 06:17:12 2019][ 7.271076] mpt3sas0-msix17: PCI-MSI-X enabled: IRQ 154 [Mon Dec 9 06:17:12 2019][ 7.271076] mpt3sas0-msix18: PCI-MSI-X enabled: IRQ 155 [Mon Dec 9 06:17:12 2019][ 7.271077] mpt3sas0-msix19: PCI-MSI-X enabled: IRQ 156 [Mon Dec 9 06:17:12 2019][ 7.271077] mpt3sas0-msix20: PCI-MSI-X enabled: IRQ 157 [Mon Dec 9 06:17:12 2019][ 7.271079] mpt3sas0-msix21: PCI-MSI-X enabled: IRQ 158 [Mon Dec 9 06:17:12 2019][ 7.271080] mpt3sas0-msix22: PCI-MSI-X enabled: IRQ 159 [Mon Dec 9 06:17:12 2019][ 7.271082] mpt3sas0-msix23: PCI-MSI-X enabled: IRQ 160 [Mon Dec 9 06:17:12 2019][ 7.271083] mpt3sas0-msix24: PCI-MSI-X enabled: IRQ 161 [Mon Dec 9 06:17:12 2019][ 7.271083] mpt3sas0-msix25: PCI-MSI-X enabled: IRQ 162 [Mon Dec 9 06:17:12 2019][ 7.271084] mpt3sas0-msix26: PCI-MSI-X enabled: IRQ 163 [Mon Dec 9 06:17:12 2019][ 7.271084] mpt3sas0-msix27: PCI-MSI-X enabled: IRQ 164 [Mon Dec 9 06:17:12 2019][ 7.271085] mpt3sas0-msix28: PCI-MSI-X enabled: IRQ 165 [Mon Dec 9 06:17:12 2019][ 7.271085] mpt3sas0-msix29: PCI-MSI-X enabled: IRQ 166 [Mon Dec 9 06:17:12 2019][ 7.271086] mpt3sas0-msix30: PCI-MSI-X enabled: IRQ 167 [Mon Dec 9 06:17:12 2019][ 7.271086] mpt3sas0-msix31: PCI-MSI-X enabled: IRQ 168 [Mon Dec 9 06:17:12 2019][ 7.271087] mpt3sas0-msix32: PCI-MSI-X enabled: IRQ 169 [Mon Dec 9 06:17:12 2019][ 7.271087] mpt3sas0-msix33: PCI-MSI-X enabled: IRQ 170 [Mon Dec 9 06:17:12 2019][ 7.271088] mpt3sas0-msix34: PCI-MSI-X enabled: IRQ 171 [Mon Dec 9 06:17:12 2019][ 7.271089] mpt3sas0-msix35: PCI-MSI-X enabled: IRQ 172 [Mon Dec 9 06:17:12 
2019][ 7.271089] mpt3sas0-msix36: PCI-MSI-X enabled: IRQ 173 [Mon Dec 9 06:17:12 2019][ 7.271090] mpt3sas0-msix37: PCI-MSI-X enabled: IRQ 174 [Mon Dec 9 06:17:12 2019][ 7.271091] mpt3sas0-msix38: PCI-MSI-X enabled: IRQ 175 [Mon Dec 9 06:17:12 2019][ 7.271093] mpt3sas0-msix39: PCI-MSI-X enabled: IRQ 176 [Mon Dec 9 06:17:12 2019][ 7.271094] mpt3sas0-msix40: PCI-MSI-X enabled: IRQ 177 [Mon Dec 9 06:17:12 2019][ 7.271095] mpt3sas0-msix41: PCI-MSI-X enabled: IRQ 178 [Mon Dec 9 06:17:12 2019][ 7.271095] mpt3sas0-msix42: PCI-MSI-X enabled: IRQ 179 [Mon Dec 9 06:17:12 2019][ 7.271096] mpt3sas0-msix43: PCI-MSI-X enabled: IRQ 180 [Mon Dec 9 06:17:12 2019][ 7.271096] mpt3sas0-msix44: PCI-MSI-X enabled: IRQ 181 [Mon Dec 9 06:17:12 2019][ 7.271097] mpt3sas0-msix45: PCI-MSI-X enabled: IRQ 182 [Mon Dec 9 06:17:12 2019][ 7.271097] mpt3sas0-msix46: PCI-MSI-X enabled: IRQ 183 [Mon Dec 9 06:17:12 2019][ 7.271098] mpt3sas0-msix47: PCI-MSI-X enabled: IRQ 184 [Mon Dec 9 06:17:12 2019][ 7.271100] mpt3sas_cm0: iomem(0x00000000ac000000), mapped(0xffffa48f5a000000), size(1048576) [Mon Dec 9 06:17:12 2019][ 7.271101] mpt3sas_cm0: ioport(0x0000000000008000), size(256) [Mon Dec 9 06:17:12 2019][ 7.332790] megaraid_sas 0000:c1:00.0: Init cmd return status SUCCESS for SCSI host 0 [Mon Dec 9 06:17:12 2019][ 7.349787] mpt3sas_cm0: IOC Number : 0 [Mon Dec 9 06:17:12 2019][ 7.349790] mpt3sas_cm0: sending message unit reset !! [Mon Dec 9 06:17:12 2019][ 7.351785] mpt3sas_cm0: message unit reset: SUCCESS [Mon Dec 9 06:17:12 2019][ 7.353787] megaraid_sas 0000:c1:00.0: firmware type : Legacy(64 VD) firmware [Mon Dec 9 06:17:12 2019][ 7.353788] megaraid_sas 0000:c1:00.0: controller type : iMR(0MB) [Mon Dec 9 06:17:12 2019][ 7.353790] megaraid_sas 0000:c1:00.0: Online Controller Reset(OCR) : Enabled [Mon Dec 9 06:17:12 2019][ 7.353791] megaraid_sas 0000:c1:00.0: Secure JBOD support : No [Mon Dec 9 06:17:12 2019][ 7.353792] megaraid_sas 0000:c1:00.0: NVMe passthru support : No [Mon Dec 9 06:17:12 2019][ 7.375309] megaraid_sas 0000:c1:00.0: INIT adapter done [Mon Dec 9 06:17:12 2019][ 7.375311] megaraid_sas 0000:c1:00.0: Jbod map is not supported megasas_setup_jbod_map 5146 [Mon Dec 9 06:17:12 2019][ 7.401656] megaraid_sas 0000:c1:00.0: pci id : (0x1000)/(0x005f)/(0x1028)/(0x1f4b) [Mon Dec 9 06:17:12 2019][ 7.401657] megaraid_sas 0000:c1:00.0: unevenspan support : yes [Mon Dec 9 06:17:12 2019][ 7.401658] megaraid_sas 0000:c1:00.0: firmware crash dump : no [Mon Dec 9 06:17:12 2019][ 7.401659] megaraid_sas 0000:c1:00.0: jbod sync map : no [Mon Dec 9 06:17:12 2019][ 7.401664] scsi host0: Avago SAS based MegaRAID driver [Mon Dec 9 06:17:12 2019][ 7.421497] scsi 0:2:0:0: Direct-Access DELL PERC H330 Mini 4.30 PQ: 0 ANSI: 5 [Mon Dec 9 06:17:12 2019][ 7.487112] mlx5_core 0000:01:00.0: Port module event: module 0, Cable plugged [Mon Dec 9 06:17:12 2019][ 7.487375] mlx5_core 0000:01:00.0: mlx5_pcie_event:303:(pid 319): PCIe slot advertised sufficient power (27W). 
[Mon Dec 9 06:17:12 2019][ 7.494745] mlx5_core 0000:01:00.0: mlx5_fw_tracer_start:776:(pid 300): FWTracer: Ownership granted and active [Mon Dec 9 06:17:12 2019][ 7.504801] ata1: SATA link down (SStatus 0 SControl 300) [Mon Dec 9 06:17:12 2019][ 7.518414] mpt3sas_cm0: Allocated physical memory: size(38831 kB) [Mon Dec 9 06:17:12 2019][ 7.518416] mpt3sas_cm0: Current Controller Queue Depth(7564), Max Controller Queue Depth(7680) [Mon Dec 9 06:17:12 2019][ 7.518416] mpt3sas_cm0: Scatter Gather Elements per IO(128) [Mon Dec 9 06:17:12 2019][ 7.662398] mpt3sas_cm0: FW Package Version(12.00.00.00) [Mon Dec 9 06:17:12 2019][ 7.662689] mpt3sas_cm0: SAS3616: FWVersion(12.00.00.00), ChipRevision(0x02), BiosVersion(09.21.00.00) [Mon Dec 9 06:17:12 2019][ 7.662693] mpt3sas_cm0: Protocol=(Initiator,Target,NVMe), Capabilities=(TLR,EEDP,Diag Trace Buffer,Task Set Full,NCQ) [Mon Dec 9 06:17:12 2019][ 7.662761] mpt3sas 0000:84:00.0: Enabled Extended Tags as Controller Supports [Mon Dec 9 06:17:12 2019][ 7.662776] mpt3sas_cm0: : host protection capabilities enabled DIF1 DIF2 DIF3 [Mon Dec 9 06:17:12 2019][ 7.662791] scsi host1: Fusion MPT SAS Host [Mon Dec 9 06:17:12 2019][ 7.663034] mpt3sas_cm0: registering trace buffer support [Mon Dec 9 06:17:12 2019][ 7.667351] mpt3sas_cm0: Trace buffer memory 2048 KB allocated [Mon Dec 9 06:17:12 2019][ 7.667351] mpt3sas_cm0: sending port enable !! [Mon Dec 9 06:17:12 2019][ 7.667992] mpt3sas_cm0: hba_port entry: ffff8912eff13780, port: 255 is added to hba_port list [Mon Dec 9 06:17:12 2019][ 7.670505] mpt3sas_cm0: host_add: handle(0x0001), sas_addr(0x500605b00e718b40), phys(21) [Mon Dec 9 06:17:12 2019][ 7.672841] mpt3sas_cm0: detecting: handle(0x0011), sas_address(0x300705b00e718b40), phy(16) [Mon Dec 9 06:17:12 2019][ 7.672845] mpt3sas_cm0: REPORT_LUNS: handle(0x0011), retries(0) [Mon Dec 9 06:17:12 2019][ 7.672869] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0011), lun(0) [Mon Dec 9 06:17:12 2019][ 7.673248] scsi 1:0:0:0: Enclosure LSI VirtualSES 03 PQ: 0 ANSI: 7 [Mon Dec 9 06:17:12 2019][ 7.673272] scsi 1:0:0:0: set ignore_delay_remove for handle(0x0011) [Mon Dec 9 06:17:12 2019][ 7.673275] scsi 1:0:0:0: SES: handle(0x0011), sas_addr(0x300705b00e718b40), phy(16), device_name(0x300705b00e718b40) [Mon Dec 9 06:17:13 2019][ 7.673276] scsi 1:0:0:0: enclosure logical id(0x300605b00e118b40), slot(16) [Mon Dec 9 06:17:13 2019][ 7.673278] scsi 1:0:0:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 7.673279] scsi 1:0:0:0: serial_number(300605B00E118B40) [Mon Dec 9 06:17:13 2019][ 7.673281] scsi 1:0:0:0: qdepth(1), tagged(0), simple(0), ordered(0), scsi_level(8), cmd_que(0) [Mon Dec 9 06:17:13 2019][ 7.673302] mpt3sas_cm0: log_info(0x31200206): originator(PL), code(0x20), sub_code(0x0206) [Mon Dec 9 06:17:13 2019][ 7.875511] mpt3sas_cm0: expander_add: handle(0x0058), parent(0x0001), sas_addr(0x5000ccab040371fd), phys(49) [Mon Dec 9 06:17:13 2019][ 7.885870] mlx5_ib: Mellanox Connect-IB Infiniband driver v4.7-1.0.0 [Mon Dec 9 06:17:13 2019][ 7.896333] mpt3sas_cm0: detecting: handle(0x005c), sas_address(0x5000ccab040371fc), phy(48) [Mon Dec 9 06:17:13 2019][ 7.904779] mpt3sas_cm0: REPORT_LUNS: handle(0x005c), retries(0) [Mon Dec 9 06:17:13 2019][ 7.906143] mpt3sas_cm0: TEST_UNIT_READY: handle(0x005c), lun(0) [Mon Dec 9 06:17:13 2019][ 7.907155] scsi 1:0:1:0: Enclosure HGST H4060-J 2033 PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 7.907348] scsi 1:0:1:0: set ignore_delay_remove for handle(0x005c) [Mon Dec 9 06:17:13 2019][ 7.907351] scsi 1:0:1:0: SES:
handle(0x005c), sas_addr(0x5000ccab040371fc), phy(48), device_name(0x0000000000000000) [Mon Dec 9 06:17:13 2019][ 7.907352] scsi 1:0:1:0: enclosure logical id(0x5000ccab04037180), slot(60) [Mon Dec 9 06:17:13 2019][ 7.907353] scsi 1:0:1:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 7.907354] scsi 1:0:1:0: serial_number(USWSJ03918EZ0028 ) [Mon Dec 9 06:17:13 2019][ 7.907356] scsi 1:0:1:0: qdepth(1), tagged(0), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 7.975890] sd 0:2:0:0: [sda] 467664896 512-byte logical blocks: (239 GB/223 GiB) [Mon Dec 9 06:17:13 2019][ 7.983560] sd 0:2:0:0: [sda] Write Protect is off [Mon Dec 9 06:17:13 2019][ 7.988423] sd 0:2:0:0: [sda] Write cache: disabled, read cache: disabled, supports DPO and FUA [Mon Dec 9 06:17:13 2019][ 7.999378] sda: sda1 sda2 sda3 [Mon Dec 9 06:17:13 2019][ 8.003064] sd 0:2:0:0: [sda] Attached SCSI disk [Mon Dec 9 06:17:13 2019][ 8.004409] mpt3sas_cm0: expander_add: handle(0x005a), parent(0x0058), sas_addr(0x5000ccab040371f9), phys(68) [Mon Dec 9 06:17:13 2019][ 8.014187] mpt3sas_cm0: detecting: handle(0x005d), sas_address(0x5000cca2525f2a26), phy(0) [Mon Dec 9 06:17:13 2019][ 8.014189] mpt3sas_cm0: REPORT_LUNS: handle(0x005d), retries(0) [Mon Dec 9 06:17:13 2019][ 8.014331] mpt3sas_cm0: TEST_UNIT_READY: handle(0x005d), lun(0) [Mon Dec 9 06:17:13 2019][ 8.015159] scsi 1:0:2:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.015265] scsi 1:0:2:0: SSP: handle(0x005d), sas_addr(0x5000cca2525f2a26), phy(0), device_name(0x5000cca2525f2a27) [Mon Dec 9 06:17:13 2019][ 8.015266] scsi 1:0:2:0: enclosure logical id(0x5000ccab04037180), slot(0) [Mon Dec 9 06:17:13 2019][ 8.015268] scsi 1:0:2:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.015269] scsi 1:0:2:0: serial_number( 7SHPAG1W) [Mon Dec 9 06:17:13 2019][ 8.015271] scsi 1:0:2:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.028932] random: crng init done [Mon Dec 9 06:17:13 2019][ 8.101982] mpt3sas_cm0: detecting: handle(0x005e), sas_address(0x5000cca2525e977e), phy(1) [Mon Dec 9 06:17:13 2019][ 8.110337] mpt3sas_cm0: REPORT_LUNS: handle(0x005e), retries(0) [Mon Dec 9 06:17:13 2019][ 8.116464] mpt3sas_cm0: TEST_UNIT_READY: handle(0x005e), lun(0) [Mon Dec 9 06:17:13 2019][ 8.123114] scsi 1:0:3:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.131363] scsi 1:0:3:0: SSP: handle(0x005e), sas_addr(0x5000cca2525e977e), phy(1), device_name(0x5000cca2525e977f) [Mon Dec 9 06:17:13 2019][ 8.141878] scsi 1:0:3:0: enclosure logical id(0x5000ccab04037180), slot(2) [Mon Dec 9 06:17:13 2019][ 8.148923] scsi 1:0:3:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.155644] scsi 1:0:3:0: serial_number( 7SHP0P8W) [Mon Dec 9 06:17:13 2019][ 8.161042] scsi 1:0:3:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.180986] mpt3sas_cm0: detecting: handle(0x005f), sas_address(0x5000cca2525ed2be), phy(2) [Mon Dec 9 06:17:13 2019][ 8.189343] mpt3sas_cm0: REPORT_LUNS: handle(0x005f), retries(0) [Mon Dec 9 06:17:13 2019][ 8.195505] mpt3sas_cm0: TEST_UNIT_READY: handle(0x005f), lun(0) [Mon Dec 9 06:17:13 2019][ 8.202347] scsi 1:0:4:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.210576] scsi 1:0:4:0: SSP: handle(0x005f), sas_addr(0x5000cca2525ed2be), phy(2), device_name(0x5000cca2525ed2bf) [Mon 
Dec 9 06:17:13 2019][ 8.221092] scsi 1:0:4:0: enclosure logical id(0x5000ccab04037180), slot(11) [Mon Dec 9 06:17:13 2019][ 8.228226] scsi 1:0:4:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.234945] scsi 1:0:4:0: serial_number( 7SHP4MLW) [Mon Dec 9 06:17:13 2019][ 8.240345] scsi 1:0:4:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.259985] mpt3sas_cm0: detecting: handle(0x0060), sas_address(0x5000cca2525ec04a), phy(3) [Mon Dec 9 06:17:13 2019][ 8.268332] mpt3sas_cm0: REPORT_LUNS: handle(0x0060), retries(0) [Mon Dec 9 06:17:13 2019][ 8.274464] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0060), lun(0) [Mon Dec 9 06:17:13 2019][ 8.281104] scsi 1:0:5:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.289316] scsi 1:0:5:0: SSP: handle(0x0060), sas_addr(0x5000cca2525ec04a), phy(3), device_name(0x5000cca2525ec04b) [Mon Dec 9 06:17:13 2019][ 8.299832] scsi 1:0:5:0: enclosure logical id(0x5000ccab04037180), slot(12) [Mon Dec 9 06:17:13 2019][ 8.306963] scsi 1:0:5:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.313683] scsi 1:0:5:0: serial_number( 7SHP3DHW) [Mon Dec 9 06:17:13 2019][ 8.319083] scsi 1:0:5:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.342020] mpt3sas_cm0: detecting: handle(0x0061), sas_address(0x5000cca2525ff612), phy(4) [Mon Dec 9 06:17:13 2019][ 8.350373] mpt3sas_cm0: REPORT_LUNS: handle(0x0061), retries(0) [Mon Dec 9 06:17:13 2019][ 8.356520] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0061), lun(0) [Mon Dec 9 06:17:13 2019][ 8.363432] scsi 1:0:6:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.371654] scsi 1:0:6:0: SSP: handle(0x0061), sas_addr(0x5000cca2525ff612), phy(4), device_name(0x5000cca2525ff613) [Mon Dec 9 06:17:13 2019][ 8.382166] scsi 1:0:6:0: enclosure logical id(0x5000ccab04037180), slot(13) [Mon Dec 9 06:17:13 2019][ 8.389299] scsi 1:0:6:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.396020] scsi 1:0:6:0: serial_number( 7SHPT11W) [Mon Dec 9 06:17:13 2019][ 8.401418] scsi 1:0:6:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.424003] mpt3sas_cm0: detecting: handle(0x0062), sas_address(0x5000cca2526016ee), phy(5) [Mon Dec 9 06:17:13 2019][ 8.432355] mpt3sas_cm0: REPORT_LUNS: handle(0x0062), retries(0) [Mon Dec 9 06:17:13 2019][ 8.438525] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0062), lun(0) [Mon Dec 9 06:17:13 2019][ 8.445150] scsi 1:0:7:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.453366] scsi 1:0:7:0: SSP: handle(0x0062), sas_addr(0x5000cca2526016ee), phy(5), device_name(0x5000cca2526016ef) [Mon Dec 9 06:17:13 2019][ 8.463879] scsi 1:0:7:0: enclosure logical id(0x5000ccab04037180), slot(14) [Mon Dec 9 06:17:13 2019][ 8.471012] scsi 1:0:7:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.477729] scsi 1:0:7:0: serial_number( 7SHPV6WW) [Mon Dec 9 06:17:13 2019][ 8.483129] scsi 1:0:7:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.506002] mpt3sas_cm0: detecting: handle(0x0063), sas_address(0x5000cca2525f4872), phy(6) [Mon Dec 9 06:17:13 2019][ 8.514351] mpt3sas_cm0: REPORT_LUNS: handle(0x0063), retries(0) [Mon Dec 9 06:17:13 2019][ 8.520518] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0063), lun(0) [Mon Dec 9 06:17:13 2019][ 
8.527324] scsi 1:0:8:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.535539] scsi 1:0:8:0: SSP: handle(0x0063), sas_addr(0x5000cca2525f4872), phy(6), device_name(0x5000cca2525f4873) [Mon Dec 9 06:17:13 2019][ 8.546049] scsi 1:0:8:0: enclosure logical id(0x5000ccab04037180), slot(15) [Mon Dec 9 06:17:13 2019][ 8.553182] scsi 1:0:8:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.559900] scsi 1:0:8:0: serial_number( 7SHPDGLW) [Mon Dec 9 06:17:13 2019][ 8.565302] scsi 1:0:8:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.588007] mpt3sas_cm0: detecting: handle(0x0064), sas_address(0x5000cca2525f568e), phy(7) [Mon Dec 9 06:17:13 2019][ 8.596357] mpt3sas_cm0: REPORT_LUNS: handle(0x0064), retries(0) [Mon Dec 9 06:17:13 2019][ 8.602528] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0064), lun(0) [Mon Dec 9 06:17:13 2019][ 8.609165] scsi 1:0:9:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.617378] scsi 1:0:9:0: SSP: handle(0x0064), sas_addr(0x5000cca2525f568e), phy(7), device_name(0x5000cca2525f568f) [Mon Dec 9 06:17:13 2019][ 8.627891] scsi 1:0:9:0: enclosure logical id(0x5000ccab04037180), slot(16) [Mon Dec 9 06:17:13 2019][ 8.635023] scsi 1:0:9:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.641740] scsi 1:0:9:0: serial_number( 7SHPEDRW) [Mon Dec 9 06:17:13 2019][ 8.647141] scsi 1:0:9:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.666999] mpt3sas_cm0: detecting: handle(0x0065), sas_address(0x5000cca2525f6c26), phy(8) [Mon Dec 9 06:17:13 2019][ 8.675346] mpt3sas_cm0: REPORT_LUNS: handle(0x0065), retries(0) [Mon Dec 9 06:17:13 2019][ 8.681507] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0065), lun(0) [Mon Dec 9 06:17:13 2019][ 8.688143] scsi 1:0:10:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.696441] scsi 1:0:10:0: SSP: handle(0x0065), sas_addr(0x5000cca2525f6c26), phy(8), device_name(0x5000cca2525f6c27) [Mon Dec 9 06:17:13 2019][ 8.707035] scsi 1:0:10:0: enclosure logical id(0x5000ccab04037180), slot(17) [Mon Dec 9 06:17:13 2019][ 8.714255] scsi 1:0:10:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.721059] scsi 1:0:10:0: serial_number( 7SHPGV9W) [Mon Dec 9 06:17:13 2019][ 8.726548] scsi 1:0:10:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.748996] mpt3sas_cm0: detecting: handle(0x0066), sas_address(0x5000cca2525ed402), phy(9) [Mon Dec 9 06:17:13 2019][ 8.757344] mpt3sas_cm0: REPORT_LUNS: handle(0x0066), retries(0) [Mon Dec 9 06:17:13 2019][ 8.763511] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0066), lun(0) [Mon Dec 9 06:17:14 2019][ 8.786731] scsi 1:0:11:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 8.795017] scsi 1:0:11:0: SSP: handle(0x0066), sas_addr(0x5000cca2525ed402), phy(9), device_name(0x5000cca2525ed403) [Mon Dec 9 06:17:14 2019][ 8.805614] scsi 1:0:11:0: enclosure logical id(0x5000ccab04037180), slot(18) [Mon Dec 9 06:17:14 2019][ 8.812834] scsi 1:0:11:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 8.819623] scsi 1:0:11:0: serial_number( 7SHP4R6W) [Mon Dec 9 06:17:14 2019][ 8.825107] scsi 1:0:11:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 8.844998] mpt3sas_cm0: detecting: handle(0x0067), 
sas_address(0x5000cca2525e0406), phy(10) [Mon Dec 9 06:17:14 2019][ 8.853433] mpt3sas_cm0: REPORT_LUNS: handle(0x0067), retries(0) [Mon Dec 9 06:17:14 2019][ 8.859594] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0067), lun(0) [Mon Dec 9 06:17:14 2019][ 8.866219] scsi 1:0:12:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 8.874521] scsi 1:0:12:0: SSP: handle(0x0067), sas_addr(0x5000cca2525e0406), phy(10), device_name(0x5000cca2525e0407) [Mon Dec 9 06:17:14 2019][ 8.885211] scsi 1:0:12:0: enclosure logical id(0x5000ccab04037180), slot(19) [Mon Dec 9 06:17:14 2019][ 8.892431] scsi 1:0:12:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 8.899234] scsi 1:0:12:0: serial_number( 7SHNPVUW) [Mon Dec 9 06:17:14 2019][ 8.904723] scsi 1:0:12:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 8.928001] mpt3sas_cm0: detecting: handle(0x0068), sas_address(0x5000cca2525ea9e6), phy(11) [Mon Dec 9 06:17:14 2019][ 8.936436] mpt3sas_cm0: REPORT_LUNS: handle(0x0068), retries(0) [Mon Dec 9 06:17:14 2019][ 8.942610] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0068), lun(0) [Mon Dec 9 06:17:14 2019][ 8.949232] scsi 1:0:13:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 8.957533] scsi 1:0:13:0: SSP: handle(0x0068), sas_addr(0x5000cca2525ea9e6), phy(11), device_name(0x5000cca2525ea9e7) [Mon Dec 9 06:17:14 2019][ 8.968222] scsi 1:0:13:0: enclosure logical id(0x5000ccab04037180), slot(20) [Mon Dec 9 06:17:14 2019][ 8.975440] scsi 1:0:13:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 8.982245] scsi 1:0:13:0: serial_number( 7SHP1X8W) [Mon Dec 9 06:17:14 2019][ 8.987734] scsi 1:0:13:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 9.011014] mpt3sas_cm0: detecting: handle(0x0069), sas_address(0x5000cca2525f1d3a), phy(12) [Mon Dec 9 06:17:14 2019][ 9.019446] mpt3sas_cm0: REPORT_LUNS: handle(0x0069), retries(0) [Mon Dec 9 06:17:14 2019][ 9.025579] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0069), lun(0) [Mon Dec 9 06:17:14 2019][ 9.032209] scsi 1:0:14:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 9.040511] scsi 1:0:14:0: SSP: handle(0x0069), sas_addr(0x5000cca2525f1d3a), phy(12), device_name(0x5000cca2525f1d3b) [Mon Dec 9 06:17:14 2019][ 9.051199] scsi 1:0:14:0: enclosure logical id(0x5000ccab04037180), slot(21) [Mon Dec 9 06:17:14 2019][ 9.058418] scsi 1:0:14:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 9.065227] scsi 1:0:14:0: serial_number( 7SHP9LBW) [Mon Dec 9 06:17:14 2019][ 9.070721] scsi 1:0:14:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 9.093009] mpt3sas_cm0: detecting: handle(0x006a), sas_address(0x5000cca2525ea49a), phy(13) [Mon Dec 9 06:17:14 2019][ 9.101446] mpt3sas_cm0: REPORT_LUNS: handle(0x006a), retries(0) [Mon Dec 9 06:17:14 2019][ 9.107619] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006a), lun(0) [Mon Dec 9 06:17:14 2019][ 9.114444] scsi 1:0:15:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 9.122753] scsi 1:0:15:0: SSP: handle(0x006a), sas_addr(0x5000cca2525ea49a), phy(13), device_name(0x5000cca2525ea49b) [Mon Dec 9 06:17:14 2019][ 9.133438] scsi 1:0:15:0: enclosure logical id(0x5000ccab04037180), slot(22) [Mon Dec 9 06:17:14 2019][ 9.140657] scsi 1:0:15:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 
9.147461] scsi 1:0:15:0: serial_number( 7SHP1KAW) [Mon Dec 9 06:17:14 2019][ 9.152951] scsi 1:0:15:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 9.324024] mpt3sas_cm0: detecting: handle(0x006b), sas_address(0x5000cca2525fba06), phy(14) [Mon Dec 9 06:17:14 2019][ 9.332459] mpt3sas_cm0: REPORT_LUNS: handle(0x006b), retries(0) [Mon Dec 9 06:17:14 2019][ 9.338592] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006b), lun(0) [Mon Dec 9 06:17:14 2019][ 9.355556] scsi 1:0:16:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 9.363907] scsi 1:0:16:0: SSP: handle(0x006b), sas_addr(0x5000cca2525fba06), phy(14), device_name(0x5000cca2525fba07) [Mon Dec 9 06:17:14 2019][ 9.374594] scsi 1:0:16:0: enclosure logical id(0x5000ccab04037180), slot(23) [Mon Dec 9 06:17:14 2019][ 9.381814] scsi 1:0:16:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 9.388619] scsi 1:0:16:0: serial_number( 7SHPN12W) [Mon Dec 9 06:17:14 2019][ 9.394105] scsi 1:0:16:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 9.420024] mpt3sas_cm0: detecting: handle(0x006c), sas_address(0x5000cca2525e121e), phy(15) [Mon Dec 9 06:17:14 2019][ 9.428466] mpt3sas_cm0: REPORT_LUNS: handle(0x006c), retries(0) [Mon Dec 9 06:17:14 2019][ 9.434607] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006c), lun(0) [Mon Dec 9 06:17:14 2019][ 9.441234] scsi 1:0:17:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 9.449532] scsi 1:0:17:0: SSP: handle(0x006c), sas_addr(0x5000cca2525e121e), phy(15), device_name(0x5000cca2525e121f) [Mon Dec 9 06:17:14 2019][ 9.460214] scsi 1:0:17:0: enclosure logical id(0x5000ccab04037180), slot(24) [Mon Dec 9 06:17:14 2019][ 9.467433] scsi 1:0:17:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 9.474238] scsi 1:0:17:0: serial_number( 7SHNRTXW) [Mon Dec 9 06:17:14 2019][ 9.479728] scsi 1:0:17:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 9.500015] mpt3sas_cm0: detecting: handle(0x006d), sas_address(0x5000cca2525e98f6), phy(16) [Mon Dec 9 06:17:14 2019][ 9.508450] mpt3sas_cm0: REPORT_LUNS: handle(0x006d), retries(0) [Mon Dec 9 06:17:14 2019][ 9.514580] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006d), lun(0) [Mon Dec 9 06:17:14 2019][ 9.528640] scsi 1:0:18:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 9.536926] scsi 1:0:18:0: SSP: handle(0x006d), sas_addr(0x5000cca2525e98f6), phy(16), device_name(0x5000cca2525e98f7) [Mon Dec 9 06:17:14 2019][ 9.547611] scsi 1:0:18:0: enclosure logical id(0x5000ccab04037180), slot(25) [Mon Dec 9 06:17:14 2019][ 9.554831] scsi 1:0:18:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 9.561624] scsi 1:0:18:0: serial_number( 7SHP0T9W) [Mon Dec 9 06:17:14 2019][ 9.567117] scsi 1:0:18:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 9.590261] mpt3sas_cm0: detecting: handle(0x006e), sas_address(0x5000cca2525f8176), phy(17) [Mon Dec 9 06:17:14 2019][ 9.598700] mpt3sas_cm0: REPORT_LUNS: handle(0x006e), retries(0) [Mon Dec 9 06:17:14 2019][ 9.604839] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006e), lun(0) [Mon Dec 9 06:17:14 2019][ 9.611468] scsi 1:0:19:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 9.619774] scsi 1:0:19:0: SSP: handle(0x006e), sas_addr(0x5000cca2525f8176), phy(17), 
device_name(0x5000cca2525f8177) [Mon Dec 9 06:17:14 2019][ 9.630457] scsi 1:0:19:0: enclosure logical id(0x5000ccab04037180), slot(26) [Mon Dec 9 06:17:14 2019][ 9.637677] scsi 1:0:19:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 9.644484] scsi 1:0:19:0: serial_number( 7SHPJ89W) [Mon Dec 9 06:17:14 2019][ 9.649970] scsi 1:0:19:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 9.670019] mpt3sas_cm0: detecting: handle(0x006f), sas_address(0x5000cca2525fb01e), phy(18) [Mon Dec 9 06:17:14 2019][ 9.678459] mpt3sas_cm0: REPORT_LUNS: handle(0x006f), retries(0) [Mon Dec 9 06:17:14 2019][ 9.684593] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006f), lun(0) [Mon Dec 9 06:17:14 2019][ 9.691231] scsi 1:0:20:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 9.699531] scsi 1:0:20:0: SSP: handle(0x006f), sas_addr(0x5000cca2525fb01e), phy(18), device_name(0x5000cca2525fb01f) [Mon Dec 9 06:17:14 2019][ 9.710218] scsi 1:0:20:0: enclosure logical id(0x5000ccab04037180), slot(27) [Mon Dec 9 06:17:14 2019][ 9.717438] scsi 1:0:20:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 9.724242] scsi 1:0:20:0: serial_number( 7SHPMBMW) [Mon Dec 9 06:17:14 2019][ 9.729732] scsi 1:0:20:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 9.750021] mpt3sas_cm0: detecting: handle(0x0070), sas_address(0x5000cca2525ed54a), phy(19) [Mon Dec 9 06:17:14 2019][ 9.758455] mpt3sas_cm0: REPORT_LUNS: handle(0x0070), retries(0) [Mon Dec 9 06:17:14 2019][ 9.764619] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0070), lun(0) [Mon Dec 9 06:17:14 2019][ 9.771265] scsi 1:0:21:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 9.779567] scsi 1:0:21:0: SSP: handle(0x0070), sas_addr(0x5000cca2525ed54a), phy(19), device_name(0x5000cca2525ed54b) [Mon Dec 9 06:17:15 2019][ 9.790249] scsi 1:0:21:0: enclosure logical id(0x5000ccab04037180), slot(28) [Mon Dec 9 06:17:15 2019][ 9.797469] scsi 1:0:21:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 9.804273] scsi 1:0:21:0: serial_number( 7SHP4TVW) [Mon Dec 9 06:17:15 2019][ 9.809762] scsi 1:0:21:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 9.832022] mpt3sas_cm0: detecting: handle(0x0071), sas_address(0x5000cca2525fa036), phy(20) [Mon Dec 9 06:17:15 2019][ 9.840462] mpt3sas_cm0: REPORT_LUNS: handle(0x0071), retries(0) [Mon Dec 9 06:17:15 2019][ 9.846635] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0071), lun(0) [Mon Dec 9 06:17:15 2019][ 9.853275] scsi 1:0:22:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 9.861574] scsi 1:0:22:0: SSP: handle(0x0071), sas_addr(0x5000cca2525fa036), phy(20), device_name(0x5000cca2525fa037) [Mon Dec 9 06:17:15 2019][ 9.872255] scsi 1:0:22:0: enclosure logical id(0x5000ccab04037180), slot(29) [Mon Dec 9 06:17:15 2019][ 9.879475] scsi 1:0:22:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 9.886279] scsi 1:0:22:0: serial_number( 7SHPL9TW) [Mon Dec 9 06:17:15 2019][ 9.891767] scsi 1:0:22:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 9.912024] mpt3sas_cm0: detecting: handle(0x0072), sas_address(0x5000cca2525fb942), phy(21) [Mon Dec 9 06:17:15 2019][ 9.920464] mpt3sas_cm0: REPORT_LUNS: handle(0x0072), retries(0) [Mon Dec 9 06:17:15 2019][ 9.926599] mpt3sas_cm0: 
TEST_UNIT_READY: handle(0x0072), lun(0) [Mon Dec 9 06:17:15 2019][ 9.944464] scsi 1:0:23:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 9.952764] scsi 1:0:23:0: SSP: handle(0x0072), sas_addr(0x5000cca2525fb942), phy(21), device_name(0x5000cca2525fb943) [Mon Dec 9 06:17:15 2019][ 9.963448] scsi 1:0:23:0: enclosure logical id(0x5000ccab04037180), slot(30) [Mon Dec 9 06:17:15 2019][ 9.970667] scsi 1:0:23:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 9.977457] scsi 1:0:23:0: serial_number( 7SHPMZHW) [Mon Dec 9 06:17:15 2019][ 9.982944] scsi 1:0:23:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.012617] mpt3sas_cm0: detecting: handle(0x0073), sas_address(0x5000cca2525e22e6), phy(22) [Mon Dec 9 06:17:15 2019][ 10.021052] mpt3sas_cm0: REPORT_LUNS: handle(0x0073), retries(0) [Mon Dec 9 06:17:15 2019][ 10.027184] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0073), lun(0) [Mon Dec 9 06:17:15 2019][ 10.033814] scsi 1:0:24:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.042120] scsi 1:0:24:0: SSP: handle(0x0073), sas_addr(0x5000cca2525e22e6), phy(22), device_name(0x5000cca2525e22e7) [Mon Dec 9 06:17:15 2019][ 10.052804] scsi 1:0:24:0: enclosure logical id(0x5000ccab04037180), slot(31) [Mon Dec 9 06:17:15 2019][ 10.060024] scsi 1:0:24:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.066833] scsi 1:0:24:0: serial_number( 7SHNSXKW) [Mon Dec 9 06:17:15 2019][ 10.072326] scsi 1:0:24:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.092033] mpt3sas_cm0: detecting: handle(0x0074), sas_address(0x5000cca2525fb5be), phy(23) [Mon Dec 9 06:17:15 2019][ 10.100470] mpt3sas_cm0: REPORT_LUNS: handle(0x0074), retries(0) [Mon Dec 9 06:17:15 2019][ 10.106611] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0074), lun(0) [Mon Dec 9 06:17:15 2019][ 10.148418] scsi 1:0:25:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.156702] scsi 1:0:25:0: SSP: handle(0x0074), sas_addr(0x5000cca2525fb5be), phy(23), device_name(0x5000cca2525fb5bf) [Mon Dec 9 06:17:15 2019][ 10.167389] scsi 1:0:25:0: enclosure logical id(0x5000ccab04037180), slot(32) [Mon Dec 9 06:17:15 2019][ 10.174610] scsi 1:0:25:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.181397] scsi 1:0:25:0: serial_number( 7SHPMS7W) [Mon Dec 9 06:17:15 2019][ 10.186884] scsi 1:0:25:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.210035] mpt3sas_cm0: detecting: handle(0x0075), sas_address(0x5000cca2525eb77e), phy(24) [Mon Dec 9 06:17:15 2019][ 10.218475] mpt3sas_cm0: REPORT_LUNS: handle(0x0075), retries(0) [Mon Dec 9 06:17:15 2019][ 10.224608] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0075), lun(0) [Mon Dec 9 06:17:15 2019][ 10.231244] scsi 1:0:26:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.239553] scsi 1:0:26:0: SSP: handle(0x0075), sas_addr(0x5000cca2525eb77e), phy(24), device_name(0x5000cca2525eb77f) [Mon Dec 9 06:17:15 2019][ 10.250237] scsi 1:0:26:0: enclosure logical id(0x5000ccab04037180), slot(33) [Mon Dec 9 06:17:15 2019][ 10.257454] scsi 1:0:26:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.264260] scsi 1:0:26:0: serial_number( 7SHP2UAW) [Mon Dec 9 06:17:15 2019][ 10.269749] scsi 1:0:26:0: qdepth(254), tagged(1), simple(0), ordered(0), 
scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.290043] mpt3sas_cm0: detecting: handle(0x0076), sas_address(0x5000cca2525e113a), phy(25) [Mon Dec 9 06:17:15 2019][ 10.298480] mpt3sas_cm0: REPORT_LUNS: handle(0x0076), retries(0) [Mon Dec 9 06:17:15 2019][ 10.304624] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0076), lun(0) [Mon Dec 9 06:17:15 2019][ 10.311472] scsi 1:0:27:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.319785] scsi 1:0:27:0: SSP: handle(0x0076), sas_addr(0x5000cca2525e113a), phy(25), device_name(0x5000cca2525e113b) [Mon Dec 9 06:17:15 2019][ 10.330475] scsi 1:0:27:0: enclosure logical id(0x5000ccab04037180), slot(34) [Mon Dec 9 06:17:15 2019][ 10.337692] scsi 1:0:27:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.344498] scsi 1:0:27:0: serial_number( 7SHNRS2W) [Mon Dec 9 06:17:15 2019][ 10.349986] scsi 1:0:27:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.370037] mpt3sas_cm0: detecting: handle(0x0077), sas_address(0x5000cca2526014fa), phy(26) [Mon Dec 9 06:17:15 2019][ 10.378477] mpt3sas_cm0: REPORT_LUNS: handle(0x0077), retries(0) [Mon Dec 9 06:17:15 2019][ 10.384643] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0077), lun(0) [Mon Dec 9 06:17:15 2019][ 10.391254] scsi 1:0:28:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.399570] scsi 1:0:28:0: SSP: handle(0x0077), sas_addr(0x5000cca2526014fa), phy(26), device_name(0x5000cca2526014fb) [Mon Dec 9 06:17:15 2019][ 10.410253] scsi 1:0:28:0: enclosure logical id(0x5000ccab04037180), slot(35) [Mon Dec 9 06:17:15 2019][ 10.417473] scsi 1:0:28:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.424276] scsi 1:0:28:0: serial_number( 7SHPV2VW) [Mon Dec 9 06:17:15 2019][ 10.429765] scsi 1:0:28:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.452040] mpt3sas_cm0: detecting: handle(0x0078), sas_address(0x5000cca252598786), phy(27) [Mon Dec 9 06:17:15 2019][ 10.460481] mpt3sas_cm0: REPORT_LUNS: handle(0x0078), retries(0) [Mon Dec 9 06:17:15 2019][ 10.466613] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0078), lun(0) [Mon Dec 9 06:17:15 2019][ 10.473260] scsi 1:0:29:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.481560] scsi 1:0:29:0: SSP: handle(0x0078), sas_addr(0x5000cca252598786), phy(27), device_name(0x5000cca252598787) [Mon Dec 9 06:17:15 2019][ 10.492250] scsi 1:0:29:0: enclosure logical id(0x5000ccab04037180), slot(36) [Mon Dec 9 06:17:15 2019][ 10.499470] scsi 1:0:29:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.506275] scsi 1:0:29:0: serial_number( 7SHL7BRW) [Mon Dec 9 06:17:15 2019][ 10.511761] scsi 1:0:29:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.535039] mpt3sas_cm0: detecting: handle(0x0079), sas_address(0x5000cca2525f5366), phy(28) [Mon Dec 9 06:17:15 2019][ 10.543476] mpt3sas_cm0: REPORT_LUNS: handle(0x0079), retries(0) [Mon Dec 9 06:17:15 2019][ 10.549642] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0079), lun(0) [Mon Dec 9 06:17:15 2019][ 10.556254] scsi 1:0:30:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.564556] scsi 1:0:30:0: SSP: handle(0x0079), sas_addr(0x5000cca2525f5366), phy(28), device_name(0x5000cca2525f5367) [Mon Dec 9 06:17:15 2019][ 10.575244] scsi 1:0:30:0: enclosure logical id(0x5000ccab04037180), 
slot(37) [Mon Dec 9 06:17:15 2019][ 10.582463] scsi 1:0:30:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.589274] scsi 1:0:30:0: serial_number( 7SHPE66W) [Mon Dec 9 06:17:15 2019][ 10.594765] scsi 1:0:30:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.622825] mpt3sas_cm0: detecting: handle(0x007a), sas_address(0x5000cca2525e263e), phy(29) [Mon Dec 9 06:17:15 2019][ 10.631262] mpt3sas_cm0: REPORT_LUNS: handle(0x007a), retries(0) [Mon Dec 9 06:17:15 2019][ 10.637394] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007a), lun(0) [Mon Dec 9 06:17:15 2019][ 10.643989] scsi 1:0:31:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.652293] scsi 1:0:31:0: SSP: handle(0x007a), sas_addr(0x5000cca2525e263e), phy(29), device_name(0x5000cca2525e263f) [Mon Dec 9 06:17:15 2019][ 10.662980] scsi 1:0:31:0: enclosure logical id(0x5000ccab04037180), slot(38) [Mon Dec 9 06:17:15 2019][ 10.670197] scsi 1:0:31:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.677002] scsi 1:0:31:0: serial_number( 7SHNT4GW) [Mon Dec 9 06:17:15 2019][ 10.682491] scsi 1:0:31:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.705041] mpt3sas_cm0: detecting: handle(0x007b), sas_address(0x5000cca2525f6082), phy(30) [Mon Dec 9 06:17:15 2019][ 10.713475] mpt3sas_cm0: REPORT_LUNS: handle(0x007b), retries(0) [Mon Dec 9 06:17:15 2019][ 10.723431] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007b), lun(0) [Mon Dec 9 06:17:15 2019][ 10.732342] scsi 1:0:32:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.740647] scsi 1:0:32:0: SSP: handle(0x007b), sas_addr(0x5000cca2525f6082), phy(30), device_name(0x5000cca2525f6083) [Mon Dec 9 06:17:15 2019][ 10.751329] scsi 1:0:32:0: enclosure logical id(0x5000ccab04037180), slot(39) [Mon Dec 9 06:17:15 2019][ 10.758548] scsi 1:0:32:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.765354] scsi 1:0:32:0: serial_number( 7SHPG28W) [Mon Dec 9 06:17:15 2019][ 10.770840] scsi 1:0:32:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 10.795056] mpt3sas_cm0: detecting: handle(0x007c), sas_address(0x5000cca2525ec83e), phy(31) [Mon Dec 9 06:17:16 2019][ 10.803490] mpt3sas_cm0: REPORT_LUNS: handle(0x007c), retries(0) [Mon Dec 9 06:17:16 2019][ 10.809653] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007c), lun(0) [Mon Dec 9 06:17:16 2019][ 10.816252] scsi 1:0:33:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 10.824552] scsi 1:0:33:0: SSP: handle(0x007c), sas_addr(0x5000cca2525ec83e), phy(31), device_name(0x5000cca2525ec83f) [Mon Dec 9 06:17:16 2019][ 10.835241] scsi 1:0:33:0: enclosure logical id(0x5000ccab04037180), slot(40) [Mon Dec 9 06:17:16 2019][ 10.842461] scsi 1:0:33:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 10.849265] scsi 1:0:33:0: serial_number( 7SHP3XXW) [Mon Dec 9 06:17:16 2019][ 10.854753] scsi 1:0:33:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 10.877045] mpt3sas_cm0: detecting: handle(0x007d), sas_address(0x5000cca2525ec01a), phy(32) [Mon Dec 9 06:17:16 2019][ 10.885479] mpt3sas_cm0: REPORT_LUNS: handle(0x007d), retries(0) [Mon Dec 9 06:17:16 2019][ 10.891641] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007d), lun(0) [Mon Dec 9 06:17:16 2019][ 10.898245] scsi 1:0:34:0: 
Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 10.906541] scsi 1:0:34:0: SSP: handle(0x007d), sas_addr(0x5000cca2525ec01a), phy(32), device_name(0x5000cca2525ec01b) [Mon Dec 9 06:17:16 2019][ 10.917231] scsi 1:0:34:0: enclosure logical id(0x5000ccab04037180), slot(41) [Mon Dec 9 06:17:16 2019][ 10.924450] scsi 1:0:34:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 10.931254] scsi 1:0:34:0: serial_number( 7SHP3D3W) [Mon Dec 9 06:17:16 2019][ 10.936743] scsi 1:0:34:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 10.963054] mpt3sas_cm0: detecting: handle(0x007e), sas_address(0x5000cca2525ec55a), phy(33) [Mon Dec 9 06:17:16 2019][ 10.971491] mpt3sas_cm0: REPORT_LUNS: handle(0x007e), retries(0) [Mon Dec 9 06:17:16 2019][ 10.977824] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007e), lun(0) [Mon Dec 9 06:17:16 2019][ 10.985415] scsi 1:0:35:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 10.993749] scsi 1:0:35:0: SSP: handle(0x007e), sas_addr(0x5000cca2525ec55a), phy(33), device_name(0x5000cca2525ec55b) [Mon Dec 9 06:17:16 2019][ 11.004436] scsi 1:0:35:0: enclosure logical id(0x5000ccab04037180), slot(42) [Mon Dec 9 06:17:16 2019][ 11.011656] scsi 1:0:35:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.018461] scsi 1:0:35:0: serial_number( 7SHP3RYW) [Mon Dec 9 06:17:16 2019][ 11.023948] scsi 1:0:35:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.044054] mpt3sas_cm0: detecting: handle(0x007f), sas_address(0x5000cca2525fd4a2), phy(34) [Mon Dec 9 06:17:16 2019][ 11.052493] mpt3sas_cm0: REPORT_LUNS: handle(0x007f), retries(0) [Mon Dec 9 06:17:16 2019][ 11.058632] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007f), lun(0) [Mon Dec 9 06:17:16 2019][ 11.065390] scsi 1:0:36:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.080944] scsi 1:0:36:0: SSP: handle(0x007f), sas_addr(0x5000cca2525fd4a2), phy(34), device_name(0x5000cca2525fd4a3) [Mon Dec 9 06:17:16 2019][ 11.091634] scsi 1:0:36:0: enclosure logical id(0x5000ccab04037180), slot(43) [Mon Dec 9 06:17:16 2019][ 11.098853] scsi 1:0:36:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.105661] scsi 1:0:36:0: serial_number( 7SHPPU0W) [Mon Dec 9 06:17:16 2019][ 11.111145] scsi 1:0:36:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.131054] mpt3sas_cm0: detecting: handle(0x0080), sas_address(0x5000cca2525eb5f6), phy(35) [Mon Dec 9 06:17:16 2019][ 11.139488] mpt3sas_cm0: REPORT_LUNS: handle(0x0080), retries(0) [Mon Dec 9 06:17:16 2019][ 11.145650] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0080), lun(0) [Mon Dec 9 06:17:16 2019][ 11.152338] scsi 1:0:37:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.160646] scsi 1:0:37:0: SSP: handle(0x0080), sas_addr(0x5000cca2525eb5f6), phy(35), device_name(0x5000cca2525eb5f7) [Mon Dec 9 06:17:16 2019][ 11.171335] scsi 1:0:37:0: enclosure logical id(0x5000ccab04037180), slot(44) [Mon Dec 9 06:17:16 2019][ 11.178553] scsi 1:0:37:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.185358] scsi 1:0:37:0: serial_number( 7SHP2R5W) [Mon Dec 9 06:17:16 2019][ 11.190847] scsi 1:0:37:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.211056] mpt3sas_cm0: detecting: 
handle(0x0081), sas_address(0x5000cca2525ebeb2), phy(36) [Mon Dec 9 06:17:16 2019][ 11.219496] mpt3sas_cm0: REPORT_LUNS: handle(0x0081), retries(0) [Mon Dec 9 06:17:16 2019][ 11.225633] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0081), lun(0) [Mon Dec 9 06:17:16 2019][ 11.232233] scsi 1:0:38:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.240537] scsi 1:0:38:0: SSP: handle(0x0081), sas_addr(0x5000cca2525ebeb2), phy(36), device_name(0x5000cca2525ebeb3) [Mon Dec 9 06:17:16 2019][ 11.251227] scsi 1:0:38:0: enclosure logical id(0x5000ccab04037180), slot(45) [Mon Dec 9 06:17:16 2019][ 11.258446] scsi 1:0:38:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.265250] scsi 1:0:38:0: serial_number( 7SHP396W) [Mon Dec 9 06:17:16 2019][ 11.270739] scsi 1:0:38:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.293054] mpt3sas_cm0: detecting: handle(0x0082), sas_address(0x5000cca2525f291a), phy(37) [Mon Dec 9 06:17:16 2019][ 11.301491] mpt3sas_cm0: REPORT_LUNS: handle(0x0082), retries(0) [Mon Dec 9 06:17:16 2019][ 11.307654] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0082), lun(0) [Mon Dec 9 06:17:16 2019][ 11.314407] scsi 1:0:39:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.322746] scsi 1:0:39:0: SSP: handle(0x0082), sas_addr(0x5000cca2525f291a), phy(37), device_name(0x5000cca2525f291b) [Mon Dec 9 06:17:16 2019][ 11.333433] scsi 1:0:39:0: enclosure logical id(0x5000ccab04037180), slot(46) [Mon Dec 9 06:17:16 2019][ 11.340652] scsi 1:0:39:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.347457] scsi 1:0:39:0: serial_number( 7SHPABWW) [Mon Dec 9 06:17:16 2019][ 11.352944] scsi 1:0:39:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.373069] mpt3sas_cm0: detecting: handle(0x0083), sas_address(0x5000cca252602c0e), phy(38) [Mon Dec 9 06:17:16 2019][ 11.381503] mpt3sas_cm0: REPORT_LUNS: handle(0x0083), retries(0) [Mon Dec 9 06:17:16 2019][ 11.387645] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0083), lun(0) [Mon Dec 9 06:17:16 2019][ 11.394246] scsi 1:0:40:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.402548] scsi 1:0:40:0: SSP: handle(0x0083), sas_addr(0x5000cca252602c0e), phy(38), device_name(0x5000cca252602c0f) [Mon Dec 9 06:17:16 2019][ 11.413235] scsi 1:0:40:0: enclosure logical id(0x5000ccab04037180), slot(47) [Mon Dec 9 06:17:16 2019][ 11.420455] scsi 1:0:40:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.427259] scsi 1:0:40:0: serial_number( 7SHPWMHW) [Mon Dec 9 06:17:16 2019][ 11.432748] scsi 1:0:40:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.455071] mpt3sas_cm0: detecting: handle(0x0084), sas_address(0x5000cca2525e7cfe), phy(39) [Mon Dec 9 06:17:16 2019][ 11.463507] mpt3sas_cm0: REPORT_LUNS: handle(0x0084), retries(0) [Mon Dec 9 06:17:16 2019][ 11.469641] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0084), lun(0) [Mon Dec 9 06:17:16 2019][ 11.476236] scsi 1:0:41:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.484533] scsi 1:0:41:0: SSP: handle(0x0084), sas_addr(0x5000cca2525e7cfe), phy(39), device_name(0x5000cca2525e7cff) [Mon Dec 9 06:17:16 2019][ 11.495217] scsi 1:0:41:0: enclosure logical id(0x5000ccab04037180), slot(48) [Mon Dec 9 06:17:16 2019][ 11.502437] scsi 1:0:41:0: enclosure level(0x0000), 
connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.509241] scsi 1:0:41:0: serial_number( 7SHNYXKW) [Mon Dec 9 06:17:16 2019][ 11.514729] scsi 1:0:41:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.537061] mpt3sas_cm0: detecting: handle(0x0085), sas_address(0x5000cca2525f6a32), phy(40) [Mon Dec 9 06:17:16 2019][ 11.545498] mpt3sas_cm0: REPORT_LUNS: handle(0x0085), retries(0) [Mon Dec 9 06:17:16 2019][ 11.551973] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0085), lun(0) [Mon Dec 9 06:17:16 2019][ 11.570231] scsi 1:0:42:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.578526] scsi 1:0:42:0: SSP: handle(0x0085), sas_addr(0x5000cca2525f6a32), phy(40), device_name(0x5000cca2525f6a33) [Mon Dec 9 06:17:16 2019][ 11.589208] scsi 1:0:42:0: enclosure logical id(0x5000ccab04037180), slot(49) [Mon Dec 9 06:17:16 2019][ 11.596427] scsi 1:0:42:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.603232] scsi 1:0:42:0: serial_number( 7SHPGR8W) [Mon Dec 9 06:17:16 2019][ 11.608721] scsi 1:0:42:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.631070] mpt3sas_cm0: detecting: handle(0x0086), sas_address(0x5000cca2525f7f26), phy(41) [Mon Dec 9 06:17:16 2019][ 11.639507] mpt3sas_cm0: REPORT_LUNS: handle(0x0086), retries(0) [Mon Dec 9 06:17:16 2019][ 11.645641] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0086), lun(0) [Mon Dec 9 06:17:16 2019][ 11.652244] scsi 1:0:43:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.660565] scsi 1:0:43:0: SSP: handle(0x0086), sas_addr(0x5000cca2525f7f26), phy(41), device_name(0x5000cca2525f7f27) [Mon Dec 9 06:17:16 2019][ 11.671249] scsi 1:0:43:0: enclosure logical id(0x5000ccab04037180), slot(50) [Mon Dec 9 06:17:16 2019][ 11.678469] scsi 1:0:43:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.685275] scsi 1:0:43:0: serial_number( 7SHPJ3JW) [Mon Dec 9 06:17:16 2019][ 11.690762] scsi 1:0:43:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.711072] mpt3sas_cm0: detecting: handle(0x0087), sas_address(0x5000cca2525eb4b2), phy(42) [Mon Dec 9 06:17:16 2019][ 11.719510] mpt3sas_cm0: REPORT_LUNS: handle(0x0087), retries(0) [Mon Dec 9 06:17:16 2019][ 11.725669] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0087), lun(0) [Mon Dec 9 06:17:16 2019][ 11.732371] scsi 1:0:44:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.740677] scsi 1:0:44:0: SSP: handle(0x0087), sas_addr(0x5000cca2525eb4b2), phy(42), device_name(0x5000cca2525eb4b3) [Mon Dec 9 06:17:16 2019][ 11.751366] scsi 1:0:44:0: enclosure logical id(0x5000ccab04037180), slot(51) [Mon Dec 9 06:17:16 2019][ 11.758586] scsi 1:0:44:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.765391] scsi 1:0:44:0: serial_number( 7SHP2MKW) [Mon Dec 9 06:17:16 2019][ 11.770877] scsi 1:0:44:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.791067] mpt3sas_cm0: detecting: handle(0x0088), sas_address(0x5000cca2525e1f9e), phy(43) [Mon Dec 9 06:17:17 2019][ 11.799508] mpt3sas_cm0: REPORT_LUNS: handle(0x0088), retries(0) [Mon Dec 9 06:17:17 2019][ 11.805646] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0088), lun(0) [Mon Dec 9 06:17:17 2019][ 11.812246] scsi 1:0:45:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 11.820543] 
scsi 1:0:45:0: SSP: handle(0x0088), sas_addr(0x5000cca2525e1f9e), phy(43), device_name(0x5000cca2525e1f9f) [Mon Dec 9 06:17:17 2019][ 11.831230] scsi 1:0:45:0: enclosure logical id(0x5000ccab04037180), slot(52) [Mon Dec 9 06:17:17 2019][ 11.838450] scsi 1:0:45:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 11.845255] scsi 1:0:45:0: serial_number( 7SHNSPTW) [Mon Dec 9 06:17:17 2019][ 11.850744] scsi 1:0:45:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 11.871662] mpt3sas_cm0: detecting: handle(0x0089), sas_address(0x5000cca2525e52fe), phy(44) [Mon Dec 9 06:17:17 2019][ 11.880100] mpt3sas_cm0: REPORT_LUNS: handle(0x0089), retries(0) [Mon Dec 9 06:17:17 2019][ 11.886256] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0089), lun(0) [Mon Dec 9 06:17:17 2019][ 11.892848] scsi 1:0:46:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 11.901148] scsi 1:0:46:0: SSP: handle(0x0089), sas_addr(0x5000cca2525e52fe), phy(44), device_name(0x5000cca2525e52ff) [Mon Dec 9 06:17:17 2019][ 11.911833] scsi 1:0:46:0: enclosure logical id(0x5000ccab04037180), slot(53) [Mon Dec 9 06:17:17 2019][ 11.919052] scsi 1:0:46:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 11.925860] scsi 1:0:46:0: serial_number( 7SHNW3VW) [Mon Dec 9 06:17:17 2019][ 11.931346] scsi 1:0:46:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 11.951075] mpt3sas_cm0: detecting: handle(0x008a), sas_address(0x5000cca2525f4e72), phy(45) [Mon Dec 9 06:17:17 2019][ 11.959520] mpt3sas_cm0: REPORT_LUNS: handle(0x008a), retries(0) [Mon Dec 9 06:17:17 2019][ 11.965681] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008a), lun(0) [Mon Dec 9 06:17:17 2019][ 11.972270] scsi 1:0:47:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 11.980568] scsi 1:0:47:0: SSP: handle(0x008a), sas_addr(0x5000cca2525f4e72), phy(45), device_name(0x5000cca2525f4e73) [Mon Dec 9 06:17:17 2019][ 11.991258] scsi 1:0:47:0: enclosure logical id(0x5000ccab04037180), slot(54) [Mon Dec 9 06:17:17 2019][ 11.998477] scsi 1:0:47:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.005281] scsi 1:0:47:0: serial_number( 7SHPDVZW) [Mon Dec 9 06:17:17 2019][ 12.010767] scsi 1:0:47:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 12.033091] mpt3sas_cm0: detecting: handle(0x008b), sas_address(0x5000cca2525fd49a), phy(46) [Mon Dec 9 06:17:17 2019][ 12.041529] mpt3sas_cm0: REPORT_LUNS: handle(0x008b), retries(0) [Mon Dec 9 06:17:17 2019][ 12.047688] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008b), lun(0) [Mon Dec 9 06:17:17 2019][ 12.054320] scsi 1:0:48:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 12.062620] scsi 1:0:48:0: SSP: handle(0x008b), sas_addr(0x5000cca2525fd49a), phy(46), device_name(0x5000cca2525fd49b) [Mon Dec 9 06:17:17 2019][ 12.073307] scsi 1:0:48:0: enclosure logical id(0x5000ccab04037180), slot(55) [Mon Dec 9 06:17:17 2019][ 12.080527] scsi 1:0:48:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.087331] scsi 1:0:48:0: serial_number( 7SHPPTYW) [Mon Dec 9 06:17:17 2019][ 12.092817] scsi 1:0:48:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 12.115074] mpt3sas_cm0: detecting: handle(0x008c), sas_address(0x5000cca2525e787a), phy(47) [Mon Dec 9 06:17:17 2019][ 12.123509] 
mpt3sas_cm0: REPORT_LUNS: handle(0x008c), retries(0) [Mon Dec 9 06:17:17 2019][ 12.129669] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008c), lun(0) [Mon Dec 9 06:17:17 2019][ 12.136257] scsi 1:0:49:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 12.144554] scsi 1:0:49:0: SSP: handle(0x008c), sas_addr(0x5000cca2525e787a), phy(47), device_name(0x5000cca2525e787b) [Mon Dec 9 06:17:17 2019][ 12.155244] scsi 1:0:49:0: enclosure logical id(0x5000ccab04037180), slot(56) [Mon Dec 9 06:17:17 2019][ 12.162463] scsi 1:0:49:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.169269] scsi 1:0:49:0: serial_number( 7SHNYM7W) [Mon Dec 9 06:17:17 2019][ 12.174756] scsi 1:0:49:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 12.202979] mpt3sas_cm0: detecting: handle(0x008d), sas_address(0x5000cca2525ca19a), phy(48) [Mon Dec 9 06:17:17 2019][ 12.211416] mpt3sas_cm0: REPORT_LUNS: handle(0x008d), retries(0) [Mon Dec 9 06:17:17 2019][ 12.217550] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008d), lun(0) [Mon Dec 9 06:17:17 2019][ 12.273401] scsi 1:0:50:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 12.281695] scsi 1:0:50:0: SSP: handle(0x008d), sas_addr(0x5000cca2525ca19a), phy(48), device_name(0x5000cca2525ca19b) [Mon Dec 9 06:17:17 2019][ 12.292379] scsi 1:0:50:0: enclosure logical id(0x5000ccab04037180), slot(57) [Mon Dec 9 06:17:17 2019][ 12.299598] scsi 1:0:50:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.306389] scsi 1:0:50:0: serial_number( 7SHMY83W) [Mon Dec 9 06:17:17 2019][ 12.311872] scsi 1:0:50:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 12.349544] mpt3sas_cm0: detecting: handle(0x008e), sas_address(0x5000cca2525ffb8a), phy(49) [Mon Dec 9 06:17:17 2019][ 12.357984] mpt3sas_cm0: REPORT_LUNS: handle(0x008e), retries(0) [Mon Dec 9 06:17:17 2019][ 12.364163] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008e), lun(0) [Mon Dec 9 06:17:17 2019][ 12.371027] scsi 1:0:51:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 12.379344] scsi 1:0:51:0: SSP: handle(0x008e), sas_addr(0x5000cca2525ffb8a), phy(49), device_name(0x5000cca2525ffb8b) [Mon Dec 9 06:17:17 2019][ 12.390029] scsi 1:0:51:0: enclosure logical id(0x5000ccab04037180), slot(58) [Mon Dec 9 06:17:17 2019][ 12.397248] scsi 1:0:51:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.404052] scsi 1:0:51:0: serial_number( 7SHPTDAW) [Mon Dec 9 06:17:17 2019][ 12.409541] scsi 1:0:51:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 12.432103] mpt3sas_cm0: detecting: handle(0x008f), sas_address(0x5000cca2525f266a), phy(50) [Mon Dec 9 06:17:17 2019][ 12.440542] mpt3sas_cm0: REPORT_LUNS: handle(0x008f), retries(0) [Mon Dec 9 06:17:17 2019][ 12.446685] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008f), lun(0) [Mon Dec 9 06:17:17 2019][ 12.453503] scsi 1:0:52:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 12.461817] scsi 1:0:52:0: SSP: handle(0x008f), sas_addr(0x5000cca2525f266a), phy(50), device_name(0x5000cca2525f266b) [Mon Dec 9 06:17:17 2019][ 12.472502] scsi 1:0:52:0: enclosure logical id(0x5000ccab04037180), slot(59) [Mon Dec 9 06:17:17 2019][ 12.479722] scsi 1:0:52:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.486527] scsi 1:0:52:0: serial_number( 
7SHPA6AW) [Mon Dec 9 06:17:17 2019][ 12.492015] scsi 1:0:52:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 12.535558] mpt3sas_cm0: expander_add: handle(0x005b), parent(0x0058), sas_addr(0x5000ccab040371fb), phys(68) [Mon Dec 9 06:17:17 2019][ 12.554902] mpt3sas_cm0: detecting: handle(0x0090), sas_address(0x5000cca2525eacc2), phy(42) [Mon Dec 9 06:17:17 2019][ 12.563352] mpt3sas_cm0: REPORT_LUNS: handle(0x0090), retries(0) [Mon Dec 9 06:17:17 2019][ 12.569509] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0090), lun(0) [Mon Dec 9 06:17:17 2019][ 12.576175] scsi 1:0:53:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 12.584481] scsi 1:0:53:0: SSP: handle(0x0090), sas_addr(0x5000cca2525eacc2), phy(42), device_name(0x5000cca2525eacc3) [Mon Dec 9 06:17:17 2019][ 12.595166] scsi 1:0:53:0: enclosure logical id(0x5000ccab04037180), slot(1) [Mon Dec 9 06:17:17 2019][ 12.602298] scsi 1:0:53:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.609101] scsi 1:0:53:0: serial_number( 7SHP235W) [Mon Dec 9 06:17:17 2019][ 12.614590] scsi 1:0:53:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 12.637095] mpt3sas_cm0: detecting: handle(0x0091), sas_address(0x5000cca2525f8152), phy(43) [Mon Dec 9 06:17:17 2019][ 12.645532] mpt3sas_cm0: REPORT_LUNS: handle(0x0091), retries(0) [Mon Dec 9 06:17:17 2019][ 12.651665] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0091), lun(0) [Mon Dec 9 06:17:17 2019][ 12.658264] scsi 1:0:54:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 12.666562] scsi 1:0:54:0: SSP: handle(0x0091), sas_addr(0x5000cca2525f8152), phy(43), device_name(0x5000cca2525f8153) [Mon Dec 9 06:17:17 2019][ 12.677249] scsi 1:0:54:0: enclosure logical id(0x5000ccab04037180), slot(3) [Mon Dec 9 06:17:17 2019][ 12.684381] scsi 1:0:54:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.691185] scsi 1:0:54:0: serial_number( 7SHPJ80W) [Mon Dec 9 06:17:17 2019][ 12.696673] scsi 1:0:54:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 12.719681] mpt3sas_cm0: detecting: handle(0x0092), sas_address(0x5000cca2525ef83a), phy(44) [Mon Dec 9 06:17:17 2019][ 12.728119] mpt3sas_cm0: REPORT_LUNS: handle(0x0092), retries(0) [Mon Dec 9 06:17:17 2019][ 12.734251] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0092), lun(0) [Mon Dec 9 06:17:17 2019][ 12.740857] scsi 1:0:55:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 12.749158] scsi 1:0:55:0: SSP: handle(0x0092), sas_addr(0x5000cca2525ef83a), phy(44), device_name(0x5000cca2525ef83b) [Mon Dec 9 06:17:17 2019][ 12.759845] scsi 1:0:55:0: enclosure logical id(0x5000ccab04037180), slot(4) [Mon Dec 9 06:17:17 2019][ 12.766979] scsi 1:0:55:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.773786] scsi 1:0:55:0: serial_number( 7SHP73ZW) [Mon Dec 9 06:17:17 2019][ 12.779270] scsi 1:0:55:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 12.799089] mpt3sas_cm0: detecting: handle(0x0093), sas_address(0x5000cca2525e72aa), phy(45) [Mon Dec 9 06:17:18 2019][ 12.807527] mpt3sas_cm0: REPORT_LUNS: handle(0x0093), retries(0) [Mon Dec 9 06:17:18 2019][ 12.813698] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0093), lun(0) [Mon Dec 9 06:17:18 2019][ 12.820484] scsi 1:0:56:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 
6 [Mon Dec 9 06:17:18 2019][ 12.829509] scsi 1:0:56:0: SSP: handle(0x0093), sas_addr(0x5000cca2525e72aa), phy(45), device_name(0x5000cca2525e72ab) [Mon Dec 9 06:17:18 2019][ 12.840196] scsi 1:0:56:0: enclosure logical id(0x5000ccab04037180), slot(5) [Mon Dec 9 06:17:18 2019][ 12.847329] scsi 1:0:56:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:18 2019][ 12.854134] scsi 1:0:56:0: serial_number( 7SHNY77W) [Mon Dec 9 06:17:18 2019][ 12.859621] scsi 1:0:56:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 12.882095] mpt3sas_cm0: detecting: handle(0x0094), sas_address(0x5000cca2525d3c8a), phy(46) [Mon Dec 9 06:17:18 2019][ 12.890532] mpt3sas_cm0: REPORT_LUNS: handle(0x0094), retries(0) [Mon Dec 9 06:17:18 2019][ 12.896670] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0094), lun(0) [Mon Dec 9 06:17:18 2019][ 12.903291] scsi 1:0:57:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 12.911597] scsi 1:0:57:0: SSP: handle(0x0094), sas_addr(0x5000cca2525d3c8a), phy(46), device_name(0x5000cca2525d3c8b) [Mon Dec 9 06:17:18 2019][ 12.922281] scsi 1:0:57:0: enclosure logical id(0x5000ccab04037180), slot(6) [Mon Dec 9 06:17:18 2019][ 12.929413] scsi 1:0:57:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:18 2019][ 12.936216] scsi 1:0:57:0: serial_number( 7SHN8KZW) [Mon Dec 9 06:17:18 2019][ 12.941705] scsi 1:0:57:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 12.964105] mpt3sas_cm0: detecting: handle(0x0095), sas_address(0x5000cca2525fae0e), phy(47) [Mon Dec 9 06:17:18 2019][ 12.972543] mpt3sas_cm0: REPORT_LUNS: handle(0x0095), retries(0) [Mon Dec 9 06:17:18 2019][ 12.978676] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0095), lun(0) [Mon Dec 9 06:17:18 2019][ 12.985254] scsi 1:0:58:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 12.993553] scsi 1:0:58:0: SSP: handle(0x0095), sas_addr(0x5000cca2525fae0e), phy(47), device_name(0x5000cca2525fae0f) [Mon Dec 9 06:17:18 2019][ 13.004242] scsi 1:0:58:0: enclosure logical id(0x5000ccab04037180), slot(7) [Mon Dec 9 06:17:18 2019][ 13.011375] scsi 1:0:58:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:18 2019][ 13.018179] scsi 1:0:58:0: serial_number( 7SHPM7BW) [Mon Dec 9 06:17:18 2019][ 13.023667] scsi 1:0:58:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 13.049097] mpt3sas_cm0: detecting: handle(0x0096), sas_address(0x5000cca2525efdae), phy(48) [Mon Dec 9 06:17:18 2019][ 13.057532] mpt3sas_cm0: REPORT_LUNS: handle(0x0096), retries(0) [Mon Dec 9 06:17:18 2019][ 13.063687] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0096), lun(0) [Mon Dec 9 06:17:18 2019][ 13.073156] scsi 1:0:59:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 13.083763] scsi 1:0:59:0: SSP: handle(0x0096), sas_addr(0x5000cca2525efdae), phy(48), device_name(0x5000cca2525efdaf) [Mon Dec 9 06:17:18 2019][ 13.094448] scsi 1:0:59:0: enclosure logical id(0x5000ccab04037180), slot(8) [Mon Dec 9 06:17:18 2019][ 13.101580] scsi 1:0:59:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:18 2019][ 13.108385] scsi 1:0:59:0: serial_number( 7SHP7H7W) [Mon Dec 9 06:17:18 2019][ 13.113871] scsi 1:0:59:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 13.228484] mpt3sas_cm0: detecting: handle(0x0097), sas_address(0x5000cca2525fa302), phy(49) [Mon 
Dec 9 06:17:18 2019][ 13.236918] mpt3sas_cm0: REPORT_LUNS: handle(0x0097), retries(0) [Mon Dec 9 06:17:18 2019][ 13.243322] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0097), lun(0) [Mon Dec 9 06:17:18 2019][ 13.256804] scsi 1:0:60:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 13.265098] scsi 1:0:60:0: SSP: handle(0x0097), sas_addr(0x5000cca2525fa302), phy(49), device_name(0x5000cca2525fa303) [Mon Dec 9 06:17:18 2019][ 13.275785] scsi 1:0:60:0: enclosure logical id(0x5000ccab04037180), slot(9) [Mon Dec 9 06:17:18 2019][ 13.282917] scsi 1:0:60:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:18 2019][ 13.289707] scsi 1:0:60:0: serial_number( 7SHPLHKW) [Mon Dec 9 06:17:18 2019][ 13.295191] scsi 1:0:60:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 13.316118] mpt3sas_cm0: detecting: handle(0x0098), sas_address(0x5000cca2525fb4be), phy(50) [Mon Dec 9 06:17:18 2019][ 13.324558] mpt3sas_cm0: REPORT_LUNS: handle(0x0098), retries(0) [Mon Dec 9 06:17:18 2019][ 13.330725] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0098), lun(0) [Mon Dec 9 06:17:18 2019][ 13.337397] scsi 1:0:61:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 13.345697] scsi 1:0:61:0: SSP: handle(0x0098), sas_addr(0x5000cca2525fb4be), phy(50), device_name(0x5000cca2525fb4bf) [Mon Dec 9 06:17:18 2019][ 13.356387] scsi 1:0:61:0: enclosure logical id(0x5000ccab04037180), slot(10) [Mon Dec 9 06:17:18 2019][ 13.363607] scsi 1:0:61:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:18 2019][ 13.370410] scsi 1:0:61:0: serial_number( 7SHPMP5W) [Mon Dec 9 06:17:18 2019][ 13.375900] scsi 1:0:61:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 13.398625] mpt3sas_cm0: expander_add: handle(0x00da), parent(0x0002), sas_addr(0x5000ccab040371bd), phys(49) [Mon Dec 9 06:17:18 2019][ 13.419014] mpt3sas_cm0: detecting: handle(0x00de), sas_address(0x5000ccab040371bc), phy(48) [Mon Dec 9 06:17:18 2019][ 13.427450] mpt3sas_cm0: REPORT_LUNS: handle(0x00de), retries(0) [Mon Dec 9 06:17:18 2019][ 13.435324] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00de), lun(0) [Mon Dec 9 06:17:18 2019][ 13.442184] scsi 1:0:62:0: Enclosure HGST H4060-J 2033 PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 13.450673] scsi 1:0:62:0: set ignore_delay_remove for handle(0x00de) [Mon Dec 9 06:17:18 2019][ 13.457114] scsi 1:0:62:0: SES: handle(0x00de), sas_addr(0x5000ccab040371bc), phy(48), device_name(0x0000000000000000) [Mon Dec 9 06:17:18 2019][ 13.467800] scsi 1:0:62:0: enclosure logical id(0x5000ccab04037180), slot(60) [Mon Dec 9 06:17:18 2019][ 13.475018] scsi 1:0:62:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:18 2019][ 13.481822] scsi 1:0:62:0: serial_number(USWSJ03918EZ0028 ) [Mon Dec 9 06:17:18 2019][ 13.487659] scsi 1:0:62:0: qdepth(1), tagged(0), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 13.513530] mpt3sas_cm0: expander_add: handle(0x00dc), parent(0x00da), sas_addr(0x5000ccab040371bf), phys(68) [Mon Dec 9 06:17:18 2019][ 13.534517] mpt3sas_cm0: detecting: handle(0x00df), sas_address(0x5000cca2525f2a25), phy(0) [Mon Dec 9 06:17:18 2019][ 13.542870] mpt3sas_cm0: REPORT_LUNS: handle(0x00df), retries(0) [Mon Dec 9 06:17:18 2019][ 13.548998] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00df), lun(0) [Mon Dec 9 06:17:18 2019][ 13.555809] scsi 1:0:63:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 
13.564123] scsi 1:0:63:0: SSP: handle(0x00df), sas_addr(0x5000cca2525f2a25), phy(0), device_name(0x5000cca2525f2a27) [Mon Dec 9 06:17:18 2019][ 13.574724] scsi 1:0:63:0: enclosure logical id(0x5000ccab04037180), slot(0) [Mon Dec 9 06:17:18 2019][ 13.581855] scsi 1:0:63:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:18 2019][ 13.588663] scsi 1:0:63:0: serial_number( 7SHPAG1W) [Mon Dec 9 06:17:18 2019][ 13.594148] scsi 1:0:63:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 13.614122] mpt3sas_cm0: detecting: handle(0x00e0), sas_address(0x5000cca2525e977d), phy(1) [Mon Dec 9 06:17:18 2019][ 13.622472] mpt3sas_cm0: REPORT_LUNS: handle(0x00e0), retries(0) [Mon Dec 9 06:17:18 2019][ 13.628632] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e0), lun(0) [Mon Dec 9 06:17:18 2019][ 13.655932] scsi 1:0:64:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 13.664219] scsi 1:0:64:0: SSP: handle(0x00e0), sas_addr(0x5000cca2525e977d), phy(1), device_name(0x5000cca2525e977f) [Mon Dec 9 06:17:18 2019][ 13.674816] scsi 1:0:64:0: enclosure logical id(0x5000ccab04037180), slot(2) [Mon Dec 9 06:17:18 2019][ 13.681949] scsi 1:0:64:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:18 2019][ 13.688738] scsi 1:0:64:0: serial_number( 7SHP0P8W) [Mon Dec 9 06:17:18 2019][ 13.694223] scsi 1:0:64:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 13.714119] mpt3sas_cm0: detecting: handle(0x00e1), sas_address(0x5000cca2525ed2bd), phy(2) [Mon Dec 9 06:17:18 2019][ 13.722472] mpt3sas_cm0: REPORT_LUNS: handle(0x00e1), retries(0) [Mon Dec 9 06:17:18 2019][ 13.728617] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e1), lun(0) [Mon Dec 9 06:17:18 2019][ 13.735337] scsi 1:0:65:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 13.743641] scsi 1:0:65:0: SSP: handle(0x00e1), sas_addr(0x5000cca2525ed2bd), phy(2), device_name(0x5000cca2525ed2bf) [Mon Dec 9 06:17:18 2019][ 13.754241] scsi 1:0:65:0: enclosure logical id(0x5000ccab04037180), slot(11) [Mon Dec 9 06:17:18 2019][ 13.761461] scsi 1:0:65:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:18 2019][ 13.768264] scsi 1:0:65:0: serial_number( 7SHP4MLW) [Mon Dec 9 06:17:19 2019][ 13.773754] scsi 1:0:65:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 13.796138] mpt3sas_cm0: detecting: handle(0x00e2), sas_address(0x5000cca2525ec049), phy(3) [Mon Dec 9 06:17:19 2019][ 13.804485] mpt3sas_cm0: REPORT_LUNS: handle(0x00e2), retries(0) [Mon Dec 9 06:17:19 2019][ 13.810617] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e2), lun(0) [Mon Dec 9 06:17:19 2019][ 13.817238] scsi 1:0:66:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 13.825540] scsi 1:0:66:0: SSP: handle(0x00e2), sas_addr(0x5000cca2525ec049), phy(3), device_name(0x5000cca2525ec04b) [Mon Dec 9 06:17:19 2019][ 13.836142] scsi 1:0:66:0: enclosure logical id(0x5000ccab04037180), slot(12) [Mon Dec 9 06:17:19 2019][ 13.843361] scsi 1:0:66:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 13.850166] scsi 1:0:66:0: serial_number( 7SHP3DHW) [Mon Dec 9 06:17:19 2019][ 13.855653] scsi 1:0:66:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 13.878127] mpt3sas_cm0: detecting: handle(0x00e3), sas_address(0x5000cca2525ff611), phy(4) [Mon Dec 9 06:17:19 2019][ 13.886474] 
mpt3sas_cm0: REPORT_LUNS: handle(0x00e3), retries(0) [Mon Dec 9 06:17:19 2019][ 13.892637] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e3), lun(0) [Mon Dec 9 06:17:19 2019][ 13.899409] scsi 1:0:67:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 13.907720] scsi 1:0:67:0: SSP: handle(0x00e3), sas_addr(0x5000cca2525ff611), phy(4), device_name(0x5000cca2525ff613) [Mon Dec 9 06:17:19 2019][ 13.918320] scsi 1:0:67:0: enclosure logical id(0x5000ccab04037180), slot(13) [Mon Dec 9 06:17:19 2019][ 13.925540] scsi 1:0:67:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 13.932344] scsi 1:0:67:0: serial_number( 7SHPT11W) [Mon Dec 9 06:17:19 2019][ 13.937833] scsi 1:0:67:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 13.959124] mpt3sas_cm0: detecting: handle(0x00e4), sas_address(0x5000cca2526016ed), phy(5) [Mon Dec 9 06:17:19 2019][ 13.967477] mpt3sas_cm0: REPORT_LUNS: handle(0x00e4), retries(0) [Mon Dec 9 06:17:19 2019][ 13.973660] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e4), lun(0) [Mon Dec 9 06:17:19 2019][ 14.005489] scsi 1:0:68:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.013799] scsi 1:0:68:0: SSP: handle(0x00e4), sas_addr(0x5000cca2526016ed), phy(5), device_name(0x5000cca2526016ef) [Mon Dec 9 06:17:19 2019][ 14.024396] scsi 1:0:68:0: enclosure logical id(0x5000ccab04037180), slot(14) [Mon Dec 9 06:17:19 2019][ 14.031616] scsi 1:0:68:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.038407] scsi 1:0:68:0: serial_number( 7SHPV6WW) [Mon Dec 9 06:17:19 2019][ 14.043899] scsi 1:0:68:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.064130] mpt3sas_cm0: detecting: handle(0x00e5), sas_address(0x5000cca2525f4871), phy(6) [Mon Dec 9 06:17:19 2019][ 14.072485] mpt3sas_cm0: REPORT_LUNS: handle(0x00e5), retries(0) [Mon Dec 9 06:17:19 2019][ 14.078627] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e5), lun(0) [Mon Dec 9 06:17:19 2019][ 14.085235] scsi 1:0:69:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.093540] scsi 1:0:69:0: SSP: handle(0x00e5), sas_addr(0x5000cca2525f4871), phy(6), device_name(0x5000cca2525f4873) [Mon Dec 9 06:17:19 2019][ 14.104140] scsi 1:0:69:0: enclosure logical id(0x5000ccab04037180), slot(15) [Mon Dec 9 06:17:19 2019][ 14.111360] scsi 1:0:69:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.118161] scsi 1:0:69:0: serial_number( 7SHPDGLW) [Mon Dec 9 06:17:19 2019][ 14.123650] scsi 1:0:69:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.146130] mpt3sas_cm0: detecting: handle(0x00e6), sas_address(0x5000cca2525f568d), phy(7) [Mon Dec 9 06:17:19 2019][ 14.154483] mpt3sas_cm0: REPORT_LUNS: handle(0x00e6), retries(0) [Mon Dec 9 06:17:19 2019][ 14.160652] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e6), lun(0) [Mon Dec 9 06:17:19 2019][ 14.167273] scsi 1:0:70:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.175580] scsi 1:0:70:0: SSP: handle(0x00e6), sas_addr(0x5000cca2525f568d), phy(7), device_name(0x5000cca2525f568f) [Mon Dec 9 06:17:19 2019][ 14.186180] scsi 1:0:70:0: enclosure logical id(0x5000ccab04037180), slot(16) [Mon Dec 9 06:17:19 2019][ 14.193399] scsi 1:0:70:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.200206] scsi 1:0:70:0: serial_number( 7SHPEDRW) [Mon 
Dec 9 06:17:19 2019][ 14.205692] scsi 1:0:70:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.228132] mpt3sas_cm0: detecting: handle(0x00e7), sas_address(0x5000cca2525f6c25), phy(8) [Mon Dec 9 06:17:19 2019][ 14.236481] mpt3sas_cm0: REPORT_LUNS: handle(0x00e7), retries(0) [Mon Dec 9 06:17:19 2019][ 14.242622] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e7), lun(0) [Mon Dec 9 06:17:19 2019][ 14.254153] scsi 1:0:71:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.262443] scsi 1:0:71:0: SSP: handle(0x00e7), sas_addr(0x5000cca2525f6c25), phy(8), device_name(0x5000cca2525f6c27) [Mon Dec 9 06:17:19 2019][ 14.273040] scsi 1:0:71:0: enclosure logical id(0x5000ccab04037180), slot(17) [Mon Dec 9 06:17:19 2019][ 14.280259] scsi 1:0:71:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.287050] scsi 1:0:71:0: serial_number( 7SHPGV9W) [Mon Dec 9 06:17:19 2019][ 14.292533] scsi 1:0:71:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.315135] mpt3sas_cm0: detecting: handle(0x00e8), sas_address(0x5000cca2525ed401), phy(9) [Mon Dec 9 06:17:19 2019][ 14.323488] mpt3sas_cm0: REPORT_LUNS: handle(0x00e8), retries(0) [Mon Dec 9 06:17:19 2019][ 14.329628] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e8), lun(0) [Mon Dec 9 06:17:19 2019][ 14.336258] scsi 1:0:72:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.344556] scsi 1:0:72:0: SSP: handle(0x00e8), sas_addr(0x5000cca2525ed401), phy(9), device_name(0x5000cca2525ed403) [Mon Dec 9 06:17:19 2019][ 14.355158] scsi 1:0:72:0: enclosure logical id(0x5000ccab04037180), slot(18) [Mon Dec 9 06:17:19 2019][ 14.362378] scsi 1:0:72:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.369182] scsi 1:0:72:0: serial_number( 7SHP4R6W) [Mon Dec 9 06:17:19 2019][ 14.374671] scsi 1:0:72:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.397136] mpt3sas_cm0: detecting: handle(0x00e9), sas_address(0x5000cca2525e0405), phy(10) [Mon Dec 9 06:17:19 2019][ 14.405570] mpt3sas_cm0: REPORT_LUNS: handle(0x00e9), retries(0) [Mon Dec 9 06:17:19 2019][ 14.411737] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e9), lun(0) [Mon Dec 9 06:17:19 2019][ 14.418352] scsi 1:0:73:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.426667] scsi 1:0:73:0: SSP: handle(0x00e9), sas_addr(0x5000cca2525e0405), phy(10), device_name(0x5000cca2525e0407) [Mon Dec 9 06:17:19 2019][ 14.437355] scsi 1:0:73:0: enclosure logical id(0x5000ccab04037180), slot(19) [Mon Dec 9 06:17:19 2019][ 14.444575] scsi 1:0:73:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.451380] scsi 1:0:73:0: serial_number( 7SHNPVUW) [Mon Dec 9 06:17:19 2019][ 14.456868] scsi 1:0:73:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.478140] mpt3sas_cm0: detecting: handle(0x00ea), sas_address(0x5000cca2525ea9e5), phy(11) [Mon Dec 9 06:17:19 2019][ 14.486577] mpt3sas_cm0: REPORT_LUNS: handle(0x00ea), retries(0) [Mon Dec 9 06:17:19 2019][ 14.493255] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ea), lun(0) [Mon Dec 9 06:17:19 2019][ 14.499876] scsi 1:0:74:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.508186] scsi 1:0:74:0: SSP: handle(0x00ea), sas_addr(0x5000cca2525ea9e5), phy(11), device_name(0x5000cca2525ea9e7) [Mon 
Dec 9 06:17:19 2019][ 14.518875] scsi 1:0:74:0: enclosure logical id(0x5000ccab04037180), slot(20) [Mon Dec 9 06:17:19 2019][ 14.526095] scsi 1:0:74:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.532904] scsi 1:0:74:0: serial_number( 7SHP1X8W) [Mon Dec 9 06:17:19 2019][ 14.538397] scsi 1:0:74:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.558139] mpt3sas_cm0: detecting: handle(0x00eb), sas_address(0x5000cca2525f1d39), phy(12) [Mon Dec 9 06:17:19 2019][ 14.566584] mpt3sas_cm0: REPORT_LUNS: handle(0x00eb), retries(0) [Mon Dec 9 06:17:19 2019][ 14.572745] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00eb), lun(0) [Mon Dec 9 06:17:19 2019][ 14.579363] scsi 1:0:75:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.587665] scsi 1:0:75:0: SSP: handle(0x00eb), sas_addr(0x5000cca2525f1d39), phy(12), device_name(0x5000cca2525f1d3b) [Mon Dec 9 06:17:19 2019][ 14.598352] scsi 1:0:75:0: enclosure logical id(0x5000ccab04037180), slot(21) [Mon Dec 9 06:17:19 2019][ 14.605569] scsi 1:0:75:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.612376] scsi 1:0:75:0: serial_number( 7SHP9LBW) [Mon Dec 9 06:17:19 2019][ 14.617862] scsi 1:0:75:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.640140] mpt3sas_cm0: detecting: handle(0x00ec), sas_address(0x5000cca2525ea499), phy(13) [Mon Dec 9 06:17:19 2019][ 14.648580] mpt3sas_cm0: REPORT_LUNS: handle(0x00ec), retries(0) [Mon Dec 9 06:17:19 2019][ 14.654746] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ec), lun(0) [Mon Dec 9 06:17:19 2019][ 14.661569] scsi 1:0:76:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.669878] scsi 1:0:76:0: SSP: handle(0x00ec), sas_addr(0x5000cca2525ea499), phy(13), device_name(0x5000cca2525ea49b) [Mon Dec 9 06:17:19 2019][ 14.680565] scsi 1:0:76:0: enclosure logical id(0x5000ccab04037180), slot(22) [Mon Dec 9 06:17:19 2019][ 14.687784] scsi 1:0:76:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.694592] scsi 1:0:76:0: serial_number( 7SHP1KAW) [Mon Dec 9 06:17:19 2019][ 14.700079] scsi 1:0:76:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.725144] mpt3sas_cm0: detecting: handle(0x00ed), sas_address(0x5000cca2525fba05), phy(14) [Mon Dec 9 06:17:19 2019][ 14.733584] mpt3sas_cm0: REPORT_LUNS: handle(0x00ed), retries(0) [Mon Dec 9 06:17:19 2019][ 14.739718] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ed), lun(0) [Mon Dec 9 06:17:19 2019][ 14.789076] scsi 1:0:77:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 14.797376] scsi 1:0:77:0: SSP: handle(0x00ed), sas_addr(0x5000cca2525fba05), phy(14), device_name(0x5000cca2525fba07) [Mon Dec 9 06:17:20 2019][ 14.808063] scsi 1:0:77:0: enclosure logical id(0x5000ccab04037180), slot(23) [Mon Dec 9 06:17:20 2019][ 14.815283] scsi 1:0:77:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 14.822073] scsi 1:0:77:0: serial_number( 7SHPN12W) [Mon Dec 9 06:17:20 2019][ 14.827558] scsi 1:0:77:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 14.856154] mpt3sas_cm0: detecting: handle(0x00ee), sas_address(0x5000cca2525e121d), phy(15) [Mon Dec 9 06:17:20 2019][ 14.864595] mpt3sas_cm0: REPORT_LUNS: handle(0x00ee), retries(0) [Mon Dec 9 06:17:20 2019][ 14.870737] mpt3sas_cm0: 
TEST_UNIT_READY: handle(0x00ee), lun(0) [Mon Dec 9 06:17:20 2019][ 14.877356] scsi 1:0:78:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 14.885656] scsi 1:0:78:0: SSP: handle(0x00ee), sas_addr(0x5000cca2525e121d), phy(15), device_name(0x5000cca2525e121f) [Mon Dec 9 06:17:20 2019][ 14.896346] scsi 1:0:78:0: enclosure logical id(0x5000ccab04037180), slot(24) [Mon Dec 9 06:17:20 2019][ 14.903563] scsi 1:0:78:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 14.910369] scsi 1:0:78:0: serial_number( 7SHNRTXW) [Mon Dec 9 06:17:20 2019][ 14.915855] scsi 1:0:78:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 14.938146] mpt3sas_cm0: detecting: handle(0x00ef), sas_address(0x5000cca2525e98f5), phy(16) [Mon Dec 9 06:17:20 2019][ 14.946585] mpt3sas_cm0: REPORT_LUNS: handle(0x00ef), retries(0) [Mon Dec 9 06:17:20 2019][ 14.952725] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ef), lun(0) [Mon Dec 9 06:17:20 2019][ 14.959338] scsi 1:0:79:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 14.967638] scsi 1:0:79:0: SSP: handle(0x00ef), sas_addr(0x5000cca2525e98f5), phy(16), device_name(0x5000cca2525e98f7) [Mon Dec 9 06:17:20 2019][ 14.978326] scsi 1:0:79:0: enclosure logical id(0x5000ccab04037180), slot(25) [Mon Dec 9 06:17:20 2019][ 14.985543] scsi 1:0:79:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 14.992349] scsi 1:0:79:0: serial_number( 7SHP0T9W) [Mon Dec 9 06:17:20 2019][ 14.997838] scsi 1:0:79:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.021146] mpt3sas_cm0: detecting: handle(0x00f0), sas_address(0x5000cca2525f8175), phy(17) [Mon Dec 9 06:17:20 2019][ 15.029585] mpt3sas_cm0: REPORT_LUNS: handle(0x00f0), retries(0) [Mon Dec 9 06:17:20 2019][ 15.035715] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f0), lun(0) [Mon Dec 9 06:17:20 2019][ 15.042329] scsi 1:0:80:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.050627] scsi 1:0:80:0: SSP: handle(0x00f0), sas_addr(0x5000cca2525f8175), phy(17), device_name(0x5000cca2525f8177) [Mon Dec 9 06:17:20 2019][ 15.061309] scsi 1:0:80:0: enclosure logical id(0x5000ccab04037180), slot(26) [Mon Dec 9 06:17:20 2019][ 15.068531] scsi 1:0:80:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.075336] scsi 1:0:80:0: serial_number( 7SHPJ89W) [Mon Dec 9 06:17:20 2019][ 15.080822] scsi 1:0:80:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.103149] mpt3sas_cm0: detecting: handle(0x00f1), sas_address(0x5000cca2525fb01d), phy(18) [Mon Dec 9 06:17:20 2019][ 15.111581] mpt3sas_cm0: REPORT_LUNS: handle(0x00f1), retries(0) [Mon Dec 9 06:17:20 2019][ 15.117715] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f1), lun(0) [Mon Dec 9 06:17:20 2019][ 15.124332] scsi 1:0:81:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.132636] scsi 1:0:81:0: SSP: handle(0x00f1), sas_addr(0x5000cca2525fb01d), phy(18), device_name(0x5000cca2525fb01f) [Mon Dec 9 06:17:20 2019][ 15.143326] scsi 1:0:81:0: enclosure logical id(0x5000ccab04037180), slot(27) [Mon Dec 9 06:17:20 2019][ 15.150545] scsi 1:0:81:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.157349] scsi 1:0:81:0: serial_number( 7SHPMBMW) [Mon Dec 9 06:17:20 2019][ 15.162835] scsi 1:0:81:0: qdepth(254), tagged(1), simple(0), ordered(0), 
scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.186155] mpt3sas_cm0: detecting: handle(0x00f2), sas_address(0x5000cca2525ed549), phy(19) [Mon Dec 9 06:17:20 2019][ 15.194593] mpt3sas_cm0: REPORT_LUNS: handle(0x00f2), retries(0) [Mon Dec 9 06:17:20 2019][ 15.200727] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f2), lun(0) [Mon Dec 9 06:17:20 2019][ 15.207334] scsi 1:0:82:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.215630] scsi 1:0:82:0: SSP: handle(0x00f2), sas_addr(0x5000cca2525ed549), phy(19), device_name(0x5000cca2525ed54b) [Mon Dec 9 06:17:20 2019][ 15.226320] scsi 1:0:82:0: enclosure logical id(0x5000ccab04037180), slot(28) [Mon Dec 9 06:17:20 2019][ 15.233539] scsi 1:0:82:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.240345] scsi 1:0:82:0: serial_number( 7SHP4TVW) [Mon Dec 9 06:17:20 2019][ 15.245831] scsi 1:0:82:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.266153] mpt3sas_cm0: detecting: handle(0x00f3), sas_address(0x5000cca2525fa035), phy(20) [Mon Dec 9 06:17:20 2019][ 15.274590] mpt3sas_cm0: REPORT_LUNS: handle(0x00f3), retries(0) [Mon Dec 9 06:17:20 2019][ 15.280733] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f3), lun(0) [Mon Dec 9 06:17:20 2019][ 15.287347] scsi 1:0:83:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.295655] scsi 1:0:83:0: SSP: handle(0x00f3), sas_addr(0x5000cca2525fa035), phy(20), device_name(0x5000cca2525fa037) [Mon Dec 9 06:17:20 2019][ 15.306340] scsi 1:0:83:0: enclosure logical id(0x5000ccab04037180), slot(29) [Mon Dec 9 06:17:20 2019][ 15.313560] scsi 1:0:83:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.320366] scsi 1:0:83:0: serial_number( 7SHPL9TW) [Mon Dec 9 06:17:20 2019][ 15.325854] scsi 1:0:83:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.349158] mpt3sas_cm0: detecting: handle(0x00f4), sas_address(0x5000cca2525fb941), phy(21) [Mon Dec 9 06:17:20 2019][ 15.357593] mpt3sas_cm0: REPORT_LUNS: handle(0x00f4), retries(0) [Mon Dec 9 06:17:20 2019][ 15.363731] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f4), lun(0) [Mon Dec 9 06:17:20 2019][ 15.370345] scsi 1:0:84:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.378645] scsi 1:0:84:0: SSP: handle(0x00f4), sas_addr(0x5000cca2525fb941), phy(21), device_name(0x5000cca2525fb943) [Mon Dec 9 06:17:20 2019][ 15.389335] scsi 1:0:84:0: enclosure logical id(0x5000ccab04037180), slot(30) [Mon Dec 9 06:17:20 2019][ 15.396555] scsi 1:0:84:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.403359] scsi 1:0:84:0: serial_number( 7SHPMZHW) [Mon Dec 9 06:17:20 2019][ 15.408847] scsi 1:0:84:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.429747] mpt3sas_cm0: detecting: handle(0x00f5), sas_address(0x5000cca2525e22e5), phy(22) [Mon Dec 9 06:17:20 2019][ 15.438185] mpt3sas_cm0: REPORT_LUNS: handle(0x00f5), retries(0) [Mon Dec 9 06:17:20 2019][ 15.444315] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f5), lun(0) [Mon Dec 9 06:17:20 2019][ 15.450935] scsi 1:0:85:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.459241] scsi 1:0:85:0: SSP: handle(0x00f5), sas_addr(0x5000cca2525e22e5), phy(22), device_name(0x5000cca2525e22e7) [Mon Dec 9 06:17:20 2019][ 15.469927] scsi 1:0:85:0: enclosure logical id(0x5000ccab04037180), 
slot(31) [Mon Dec 9 06:17:20 2019][ 15.477147] scsi 1:0:85:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.483951] scsi 1:0:85:0: serial_number( 7SHNSXKW) [Mon Dec 9 06:17:20 2019][ 15.489442] scsi 1:0:85:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.509164] mpt3sas_cm0: detecting: handle(0x00f6), sas_address(0x5000cca2525fb5bd), phy(23) [Mon Dec 9 06:17:20 2019][ 15.517601] mpt3sas_cm0: REPORT_LUNS: handle(0x00f6), retries(0) [Mon Dec 9 06:17:20 2019][ 15.523788] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f6), lun(0) [Mon Dec 9 06:17:20 2019][ 15.530401] scsi 1:0:86:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.538705] scsi 1:0:86:0: SSP: handle(0x00f6), sas_addr(0x5000cca2525fb5bd), phy(23), device_name(0x5000cca2525fb5bf) [Mon Dec 9 06:17:20 2019][ 15.549394] scsi 1:0:86:0: enclosure logical id(0x5000ccab04037180), slot(32) [Mon Dec 9 06:17:20 2019][ 15.556613] scsi 1:0:86:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.563421] scsi 1:0:86:0: serial_number( 7SHPMS7W) [Mon Dec 9 06:17:20 2019][ 15.568909] scsi 1:0:86:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.589160] mpt3sas_cm0: detecting: handle(0x00f7), sas_address(0x5000cca2525eb77d), phy(24) [Mon Dec 9 06:17:20 2019][ 15.597596] mpt3sas_cm0: REPORT_LUNS: handle(0x00f7), retries(0) [Mon Dec 9 06:17:20 2019][ 15.603763] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f7), lun(0) [Mon Dec 9 06:17:20 2019][ 15.610376] scsi 1:0:87:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.618679] scsi 1:0:87:0: SSP: handle(0x00f7), sas_addr(0x5000cca2525eb77d), phy(24), device_name(0x5000cca2525eb77f) [Mon Dec 9 06:17:20 2019][ 15.629363] scsi 1:0:87:0: enclosure logical id(0x5000ccab04037180), slot(33) [Mon Dec 9 06:17:20 2019][ 15.636582] scsi 1:0:87:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.643386] scsi 1:0:87:0: serial_number( 7SHP2UAW) [Mon Dec 9 06:17:20 2019][ 15.648875] scsi 1:0:87:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.671164] mpt3sas_cm0: detecting: handle(0x00f8), sas_address(0x5000cca2525e1139), phy(25) [Mon Dec 9 06:17:20 2019][ 15.679601] mpt3sas_cm0: REPORT_LUNS: handle(0x00f8), retries(0) [Mon Dec 9 06:17:20 2019][ 15.685766] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f8), lun(0) [Mon Dec 9 06:17:20 2019][ 15.729390] scsi 1:0:88:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.737813] scsi 1:0:88:0: SSP: handle(0x00f8), sas_addr(0x5000cca2525e1139), phy(25), device_name(0x5000cca2525e113b) [Mon Dec 9 06:17:20 2019][ 15.748500] scsi 1:0:88:0: enclosure logical id(0x5000ccab04037180), slot(34) [Mon Dec 9 06:17:20 2019][ 15.755717] scsi 1:0:88:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.762509] scsi 1:0:88:0: serial_number( 7SHNRS2W) [Mon Dec 9 06:17:20 2019][ 15.767994] scsi 1:0:88:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.788166] mpt3sas_cm0: detecting: handle(0x00f9), sas_address(0x5000cca2526014f9), phy(26) [Mon Dec 9 06:17:21 2019][ 15.796605] mpt3sas_cm0: REPORT_LUNS: handle(0x00f9), retries(0) [Mon Dec 9 06:17:21 2019][ 15.802735] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f9), lun(0) [Mon Dec 9 06:17:21 2019][ 15.810510] scsi 1:0:89:0: 
Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 15.818854] scsi 1:0:89:0: SSP: handle(0x00f9), sas_addr(0x5000cca2526014f9), phy(26), device_name(0x5000cca2526014fb) [Mon Dec 9 06:17:21 2019][ 15.829544] scsi 1:0:89:0: enclosure logical id(0x5000ccab04037180), slot(35) [Mon Dec 9 06:17:21 2019][ 15.836763] scsi 1:0:89:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 15.843567] scsi 1:0:89:0: serial_number( 7SHPV2VW) [Mon Dec 9 06:17:21 2019][ 15.849054] scsi 1:0:89:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 15.873233] mpt3sas_cm0: detecting: handle(0x00fa), sas_address(0x5000cca252598785), phy(27) [Mon Dec 9 06:17:21 2019][ 15.881670] mpt3sas_cm0: REPORT_LUNS: handle(0x00fa), retries(0) [Mon Dec 9 06:17:21 2019][ 15.887834] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00fa), lun(0) [Mon Dec 9 06:17:21 2019][ 15.894455] scsi 1:0:90:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 15.902803] scsi 1:0:90:0: SSP: handle(0x00fa), sas_addr(0x5000cca252598785), phy(27), device_name(0x5000cca252598787) [Mon Dec 9 06:17:21 2019][ 15.913490] scsi 1:0:90:0: enclosure logical id(0x5000ccab04037180), slot(36) [Mon Dec 9 06:17:21 2019][ 15.920710] scsi 1:0:90:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 15.927515] scsi 1:0:90:0: serial_number( 7SHL7BRW) [Mon Dec 9 06:17:21 2019][ 15.933003] scsi 1:0:90:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 15.953179] mpt3sas_cm0: detecting: handle(0x00fb), sas_address(0x5000cca2525f5365), phy(28) [Mon Dec 9 06:17:21 2019][ 15.961615] mpt3sas_cm0: REPORT_LUNS: handle(0x00fb), retries(0) [Mon Dec 9 06:17:21 2019][ 15.967778] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00fb), lun(0) [Mon Dec 9 06:17:21 2019][ 15.974542] scsi 1:0:91:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 15.982847] scsi 1:0:91:0: SSP: handle(0x00fb), sas_addr(0x5000cca2525f5365), phy(28), device_name(0x5000cca2525f5367) [Mon Dec 9 06:17:21 2019][ 15.993529] scsi 1:0:91:0: enclosure logical id(0x5000ccab04037180), slot(37) [Mon Dec 9 06:17:21 2019][ 16.000749] scsi 1:0:91:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.007553] scsi 1:0:91:0: serial_number( 7SHPE66W) [Mon Dec 9 06:17:21 2019][ 16.013042] scsi 1:0:91:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.033172] mpt3sas_cm0: detecting: handle(0x00fc), sas_address(0x5000cca2525e263d), phy(29) [Mon Dec 9 06:17:21 2019][ 16.041609] mpt3sas_cm0: REPORT_LUNS: handle(0x00fc), retries(0) [Mon Dec 9 06:17:21 2019][ 16.047767] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00fc), lun(0) [Mon Dec 9 06:17:21 2019][ 16.054436] scsi 1:0:92:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.062741] scsi 1:0:92:0: SSP: handle(0x00fc), sas_addr(0x5000cca2525e263d), phy(29), device_name(0x5000cca2525e263f) [Mon Dec 9 06:17:21 2019][ 16.073428] scsi 1:0:92:0: enclosure logical id(0x5000ccab04037180), slot(38) [Mon Dec 9 06:17:21 2019][ 16.080647] scsi 1:0:92:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.087455] scsi 1:0:92:0: serial_number( 7SHNT4GW) [Mon Dec 9 06:17:21 2019][ 16.092941] scsi 1:0:92:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.127183] mpt3sas_cm0: detecting: 
handle(0x00fd), sas_address(0x5000cca2525f6081), phy(30) [Mon Dec 9 06:17:21 2019][ 16.135618] mpt3sas_cm0: REPORT_LUNS: handle(0x00fd), retries(0) [Mon Dec 9 06:17:21 2019][ 16.142279] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00fd), lun(0) [Mon Dec 9 06:17:21 2019][ 16.186312] scsi 1:0:93:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.194635] scsi 1:0:93:0: SSP: handle(0x00fd), sas_addr(0x5000cca2525f6081), phy(30), device_name(0x5000cca2525f6083) [Mon Dec 9 06:17:21 2019][ 16.205323] scsi 1:0:93:0: enclosure logical id(0x5000ccab04037180), slot(39) [Mon Dec 9 06:17:21 2019][ 16.212540] scsi 1:0:93:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.219331] scsi 1:0:93:0: serial_number( 7SHPG28W) [Mon Dec 9 06:17:21 2019][ 16.224817] scsi 1:0:93:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.253189] mpt3sas_cm0: detecting: handle(0x00fe), sas_address(0x5000cca2525ec83d), phy(31) [Mon Dec 9 06:17:21 2019][ 16.261629] mpt3sas_cm0: REPORT_LUNS: handle(0x00fe), retries(0) [Mon Dec 9 06:17:21 2019][ 16.267770] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00fe), lun(0) [Mon Dec 9 06:17:21 2019][ 16.274376] scsi 1:0:94:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.282685] scsi 1:0:94:0: SSP: handle(0x00fe), sas_addr(0x5000cca2525ec83d), phy(31), device_name(0x5000cca2525ec83f) [Mon Dec 9 06:17:21 2019][ 16.293367] scsi 1:0:94:0: enclosure logical id(0x5000ccab04037180), slot(40) [Mon Dec 9 06:17:21 2019][ 16.300589] scsi 1:0:94:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.307392] scsi 1:0:94:0: serial_number( 7SHP3XXW) [Mon Dec 9 06:17:21 2019][ 16.312881] scsi 1:0:94:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.342179] mpt3sas_cm0: detecting: handle(0x00ff), sas_address(0x5000cca2525ec019), phy(32) [Mon Dec 9 06:17:21 2019][ 16.350634] mpt3sas_cm0: REPORT_LUNS: handle(0x00ff), retries(0) [Mon Dec 9 06:17:21 2019][ 16.356755] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ff), lun(0) [Mon Dec 9 06:17:21 2019][ 16.363537] scsi 1:0:95:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.372787] scsi 1:0:95:0: SSP: handle(0x00ff), sas_addr(0x5000cca2525ec019), phy(32), device_name(0x5000cca2525ec01b) [Mon Dec 9 06:17:21 2019][ 16.383469] scsi 1:0:95:0: enclosure logical id(0x5000ccab04037180), slot(41) [Mon Dec 9 06:17:21 2019][ 16.390689] scsi 1:0:95:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.397496] scsi 1:0:95:0: serial_number( 7SHP3D3W) [Mon Dec 9 06:17:21 2019][ 16.402982] scsi 1:0:95:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.423183] mpt3sas_cm0: detecting: handle(0x0100), sas_address(0x5000cca2525ec559), phy(33) [Mon Dec 9 06:17:21 2019][ 16.431618] mpt3sas_cm0: REPORT_LUNS: handle(0x0100), retries(0) [Mon Dec 9 06:17:21 2019][ 16.437783] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0100), lun(0) [Mon Dec 9 06:17:21 2019][ 16.444556] scsi 1:0:96:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.452863] scsi 1:0:96:0: SSP: handle(0x0100), sas_addr(0x5000cca2525ec559), phy(33), device_name(0x5000cca2525ec55b) [Mon Dec 9 06:17:21 2019][ 16.463552] scsi 1:0:96:0: enclosure logical id(0x5000ccab04037180), slot(42) [Mon Dec 9 06:17:21 2019][ 16.470772] scsi 1:0:96:0: enclosure level(0x0000), 
connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.477577] scsi 1:0:96:0: serial_number( 7SHP3RYW) [Mon Dec 9 06:17:21 2019][ 16.483064] scsi 1:0:96:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.503185] mpt3sas_cm0: detecting: handle(0x0101), sas_address(0x5000cca2525fd4a1), phy(34) [Mon Dec 9 06:17:21 2019][ 16.511624] mpt3sas_cm0: REPORT_LUNS: handle(0x0101), retries(0) [Mon Dec 9 06:17:21 2019][ 16.517799] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0101), lun(0) [Mon Dec 9 06:17:21 2019][ 16.524587] scsi 1:0:97:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.532892] scsi 1:0:97:0: SSP: handle(0x0101), sas_addr(0x5000cca2525fd4a1), phy(34), device_name(0x5000cca2525fd4a3) [Mon Dec 9 06:17:21 2019][ 16.543582] scsi 1:0:97:0: enclosure logical id(0x5000ccab04037180), slot(43) [Mon Dec 9 06:17:21 2019][ 16.550801] scsi 1:0:97:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.557607] scsi 1:0:97:0: serial_number( 7SHPPU0W) [Mon Dec 9 06:17:21 2019][ 16.563099] scsi 1:0:97:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.583186] mpt3sas_cm0: detecting: handle(0x0102), sas_address(0x5000cca2525eb5f5), phy(35) [Mon Dec 9 06:17:21 2019][ 16.591629] mpt3sas_cm0: REPORT_LUNS: handle(0x0102), retries(0) [Mon Dec 9 06:17:21 2019][ 16.597771] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0102), lun(0) [Mon Dec 9 06:17:21 2019][ 16.604392] scsi 1:0:98:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.612692] scsi 1:0:98:0: SSP: handle(0x0102), sas_addr(0x5000cca2525eb5f5), phy(35), device_name(0x5000cca2525eb5f7) [Mon Dec 9 06:17:21 2019][ 16.623378] scsi 1:0:98:0: enclosure logical id(0x5000ccab04037180), slot(44) [Mon Dec 9 06:17:21 2019][ 16.630598] scsi 1:0:98:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.637402] scsi 1:0:98:0: serial_number( 7SHP2R5W) [Mon Dec 9 06:17:21 2019][ 16.642889] scsi 1:0:98:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.666186] mpt3sas_cm0: detecting: handle(0x0103), sas_address(0x5000cca2525ebeb1), phy(36) [Mon Dec 9 06:17:21 2019][ 16.674621] mpt3sas_cm0: REPORT_LUNS: handle(0x0103), retries(0) [Mon Dec 9 06:17:21 2019][ 16.680754] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0103), lun(0) [Mon Dec 9 06:17:21 2019][ 16.689115] scsi 1:0:99:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.697422] scsi 1:0:99:0: SSP: handle(0x0103), sas_addr(0x5000cca2525ebeb1), phy(36), device_name(0x5000cca2525ebeb3) [Mon Dec 9 06:17:21 2019][ 16.708106] scsi 1:0:99:0: enclosure logical id(0x5000ccab04037180), slot(45) [Mon Dec 9 06:17:21 2019][ 16.715323] scsi 1:0:99:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.722128] scsi 1:0:99:0: serial_number( 7SHP396W) [Mon Dec 9 06:17:21 2019][ 16.727617] scsi 1:0:99:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.750191] mpt3sas_cm0: detecting: handle(0x0104), sas_address(0x5000cca2525f2919), phy(37) [Mon Dec 9 06:17:21 2019][ 16.758628] mpt3sas_cm0: REPORT_LUNS: handle(0x0104), retries(0) [Mon Dec 9 06:17:21 2019][ 16.764805] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0104), lun(0) [Mon Dec 9 06:17:21 2019][ 16.783723] scsi 1:0:100:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.792107] 
scsi 1:0:100:0: SSP: handle(0x0104), sas_addr(0x5000cca2525f2919), phy(37), device_name(0x5000cca2525f291b) [Mon Dec 9 06:17:22 2019][ 16.802879] scsi 1:0:100:0: enclosure logical id(0x5000ccab04037180), slot(46) [Mon Dec 9 06:17:22 2019][ 16.810183] scsi 1:0:100:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 16.817060] scsi 1:0:100:0: serial_number( 7SHPABWW) [Mon Dec 9 06:17:22 2019][ 16.822631] scsi 1:0:100:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 16.845209] mpt3sas_cm0: detecting: handle(0x0105), sas_address(0x5000cca252602c0d), phy(38) [Mon Dec 9 06:17:22 2019][ 16.853644] mpt3sas_cm0: REPORT_LUNS: handle(0x0105), retries(0) [Mon Dec 9 06:17:22 2019][ 16.859778] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0105), lun(0) [Mon Dec 9 06:17:22 2019][ 16.879826] scsi 1:0:101:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 16.888194] scsi 1:0:101:0: SSP: handle(0x0105), sas_addr(0x5000cca252602c0d), phy(38), device_name(0x5000cca252602c0f) [Mon Dec 9 06:17:22 2019][ 16.898968] scsi 1:0:101:0: enclosure logical id(0x5000ccab04037180), slot(47) [Mon Dec 9 06:17:22 2019][ 16.906272] scsi 1:0:101:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 16.913151] scsi 1:0:101:0: serial_number( 7SHPWMHW) [Mon Dec 9 06:17:22 2019][ 16.918721] scsi 1:0:101:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 16.939195] mpt3sas_cm0: detecting: handle(0x0106), sas_address(0x5000cca2525e7cfd), phy(39) [Mon Dec 9 06:17:22 2019][ 16.947635] mpt3sas_cm0: REPORT_LUNS: handle(0x0106), retries(0) [Mon Dec 9 06:17:22 2019][ 16.953767] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0106), lun(0) [Mon Dec 9 06:17:22 2019][ 16.960380] scsi 1:0:102:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 16.968765] scsi 1:0:102:0: SSP: handle(0x0106), sas_addr(0x5000cca2525e7cfd), phy(39), device_name(0x5000cca2525e7cff) [Mon Dec 9 06:17:22 2019][ 16.979535] scsi 1:0:102:0: enclosure logical id(0x5000ccab04037180), slot(48) [Mon Dec 9 06:17:22 2019][ 16.986839] scsi 1:0:102:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 16.993733] scsi 1:0:102:0: serial_number( 7SHNYXKW) [Mon Dec 9 06:17:22 2019][ 16.999308] scsi 1:0:102:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.023198] mpt3sas_cm0: detecting: handle(0x0107), sas_address(0x5000cca2525f6a31), phy(40) [Mon Dec 9 06:17:22 2019][ 17.031635] mpt3sas_cm0: REPORT_LUNS: handle(0x0107), retries(0) [Mon Dec 9 06:17:22 2019][ 17.037794] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0107), lun(0) [Mon Dec 9 06:17:22 2019][ 17.045237] scsi 1:0:103:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.053623] scsi 1:0:103:0: SSP: handle(0x0107), sas_addr(0x5000cca2525f6a31), phy(40), device_name(0x5000cca2525f6a33) [Mon Dec 9 06:17:22 2019][ 17.064393] scsi 1:0:103:0: enclosure logical id(0x5000ccab04037180), slot(49) [Mon Dec 9 06:17:22 2019][ 17.071697] scsi 1:0:103:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 17.078594] scsi 1:0:103:0: serial_number( 7SHPGR8W) [Mon Dec 9 06:17:22 2019][ 17.084173] scsi 1:0:103:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.104197] mpt3sas_cm0: detecting: handle(0x0108), sas_address(0x5000cca2525f7f25), phy(41) [Mon Dec 9 
06:17:22 2019][ 17.112636] mpt3sas_cm0: REPORT_LUNS: handle(0x0108), retries(0) [Mon Dec 9 06:17:22 2019][ 17.118778] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0108), lun(0) [Mon Dec 9 06:17:22 2019][ 17.125418] scsi 1:0:104:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.133823] scsi 1:0:104:0: SSP: handle(0x0108), sas_addr(0x5000cca2525f7f25), phy(41), device_name(0x5000cca2525f7f27) [Mon Dec 9 06:17:22 2019][ 17.144595] scsi 1:0:104:0: enclosure logical id(0x5000ccab04037180), slot(50) [Mon Dec 9 06:17:22 2019][ 17.151902] scsi 1:0:104:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 17.158793] scsi 1:0:104:0: serial_number( 7SHPJ3JW) [Mon Dec 9 06:17:22 2019][ 17.164367] scsi 1:0:104:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.191206] mpt3sas_cm0: detecting: handle(0x0109), sas_address(0x5000cca2525eb4b1), phy(42) [Mon Dec 9 06:17:22 2019][ 17.199658] mpt3sas_cm0: REPORT_LUNS: handle(0x0109), retries(0) [Mon Dec 9 06:17:22 2019][ 17.205822] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0109), lun(0) [Mon Dec 9 06:17:22 2019][ 17.212648] scsi 1:0:105:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.221083] scsi 1:0:105:0: SSP: handle(0x0109), sas_addr(0x5000cca2525eb4b1), phy(42), device_name(0x5000cca2525eb4b3) [Mon Dec 9 06:17:22 2019][ 17.231852] scsi 1:0:105:0: enclosure logical id(0x5000ccab04037180), slot(51) [Mon Dec 9 06:17:22 2019][ 17.239160] scsi 1:0:105:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 17.246053] scsi 1:0:105:0: serial_number( 7SHP2MKW) [Mon Dec 9 06:17:22 2019][ 17.251624] scsi 1:0:105:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.274209] mpt3sas_cm0: detecting: handle(0x010a), sas_address(0x5000cca2525e1f9d), phy(43) [Mon Dec 9 06:17:22 2019][ 17.282646] mpt3sas_cm0: REPORT_LUNS: handle(0x010a), retries(0) [Mon Dec 9 06:17:22 2019][ 17.288782] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010a), lun(0) [Mon Dec 9 06:17:22 2019][ 17.295399] scsi 1:0:106:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.303786] scsi 1:0:106:0: SSP: handle(0x010a), sas_addr(0x5000cca2525e1f9d), phy(43), device_name(0x5000cca2525e1f9f) [Mon Dec 9 06:17:22 2019][ 17.314563] scsi 1:0:106:0: enclosure logical id(0x5000ccab04037180), slot(52) [Mon Dec 9 06:17:22 2019][ 17.321867] scsi 1:0:106:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 17.328758] scsi 1:0:106:0: serial_number( 7SHNSPTW) [Mon Dec 9 06:17:22 2019][ 17.334334] scsi 1:0:106:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.354793] mpt3sas_cm0: detecting: handle(0x010b), sas_address(0x5000cca2525e52fd), phy(44) [Mon Dec 9 06:17:22 2019][ 17.363231] mpt3sas_cm0: REPORT_LUNS: handle(0x010b), retries(0) [Mon Dec 9 06:17:22 2019][ 17.369394] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010b), lun(0) [Mon Dec 9 06:17:22 2019][ 17.376080] scsi 1:0:107:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.384470] scsi 1:0:107:0: SSP: handle(0x010b), sas_addr(0x5000cca2525e52fd), phy(44), device_name(0x5000cca2525e52ff) [Mon Dec 9 06:17:22 2019][ 17.395242] scsi 1:0:107:0: enclosure logical id(0x5000ccab04037180), slot(53) [Mon Dec 9 06:17:22 2019][ 17.402549] scsi 1:0:107:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 
17.409437] scsi 1:0:107:0: serial_number( 7SHNW3VW) [Mon Dec 9 06:17:22 2019][ 17.415014] scsi 1:0:107:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.435209] mpt3sas_cm0: detecting: handle(0x010c), sas_address(0x5000cca2525f4e71), phy(45) [Mon Dec 9 06:17:22 2019][ 17.443651] mpt3sas_cm0: REPORT_LUNS: handle(0x010c), retries(0) [Mon Dec 9 06:17:22 2019][ 17.449804] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010c), lun(0) [Mon Dec 9 06:17:22 2019][ 17.456405] scsi 1:0:108:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.464795] scsi 1:0:108:0: SSP: handle(0x010c), sas_addr(0x5000cca2525f4e71), phy(45), device_name(0x5000cca2525f4e73) [Mon Dec 9 06:17:22 2019][ 17.475567] scsi 1:0:108:0: enclosure logical id(0x5000ccab04037180), slot(54) [Mon Dec 9 06:17:22 2019][ 17.482871] scsi 1:0:108:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 17.489763] scsi 1:0:108:0: serial_number( 7SHPDVZW) [Mon Dec 9 06:17:22 2019][ 17.495339] scsi 1:0:108:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.515209] mpt3sas_cm0: detecting: handle(0x010d), sas_address(0x5000cca2525fd499), phy(46) [Mon Dec 9 06:17:22 2019][ 17.523648] mpt3sas_cm0: REPORT_LUNS: handle(0x010d), retries(0) [Mon Dec 9 06:17:22 2019][ 17.529788] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010d), lun(0) [Mon Dec 9 06:17:22 2019][ 17.536397] scsi 1:0:109:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.544789] scsi 1:0:109:0: SSP: handle(0x010d), sas_addr(0x5000cca2525fd499), phy(46), device_name(0x5000cca2525fd49b) [Mon Dec 9 06:17:22 2019][ 17.555562] scsi 1:0:109:0: enclosure logical id(0x5000ccab04037180), slot(55) [Mon Dec 9 06:17:22 2019][ 17.562866] scsi 1:0:109:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 17.569763] scsi 1:0:109:0: serial_number( 7SHPPTYW) [Mon Dec 9 06:17:22 2019][ 17.575335] scsi 1:0:109:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.598208] mpt3sas_cm0: detecting: handle(0x010e), sas_address(0x5000cca2525e7879), phy(47) [Mon Dec 9 06:17:22 2019][ 17.606649] mpt3sas_cm0: REPORT_LUNS: handle(0x010e), retries(0) [Mon Dec 9 06:17:22 2019][ 17.612788] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010e), lun(0) [Mon Dec 9 06:17:22 2019][ 17.619434] scsi 1:0:110:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.627828] scsi 1:0:110:0: SSP: handle(0x010e), sas_addr(0x5000cca2525e7879), phy(47), device_name(0x5000cca2525e787b) [Mon Dec 9 06:17:22 2019][ 17.638599] scsi 1:0:110:0: enclosure logical id(0x5000ccab04037180), slot(56) [Mon Dec 9 06:17:22 2019][ 17.645906] scsi 1:0:110:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 17.652798] scsi 1:0:110:0: serial_number( 7SHNYM7W) [Mon Dec 9 06:17:22 2019][ 17.658370] scsi 1:0:110:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.678216] mpt3sas_cm0: detecting: handle(0x010f), sas_address(0x5000cca2525ca199), phy(48) [Mon Dec 9 06:17:22 2019][ 17.686653] mpt3sas_cm0: REPORT_LUNS: handle(0x010f), retries(0) [Mon Dec 9 06:17:22 2019][ 17.692792] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010f), lun(0) [Mon Dec 9 06:17:22 2019][ 17.699410] scsi 1:0:111:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.707796] scsi 1:0:111:0: SSP: 
handle(0x010f), sas_addr(0x5000cca2525ca199), phy(48), device_name(0x5000cca2525ca19b) [Mon Dec 9 06:17:22 2019][ 17.718568] scsi 1:0:111:0: enclosure logical id(0x5000ccab04037180), slot(57) [Mon Dec 9 06:17:22 2019][ 17.725874] scsi 1:0:111:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 17.732765] scsi 1:0:111:0: serial_number( 7SHMY83W) [Mon Dec 9 06:17:22 2019][ 17.738341] scsi 1:0:111:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.758213] mpt3sas_cm0: detecting: handle(0x0110), sas_address(0x5000cca2525ffb89), phy(49) [Mon Dec 9 06:17:22 2019][ 17.766648] mpt3sas_cm0: REPORT_LUNS: handle(0x0110), retries(0) [Mon Dec 9 06:17:22 2019][ 17.772782] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0110), lun(0) [Mon Dec 9 06:17:22 2019][ 17.779564] scsi 1:0:112:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.787965] scsi 1:0:112:0: SSP: handle(0x0110), sas_addr(0x5000cca2525ffb89), phy(49), device_name(0x5000cca2525ffb8b) [Mon Dec 9 06:17:23 2019][ 17.798736] scsi 1:0:112:0: enclosure logical id(0x5000ccab04037180), slot(58) [Mon Dec 9 06:17:23 2019][ 17.806043] scsi 1:0:112:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 17.812936] scsi 1:0:112:0: serial_number( 7SHPTDAW) [Mon Dec 9 06:17:23 2019][ 17.818510] scsi 1:0:112:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 17.842215] mpt3sas_cm0: detecting: handle(0x0111), sas_address(0x5000cca2525f2669), phy(50) [Mon Dec 9 06:17:23 2019][ 17.850647] mpt3sas_cm0: REPORT_LUNS: handle(0x0111), retries(0) [Mon Dec 9 06:17:23 2019][ 17.856779] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0111), lun(0) [Mon Dec 9 06:17:23 2019][ 17.863397] scsi 1:0:113:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 17.871788] scsi 1:0:113:0: SSP: handle(0x0111), sas_addr(0x5000cca2525f2669), phy(50), device_name(0x5000cca2525f266b) [Mon Dec 9 06:17:23 2019][ 17.882562] scsi 1:0:113:0: enclosure logical id(0x5000ccab04037180), slot(59) [Mon Dec 9 06:17:23 2019][ 17.889869] scsi 1:0:113:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 17.896761] scsi 1:0:113:0: serial_number( 7SHPA6AW) [Mon Dec 9 06:17:23 2019][ 17.902335] scsi 1:0:113:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 17.925711] mpt3sas_cm0: expander_add: handle(0x00dd), parent(0x00da), sas_addr(0x5000ccab040371ff), phys(68) [Mon Dec 9 06:17:23 2019][ 17.946147] mpt3sas_cm0: detecting: handle(0x0112), sas_address(0x5000cca2525eacc1), phy(42) [Mon Dec 9 06:17:23 2019][ 17.954580] mpt3sas_cm0: REPORT_LUNS: handle(0x0112), retries(0) [Mon Dec 9 06:17:23 2019][ 17.960701] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0112), lun(0) [Mon Dec 9 06:17:23 2019][ 17.967324] scsi 1:0:114:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 17.975719] scsi 1:0:114:0: SSP: handle(0x0112), sas_addr(0x5000cca2525eacc1), phy(42), device_name(0x5000cca2525eacc3) [Mon Dec 9 06:17:23 2019][ 17.986488] scsi 1:0:114:0: enclosure logical id(0x5000ccab04037180), slot(1) [Mon Dec 9 06:17:23 2019][ 17.993708] scsi 1:0:114:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 18.000597] scsi 1:0:114:0: serial_number( 7SHP235W) [Mon Dec 9 06:17:23 2019][ 18.006172] scsi 1:0:114:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 
2019][ 18.026222] mpt3sas_cm0: detecting: handle(0x0113), sas_address(0x5000cca2525f8151), phy(43) [Mon Dec 9 06:17:23 2019][ 18.034662] mpt3sas_cm0: REPORT_LUNS: handle(0x0113), retries(0) [Mon Dec 9 06:17:23 2019][ 18.040793] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0113), lun(0) [Mon Dec 9 06:17:23 2019][ 18.047431] scsi 1:0:115:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 18.055824] scsi 1:0:115:0: SSP: handle(0x0113), sas_addr(0x5000cca2525f8151), phy(43), device_name(0x5000cca2525f8153) [Mon Dec 9 06:17:23 2019][ 18.066596] scsi 1:0:115:0: enclosure logical id(0x5000ccab04037180), slot(3) [Mon Dec 9 06:17:23 2019][ 18.073815] scsi 1:0:115:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 18.080711] scsi 1:0:115:0: serial_number( 7SHPJ80W) [Mon Dec 9 06:17:23 2019][ 18.086282] scsi 1:0:115:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 18.106812] mpt3sas_cm0: detecting: handle(0x0114), sas_address(0x5000cca2525ef839), phy(44) [Mon Dec 9 06:17:23 2019][ 18.115250] mpt3sas_cm0: REPORT_LUNS: handle(0x0114), retries(0) [Mon Dec 9 06:17:23 2019][ 18.121390] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0114), lun(0) [Mon Dec 9 06:17:23 2019][ 18.128000] scsi 1:0:116:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 18.136391] scsi 1:0:116:0: SSP: handle(0x0114), sas_addr(0x5000cca2525ef839), phy(44), device_name(0x5000cca2525ef83b) [Mon Dec 9 06:17:23 2019][ 18.147163] scsi 1:0:116:0: enclosure logical id(0x5000ccab04037180), slot(4) [Mon Dec 9 06:17:23 2019][ 18.154383] scsi 1:0:116:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 18.161277] scsi 1:0:116:0: serial_number( 7SHP73ZW) [Mon Dec 9 06:17:23 2019][ 18.166848] scsi 1:0:116:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 18.189227] mpt3sas_cm0: detecting: handle(0x0115), sas_address(0x5000cca2525e72a9), phy(45) [Mon Dec 9 06:17:23 2019][ 18.197662] mpt3sas_cm0: REPORT_LUNS: handle(0x0115), retries(0) [Mon Dec 9 06:17:23 2019][ 18.203835] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0115), lun(0) [Mon Dec 9 06:17:23 2019][ 18.210577] scsi 1:0:117:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 18.225366] scsi 1:0:117:0: SSP: handle(0x0115), sas_addr(0x5000cca2525e72a9), phy(45), device_name(0x5000cca2525e72ab) [Mon Dec 9 06:17:23 2019][ 18.236138] scsi 1:0:117:0: enclosure logical id(0x5000ccab04037180), slot(5) [Mon Dec 9 06:17:23 2019][ 18.243356] scsi 1:0:117:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 18.250250] scsi 1:0:117:0: serial_number( 7SHNY77W) [Mon Dec 9 06:17:23 2019][ 18.255824] scsi 1:0:117:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 18.278226] mpt3sas_cm0: detecting: handle(0x0116), sas_address(0x5000cca2525d3c89), phy(46) [Mon Dec 9 06:17:23 2019][ 18.286663] mpt3sas_cm0: REPORT_LUNS: handle(0x0116), retries(0) [Mon Dec 9 06:17:23 2019][ 18.292829] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0116), lun(0) [Mon Dec 9 06:17:23 2019][ 18.299537] scsi 1:0:118:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 18.319935] scsi 1:0:118:0: SSP: handle(0x0116), sas_addr(0x5000cca2525d3c89), phy(46), device_name(0x5000cca2525d3c8b) [Mon Dec 9 06:17:23 2019][ 18.330710] scsi 1:0:118:0: enclosure logical id(0x5000ccab04037180), slot(6) [Mon Dec 9 06:17:23 
2019][ 18.337930] scsi 1:0:118:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 18.344822] scsi 1:0:118:0: serial_number( 7SHN8KZW) [Mon Dec 9 06:17:23 2019][ 18.350395] scsi 1:0:118:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 18.370231] mpt3sas_cm0: detecting: handle(0x0117), sas_address(0x5000cca2525fae0d), phy(47) [Mon Dec 9 06:17:23 2019][ 18.378669] mpt3sas_cm0: REPORT_LUNS: handle(0x0117), retries(0) [Mon Dec 9 06:17:23 2019][ 18.384834] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0117), lun(0) [Mon Dec 9 06:17:23 2019][ 18.411287] scsi 1:0:119:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 18.419660] scsi 1:0:119:0: SSP: handle(0x0117), sas_addr(0x5000cca2525fae0d), phy(47), device_name(0x5000cca2525fae0f) [Mon Dec 9 06:17:23 2019][ 18.430432] scsi 1:0:119:0: enclosure logical id(0x5000ccab04037180), slot(7) [Mon Dec 9 06:17:23 2019][ 18.437652] scsi 1:0:119:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 18.444526] scsi 1:0:119:0: serial_number( 7SHPM7BW) [Mon Dec 9 06:17:23 2019][ 18.450099] scsi 1:0:119:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 18.470263] mpt3sas_cm0: detecting: handle(0x0118), sas_address(0x5000cca2525efdad), phy(48) [Mon Dec 9 06:17:23 2019][ 18.478701] mpt3sas_cm0: REPORT_LUNS: handle(0x0118), retries(0) [Mon Dec 9 06:17:23 2019][ 18.484863] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0118), lun(0) [Mon Dec 9 06:17:23 2019][ 18.491674] scsi 1:0:120:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 18.500073] scsi 1:0:120:0: SSP: handle(0x0118), sas_addr(0x5000cca2525efdad), phy(48), device_name(0x5000cca2525efdaf) [Mon Dec 9 06:17:23 2019][ 18.510845] scsi 1:0:120:0: enclosure logical id(0x5000ccab04037180), slot(8) [Mon Dec 9 06:17:23 2019][ 18.518062] scsi 1:0:120:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 18.524957] scsi 1:0:120:0: serial_number( 7SHP7H7W) [Mon Dec 9 06:17:23 2019][ 18.530539] scsi 1:0:120:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 18.554234] mpt3sas_cm0: detecting: handle(0x0119), sas_address(0x5000cca2525fa301), phy(49) [Mon Dec 9 06:17:23 2019][ 18.562674] mpt3sas_cm0: REPORT_LUNS: handle(0x0119), retries(0) [Mon Dec 9 06:17:23 2019][ 18.568840] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0119), lun(0) [Mon Dec 9 06:17:23 2019][ 18.575693] scsi 1:0:121:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 18.584095] scsi 1:0:121:0: SSP: handle(0x0119), sas_addr(0x5000cca2525fa301), phy(49), device_name(0x5000cca2525fa303) [Mon Dec 9 06:17:23 2019][ 18.594869] scsi 1:0:121:0: enclosure logical id(0x5000ccab04037180), slot(9) [Mon Dec 9 06:17:23 2019][ 18.602089] scsi 1:0:121:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 18.608978] scsi 1:0:121:0: serial_number( 7SHPLHKW) [Mon Dec 9 06:17:23 2019][ 18.614554] scsi 1:0:121:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 18.651238] mpt3sas_cm0: detecting: handle(0x011a), sas_address(0x5000cca2525fb4bd), phy(50) [Mon Dec 9 06:17:23 2019][ 18.659676] mpt3sas_cm0: REPORT_LUNS: handle(0x011a), retries(0) [Mon Dec 9 06:17:23 2019][ 18.665820] mpt3sas_cm0: TEST_UNIT_READY: handle(0x011a), lun(0) [Mon Dec 9 06:17:23 2019][ 18.750803] scsi 1:0:122:0: Direct-Access HGST 
HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 18.759178] scsi 1:0:122:0: SSP: handle(0x011a), sas_addr(0x5000cca2525fb4bd), phy(50), device_name(0x5000cca2525fb4bf) [Mon Dec 9 06:17:23 2019][ 18.769948] scsi 1:0:122:0: enclosure logical id(0x5000ccab04037180), slot(10) [Mon Dec 9 06:17:24 2019][ 18.777255] scsi 1:0:122:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:24 2019][ 18.784131] scsi 1:0:122:0: serial_number( 7SHPMP5W) [Mon Dec 9 06:17:24 2019][ 18.789703] scsi 1:0:122:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 18.815750] mpt3sas_cm0: expander_add: handle(0x0017), parent(0x0009), sas_addr(0x5000ccab0405db7d), phys(49) [Mon Dec 9 06:17:24 2019][ 18.836307] mpt3sas_cm0: detecting: handle(0x001b), sas_address(0x5000ccab0405db7c), phy(48) [Mon Dec 9 06:17:24 2019][ 18.844749] mpt3sas_cm0: REPORT_LUNS: handle(0x001b), retries(0) [Mon Dec 9 06:17:24 2019][ 18.851187] mpt3sas_cm0: TEST_UNIT_READY: handle(0x001b), lun(0) [Mon Dec 9 06:17:24 2019][ 18.858040] scsi 1:0:123:0: Enclosure HGST H4060-J 2033 PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 18.866618] scsi 1:0:123:0: set ignore_delay_remove for handle(0x001b) [Mon Dec 9 06:17:24 2019][ 18.873146] scsi 1:0:123:0: SES: handle(0x001b), sas_addr(0x5000ccab0405db7c), phy(48), device_name(0x0000000000000000) [Mon Dec 9 06:17:24 2019][ 18.883917] scsi 1:0:123:0: enclosure logical id(0x5000ccab0405db00), slot(60) [Mon Dec 9 06:17:24 2019][ 18.891224] scsi 1:0:123:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 18.898118] scsi 1:0:123:0: serial_number(USWSJ03918EZ0069 ) [Mon Dec 9 06:17:24 2019][ 18.904038] scsi 1:0:123:0: qdepth(1), tagged(0), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 18.928605] mpt3sas_cm0: expander_add: handle(0x0019), parent(0x0017), sas_addr(0x5000ccab0405db79), phys(68) [Mon Dec 9 06:17:24 2019][ 18.949726] mpt3sas_cm0: detecting: handle(0x001c), sas_address(0x5000cca252550a76), phy(0) [Mon Dec 9 06:17:24 2019][ 18.958079] mpt3sas_cm0: REPORT_LUNS: handle(0x001c), retries(0) [Mon Dec 9 06:17:24 2019][ 18.964939] mpt3sas_cm0: TEST_UNIT_READY: handle(0x001c), lun(0) [Mon Dec 9 06:17:24 2019][ 18.972854] scsi 1:0:124:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 18.981251] scsi 1:0:124:0: SSP: handle(0x001c), sas_addr(0x5000cca252550a76), phy(0), device_name(0x5000cca252550a77) [Mon Dec 9 06:17:24 2019][ 18.991933] scsi 1:0:124:0: enclosure logical id(0x5000ccab0405db00), slot(0) [Mon Dec 9 06:17:24 2019][ 18.999152] scsi 1:0:124:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.006043] scsi 1:0:124:0: serial_number( 7SHHSVGG) [Mon Dec 9 06:17:24 2019][ 19.011620] scsi 1:0:124:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.034230] mpt3sas_cm0: detecting: handle(0x001d), sas_address(0x5000cca25253eb32), phy(1) [Mon Dec 9 06:17:24 2019][ 19.042578] mpt3sas_cm0: REPORT_LUNS: handle(0x001d), retries(0) [Mon Dec 9 06:17:24 2019][ 19.048713] mpt3sas_cm0: TEST_UNIT_READY: handle(0x001d), lun(0) [Mon Dec 9 06:17:24 2019][ 19.055363] scsi 1:0:125:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 19.063753] scsi 1:0:125:0: SSP: handle(0x001d), sas_addr(0x5000cca25253eb32), phy(1), device_name(0x5000cca25253eb33) [Mon Dec 9 06:17:24 2019][ 19.074441] scsi 1:0:125:0: enclosure logical id(0x5000ccab0405db00), slot(2) [Mon Dec 9 
06:17:24 2019][ 19.081661] scsi 1:0:125:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.088554] scsi 1:0:125:0: serial_number( 7SHH4RDG) [Mon Dec 9 06:17:24 2019][ 19.094127] scsi 1:0:125:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.114232] mpt3sas_cm0: detecting: handle(0x001e), sas_address(0x5000cca26b950bb6), phy(2) [Mon Dec 9 06:17:24 2019][ 19.122582] mpt3sas_cm0: REPORT_LUNS: handle(0x001e), retries(0) [Mon Dec 9 06:17:24 2019][ 19.128741] mpt3sas_cm0: TEST_UNIT_READY: handle(0x001e), lun(0) [Mon Dec 9 06:17:24 2019][ 19.135446] scsi 1:0:126:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 19.146152] scsi 1:0:126:0: SSP: handle(0x001e), sas_addr(0x5000cca26b950bb6), phy(2), device_name(0x5000cca26b950bb7) [Mon Dec 9 06:17:24 2019][ 19.156839] scsi 1:0:126:0: enclosure logical id(0x5000ccab0405db00), slot(11) [Mon Dec 9 06:17:24 2019][ 19.164144] scsi 1:0:126:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.171038] scsi 1:0:126:0: serial_number( 1SJMZ22Z) [Mon Dec 9 06:17:24 2019][ 19.176611] scsi 1:0:126:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.199239] mpt3sas_cm0: detecting: handle(0x001f), sas_address(0x5000cca25253f3be), phy(3) [Mon Dec 9 06:17:24 2019][ 19.207591] mpt3sas_cm0: REPORT_LUNS: handle(0x001f), retries(0) [Mon Dec 9 06:17:24 2019][ 19.213727] mpt3sas_cm0: TEST_UNIT_READY: handle(0x001f), lun(0) [Mon Dec 9 06:17:24 2019][ 19.234740] scsi 1:0:127:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 19.243128] scsi 1:0:127:0: SSP: handle(0x001f), sas_addr(0x5000cca25253f3be), phy(3), device_name(0x5000cca25253f3bf) [Mon Dec 9 06:17:24 2019][ 19.253813] scsi 1:0:127:0: enclosure logical id(0x5000ccab0405db00), slot(12) [Mon Dec 9 06:17:24 2019][ 19.261117] scsi 1:0:127:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.267995] scsi 1:0:127:0: serial_number( 7SHH591G) [Mon Dec 9 06:17:24 2019][ 19.273567] scsi 1:0:127:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.312240] mpt3sas_cm0: detecting: handle(0x0020), sas_address(0x5000cca26a2ac3da), phy(4) [Mon Dec 9 06:17:24 2019][ 19.320593] mpt3sas_cm0: REPORT_LUNS: handle(0x0020), retries(0) [Mon Dec 9 06:17:24 2019][ 19.326724] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0020), lun(0) [Mon Dec 9 06:17:24 2019][ 19.333344] scsi 1:0:128:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 19.341734] scsi 1:0:128:0: SSP: handle(0x0020), sas_addr(0x5000cca26a2ac3da), phy(4), device_name(0x5000cca26a2ac3db) [Mon Dec 9 06:17:24 2019][ 19.352423] scsi 1:0:128:0: enclosure logical id(0x5000ccab0405db00), slot(13) [Mon Dec 9 06:17:24 2019][ 19.359730] scsi 1:0:128:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.366621] scsi 1:0:128:0: serial_number( 2TGSJ30D) [Mon Dec 9 06:17:24 2019][ 19.372194] scsi 1:0:128:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.395241] mpt3sas_cm0: detecting: handle(0x0021), sas_address(0x5000cca25254102a), phy(5) [Mon Dec 9 06:17:24 2019][ 19.403588] mpt3sas_cm0: REPORT_LUNS: handle(0x0021), retries(0) [Mon Dec 9 06:17:24 2019][ 19.409719] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0021), lun(0) [Mon Dec 9 06:17:24 2019][ 19.416340] scsi 1:0:129:0: Direct-Access 
HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 19.424743] scsi 1:0:129:0: SSP: handle(0x0021), sas_addr(0x5000cca25254102a), phy(5), device_name(0x5000cca25254102b) [Mon Dec 9 06:17:24 2019][ 19.435425] scsi 1:0:129:0: enclosure logical id(0x5000ccab0405db00), slot(14) [Mon Dec 9 06:17:24 2019][ 19.442731] scsi 1:0:129:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.449623] scsi 1:0:129:0: serial_number( 7SHH75RG) [Mon Dec 9 06:17:24 2019][ 19.455197] scsi 1:0:129:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.475280] mpt3sas_cm0: detecting: handle(0x0022), sas_address(0x5000cca25254534a), phy(6) [Mon Dec 9 06:17:24 2019][ 19.483627] mpt3sas_cm0: REPORT_LUNS: handle(0x0022), retries(0) [Mon Dec 9 06:17:24 2019][ 19.489787] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0022), lun(0) [Mon Dec 9 06:17:24 2019][ 19.496536] scsi 1:0:130:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 19.504934] scsi 1:0:130:0: SSP: handle(0x0022), sas_addr(0x5000cca25254534a), phy(6), device_name(0x5000cca25254534b) [Mon Dec 9 06:17:24 2019][ 19.515621] scsi 1:0:130:0: enclosure logical id(0x5000ccab0405db00), slot(15) [Mon Dec 9 06:17:24 2019][ 19.522928] scsi 1:0:130:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.529817] scsi 1:0:130:0: serial_number( 7SHHBN9G) [Mon Dec 9 06:17:24 2019][ 19.535394] scsi 1:0:130:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.555247] mpt3sas_cm0: detecting: handle(0x0023), sas_address(0x5000cca2525430c6), phy(7) [Mon Dec 9 06:17:24 2019][ 19.563598] mpt3sas_cm0: REPORT_LUNS: handle(0x0023), retries(0) [Mon Dec 9 06:17:24 2019][ 19.569746] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0023), lun(0) [Mon Dec 9 06:17:24 2019][ 19.595611] scsi 1:0:131:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 19.605795] scsi 1:0:131:0: SSP: handle(0x0023), sas_addr(0x5000cca2525430c6), phy(7), device_name(0x5000cca2525430c7) [Mon Dec 9 06:17:24 2019][ 19.616477] scsi 1:0:131:0: enclosure logical id(0x5000ccab0405db00), slot(16) [Mon Dec 9 06:17:24 2019][ 19.623784] scsi 1:0:131:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.630661] scsi 1:0:131:0: serial_number( 7SHH9B1G) [Mon Dec 9 06:17:24 2019][ 19.636232] scsi 1:0:131:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.683250] mpt3sas_cm0: detecting: handle(0x0024), sas_address(0x5000cca25254385e), phy(8) [Mon Dec 9 06:17:24 2019][ 19.691599] mpt3sas_cm0: REPORT_LUNS: handle(0x0024), retries(0) [Mon Dec 9 06:17:24 2019][ 19.697761] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0024), lun(0) [Mon Dec 9 06:17:24 2019][ 19.705623] scsi 1:0:132:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 19.716476] scsi 1:0:132:0: SSP: handle(0x0024), sas_addr(0x5000cca25254385e), phy(8), device_name(0x5000cca25254385f) [Mon Dec 9 06:17:24 2019][ 19.727164] scsi 1:0:132:0: enclosure logical id(0x5000ccab0405db00), slot(17) [Mon Dec 9 06:17:24 2019][ 19.734468] scsi 1:0:132:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.741362] scsi 1:0:132:0: serial_number( 7SHH9VRG) [Mon Dec 9 06:17:24 2019][ 19.746935] scsi 1:0:132:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.794295] mpt3sas_cm0: detecting: 
handle(0x0025), sas_address(0x5000cca25253f30e), phy(9) [Mon Dec 9 06:17:25 2019][ 19.802648] mpt3sas_cm0: REPORT_LUNS: handle(0x0025), retries(0) [Mon Dec 9 06:17:25 2019][ 19.808792] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0025), lun(0) [Mon Dec 9 06:17:25 2019][ 19.815553] scsi 1:0:133:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:25 2019][ 19.823953] scsi 1:0:133:0: SSP: handle(0x0025), sas_addr(0x5000cca25253f30e), phy(9), device_name(0x5000cca25253f30f) [Mon Dec 9 06:17:25 2019][ 19.834640] scsi 1:0:133:0: enclosure logical id(0x5000ccab0405db00), slot(18) [Mon Dec 9 06:17:25 2019][ 19.841948] scsi 1:0:133:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:25 2019][ 19.848839] scsi 1:0:133:0: serial_number( 7SHH57MG) [Mon Dec 9 06:17:25 2019][ 19.854412] scsi 1:0:133:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:25 2019][ 19.874253] mpt3sas_cm0: detecting: handle(0x0026), sas_address(0x5000cca252545f66), phy(10) [Mon Dec 9 06:17:25 2019][ 19.882694] mpt3sas_cm0: REPORT_LUNS: handle(0x0026), retries(0) [Mon Dec 9 06:17:25 2019][ 19.888831] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0026), lun(0) [Mon Dec 9 06:17:25 2019][ 19.895602] scsi 1:0:134:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:25 2019][ 19.904210] scsi 1:0:134:0: SSP: handle(0x0026), sas_addr(0x5000cca252545f66), phy(10), device_name(0x5000cca252545f67) [Mon Dec 9 06:17:25 2019][ 19.914981] scsi 1:0:134:0: enclosure logical id(0x5000ccab0405db00), slot(19) [Mon Dec 9 06:17:25 2019][ 19.922289] scsi 1:0:134:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:25 2019][ 19.929183] scsi 1:0:134:0: serial_number( 7SHHDG9G) [Mon Dec 9 06:17:25 2019][ 19.934755] scsi 1:0:134:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:25 2019][ 19.963256] mpt3sas_cm0: detecting: handle(0x0027), sas_address(0x5000cca266daa4e6), phy(11) [Mon Dec 9 06:17:25 2019][ 19.971695] mpt3sas_cm0: REPORT_LUNS: handle(0x0027), retries(0) [Mon Dec 9 06:17:25 2019][ 19.977839] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0027), lun(0) [Mon Dec 9 06:17:25 2019][ 19.984445] scsi 1:0:135:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:25 2019][ 19.992836] scsi 1:0:135:0: SSP: handle(0x0027), sas_addr(0x5000cca266daa4e6), phy(11), device_name(0x5000cca266daa4e7) [Mon Dec 9 06:17:25 2019][ 20.003610] scsi 1:0:135:0: enclosure logical id(0x5000ccab0405db00), slot(20) [Mon Dec 9 06:17:25 2019][ 20.010917] scsi 1:0:135:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:25 2019][ 20.017807] scsi 1:0:135:0: serial_number( 7JKW7MYK) [Mon Dec 9 06:17:25 2019][ 20.023381] scsi 1:0:135:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:25 2019][ 20.043254] mpt3sas_cm0: detecting: handle(0x0028), sas_address(0x5000cca26a25167e), phy(12) [Mon Dec 9 06:17:25 2019][ 20.051689] mpt3sas_cm0: REPORT_LUNS: handle(0x0028), retries(0) [Mon Dec 9 06:17:25 2019][ 20.057850] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0028), lun(0) [Mon Dec 9 06:17:25 2019][ 20.064457] scsi 1:0:136:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:25 2019][ 20.072850] scsi 1:0:136:0: SSP: handle(0x0028), sas_addr(0x5000cca26a25167e), phy(12), device_name(0x5000cca26a25167f) [Mon Dec 9 06:17:25 2019][ 20.083621] scsi 1:0:136:0: enclosure logical id(0x5000ccab0405db00), slot(21) [Mon Dec 9 06:17:25 2019][ 20.090928] scsi 1:0:136:0: 
enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:25 2019][ 20.097820] scsi 1:0:136:0: serial_number( 2TGND9JD) [Mon Dec 9 06:17:25 2019][ 20.103394] scsi 1:0:136:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:25 2019][ 20.123257] mpt3sas_cm0: detecting: handle(0x0029), sas_address(0x5000cca25253edaa), phy(13) [Mon Dec 9 06:17:25 2019][ 20.131694] mpt3sas_cm0: REPORT_LUNS: handle(0x0029), retries(0) [Mon Dec 9 06:17:25 2019][ 20.137823] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0029), lun(0) [Mon Dec 9 06:17:25 2019][ 20.144426] scsi 1:0:137:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:25 2019][ 20.152818] scsi 1:0:137:0: SSP: handle(0x0029), sas_addr(0x5000cca25253edaa), phy(13), device_name(0x5000cca25253edab) [Mon Dec 9 06:17:25 2019][ 20.163593] scsi 1:0:137:0: enclosure logical id(0x5000ccab0405db00), slot(22) [Mon Dec 9 06:17:25 2019][ 20.170899] scsi 1:0:137:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:25 2019][ 20.177788] scsi 1:0:137:0: serial_number( 7SHH4WHG) [Mon Dec 9 06:17:25 2019][ 20.183365] scsi 1:0:137:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:25 2019][ 20.204257] mpt3sas_cm0: detecting: handle(0x002a), sas_address(0x5000cca266d491a2), phy(14) [Mon Dec 9 06:17:25 2019][ 20.212694] mpt3sas_cm0: REPORT_LUNS: handle(0x002a), retries(0) [Mon Dec 9 06:17:25 2019][ 20.218822] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002a), lun(0) [Mon Dec 9 06:17:25 2019][ 20.225451] scsi 1:0:138:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:25 2019][ 20.233835] scsi 1:0:138:0: SSP: handle(0x002a), sas_addr(0x5000cca266d491a2), phy(14), device_name(0x5000cca266d491a3) [Mon Dec 9 06:17:25 2019][ 20.244609] scsi 1:0:138:0: enclosure logical id(0x5000ccab0405db00), slot(23) [Mon Dec 9 06:17:25 2019][ 20.251916] scsi 1:0:138:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:25 2019][ 20.258806] scsi 1:0:138:0: serial_number( 7JKSX22K) [Mon Dec 9 06:17:25 2019][ 20.264383] scsi 1:0:138:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:25 2019][ 20.284260] mpt3sas_cm0: detecting: handle(0x002b), sas_address(0x5000cca26b9a709a), phy(15) [Mon Dec 9 06:17:25 2019][ 20.292697] mpt3sas_cm0: REPORT_LUNS: handle(0x002b), retries(0) [Mon Dec 9 06:17:25 2019][ 20.298833] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002b), lun(0) [Mon Dec 9 06:17:25 2019][ 20.468617] scsi 1:0:139:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:25 2019][ 20.477001] scsi 1:0:139:0: SSP: handle(0x002b), sas_addr(0x5000cca26b9a709a), phy(15), device_name(0x5000cca26b9a709b) [Mon Dec 9 06:17:25 2019][ 20.487778] scsi 1:0:139:0: enclosure logical id(0x5000ccab0405db00), slot(24) [Mon Dec 9 06:17:25 2019][ 20.495082] scsi 1:0:139:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:25 2019][ 20.501974] scsi 1:0:139:0: serial_number( 1SJRY0YZ) [Mon Dec 9 06:17:25 2019][ 20.507547] scsi 1:0:139:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:25 2019][ 20.533272] mpt3sas_cm0: detecting: handle(0x002c), sas_address(0x5000cca25253f832), phy(16) [Mon Dec 9 06:17:25 2019][ 20.541707] mpt3sas_cm0: REPORT_LUNS: handle(0x002c), retries(0) [Mon Dec 9 06:17:25 2019][ 20.548005] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002c), lun(0) [Mon Dec 9 06:17:25 2019][ 20.728121] scsi 1:0:140:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 
ANSI: 6 [Mon Dec 9 06:17:25 2019][ 20.736500] scsi 1:0:140:0: SSP: handle(0x002c), sas_addr(0x5000cca25253f832), phy(16), device_name(0x5000cca25253f833) [Mon Dec 9 06:17:25 2019][ 20.747272] scsi 1:0:140:0: enclosure logical id(0x5000ccab0405db00), slot(25) [Mon Dec 9 06:17:25 2019][ 20.754578] scsi 1:0:140:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:25 2019][ 20.761455] scsi 1:0:140:0: serial_number( 7SHH5L7G) [Mon Dec 9 06:17:25 2019][ 20.767025] scsi 1:0:140:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 20.787273] mpt3sas_cm0: detecting: handle(0x002d), sas_address(0x5000cca26a2ab23e), phy(17) [Mon Dec 9 06:17:26 2019][ 20.795705] mpt3sas_cm0: REPORT_LUNS: handle(0x002d), retries(0) [Mon Dec 9 06:17:26 2019][ 20.801867] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002d), lun(0) [Mon Dec 9 06:17:26 2019][ 20.808471] scsi 1:0:141:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 20.816856] scsi 1:0:141:0: SSP: handle(0x002d), sas_addr(0x5000cca26a2ab23e), phy(17), device_name(0x5000cca26a2ab23f) [Mon Dec 9 06:17:26 2019][ 20.827632] scsi 1:0:141:0: enclosure logical id(0x5000ccab0405db00), slot(26) [Mon Dec 9 06:17:26 2019][ 20.834937] scsi 1:0:141:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 20.841828] scsi 1:0:141:0: serial_number( 2TGSGXND) [Mon Dec 9 06:17:26 2019][ 20.847404] scsi 1:0:141:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 20.867275] mpt3sas_cm0: detecting: handle(0x002e), sas_address(0x5000cca26b9b9696), phy(18) [Mon Dec 9 06:17:26 2019][ 20.875710] mpt3sas_cm0: REPORT_LUNS: handle(0x002e), retries(0) [Mon Dec 9 06:17:26 2019][ 20.881974] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002e), lun(0) [Mon Dec 9 06:17:26 2019][ 20.889131] scsi 1:0:142:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 20.897525] scsi 1:0:142:0: SSP: handle(0x002e), sas_addr(0x5000cca26b9b9696), phy(18), device_name(0x5000cca26b9b9697) [Mon Dec 9 06:17:26 2019][ 20.908294] scsi 1:0:142:0: enclosure logical id(0x5000ccab0405db00), slot(27) [Mon Dec 9 06:17:26 2019][ 20.915601] scsi 1:0:142:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 20.922492] scsi 1:0:142:0: serial_number( 1SJSKLWZ) [Mon Dec 9 06:17:26 2019][ 20.928065] scsi 1:0:142:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 20.948277] mpt3sas_cm0: detecting: handle(0x002f), sas_address(0x5000cca252559472), phy(19) [Mon Dec 9 06:17:26 2019][ 20.956713] mpt3sas_cm0: REPORT_LUNS: handle(0x002f), retries(0) [Mon Dec 9 06:17:26 2019][ 20.962861] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002f), lun(0) [Mon Dec 9 06:17:26 2019][ 20.975815] scsi 1:0:143:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 20.984197] scsi 1:0:143:0: SSP: handle(0x002f), sas_addr(0x5000cca252559472), phy(19), device_name(0x5000cca252559473) [Mon Dec 9 06:17:26 2019][ 20.994972] scsi 1:0:143:0: enclosure logical id(0x5000ccab0405db00), slot(28) [Mon Dec 9 06:17:26 2019][ 21.002277] scsi 1:0:143:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.009156] scsi 1:0:143:0: serial_number( 7SHJ21AG) [Mon Dec 9 06:17:26 2019][ 21.014735] scsi 1:0:143:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 21.046290] mpt3sas_cm0: detecting: handle(0x0030), 
sas_address(0x5000cca25253f94e), phy(20) [Mon Dec 9 06:17:26 2019][ 21.054727] mpt3sas_cm0: REPORT_LUNS: handle(0x0030), retries(0) [Mon Dec 9 06:17:26 2019][ 21.060874] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0030), lun(0) [Mon Dec 9 06:17:26 2019][ 21.067593] scsi 1:0:144:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 21.077404] scsi 1:0:144:0: SSP: handle(0x0030), sas_addr(0x5000cca25253f94e), phy(20), device_name(0x5000cca25253f94f) [Mon Dec 9 06:17:26 2019][ 21.088177] scsi 1:0:144:0: enclosure logical id(0x5000ccab0405db00), slot(29) [Mon Dec 9 06:17:26 2019][ 21.095481] scsi 1:0:144:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.102376] scsi 1:0:144:0: serial_number( 7SHH5NJG) [Mon Dec 9 06:17:26 2019][ 21.107947] scsi 1:0:144:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 21.138284] mpt3sas_cm0: detecting: handle(0x0031), sas_address(0x5000cca25253e69a), phy(21) [Mon Dec 9 06:17:26 2019][ 21.146726] mpt3sas_cm0: REPORT_LUNS: handle(0x0031), retries(0) [Mon Dec 9 06:17:26 2019][ 21.152865] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0031), lun(0) [Mon Dec 9 06:17:26 2019][ 21.162001] scsi 1:0:145:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 21.170392] scsi 1:0:145:0: SSP: handle(0x0031), sas_addr(0x5000cca25253e69a), phy(21), device_name(0x5000cca25253e69b) [Mon Dec 9 06:17:26 2019][ 21.181164] scsi 1:0:145:0: enclosure logical id(0x5000ccab0405db00), slot(30) [Mon Dec 9 06:17:26 2019][ 21.188468] scsi 1:0:145:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.195359] scsi 1:0:145:0: serial_number( 7SHH4DXG) [Mon Dec 9 06:17:26 2019][ 21.200935] scsi 1:0:145:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 21.231883] mpt3sas_cm0: detecting: handle(0x0032), sas_address(0x5000cca252543cc2), phy(22) [Mon Dec 9 06:17:26 2019][ 21.240318] mpt3sas_cm0: REPORT_LUNS: handle(0x0032), retries(0) [Mon Dec 9 06:17:26 2019][ 21.246449] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0032), lun(0) [Mon Dec 9 06:17:26 2019][ 21.253048] scsi 1:0:146:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 21.261436] scsi 1:0:146:0: SSP: handle(0x0032), sas_addr(0x5000cca252543cc2), phy(22), device_name(0x5000cca252543cc3) [Mon Dec 9 06:17:26 2019][ 21.272208] scsi 1:0:146:0: enclosure logical id(0x5000ccab0405db00), slot(31) [Mon Dec 9 06:17:26 2019][ 21.279514] scsi 1:0:146:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.286404] scsi 1:0:146:0: serial_number( 7SHHA4TG) [Mon Dec 9 06:17:26 2019][ 21.291980] scsi 1:0:146:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 21.322287] mpt3sas_cm0: detecting: handle(0x0033), sas_address(0x5000cca26a24fcde), phy(23) [Mon Dec 9 06:17:26 2019][ 21.330737] mpt3sas_cm0: REPORT_LUNS: handle(0x0033), retries(0) [Mon Dec 9 06:17:26 2019][ 21.336859] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0033), lun(0) [Mon Dec 9 06:17:26 2019][ 21.343466] scsi 1:0:147:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 21.351852] scsi 1:0:147:0: SSP: handle(0x0033), sas_addr(0x5000cca26a24fcde), phy(23), device_name(0x5000cca26a24fcdf) [Mon Dec 9 06:17:26 2019][ 21.362622] scsi 1:0:147:0: enclosure logical id(0x5000ccab0405db00), slot(32) [Mon Dec 9 06:17:26 2019][ 21.369927] scsi 1:0:147:0: enclosure 
level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.376819] scsi 1:0:147:0: serial_number( 2TGNALMD) [Mon Dec 9 06:17:26 2019][ 21.382393] scsi 1:0:147:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 21.402285] mpt3sas_cm0: detecting: handle(0x0034), sas_address(0x5000cca252543bce), phy(24) [Mon Dec 9 06:17:26 2019][ 21.410725] mpt3sas_cm0: REPORT_LUNS: handle(0x0034), retries(0) [Mon Dec 9 06:17:26 2019][ 21.416860] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0034), lun(0) [Mon Dec 9 06:17:26 2019][ 21.423461] scsi 1:0:148:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 21.431850] scsi 1:0:148:0: SSP: handle(0x0034), sas_addr(0x5000cca252543bce), phy(24), device_name(0x5000cca252543bcf) [Mon Dec 9 06:17:26 2019][ 21.442625] scsi 1:0:148:0: enclosure logical id(0x5000ccab0405db00), slot(33) [Mon Dec 9 06:17:26 2019][ 21.449932] scsi 1:0:148:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.456822] scsi 1:0:148:0: serial_number( 7SHHA2UG) [Mon Dec 9 06:17:26 2019][ 21.462396] scsi 1:0:148:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 21.482289] mpt3sas_cm0: detecting: handle(0x0035), sas_address(0x5000cca252551266), phy(25) [Mon Dec 9 06:17:26 2019][ 21.490723] mpt3sas_cm0: REPORT_LUNS: handle(0x0035), retries(0) [Mon Dec 9 06:17:26 2019][ 21.496861] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0035), lun(0) [Mon Dec 9 06:17:26 2019][ 21.505027] scsi 1:0:149:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 21.513524] scsi 1:0:149:0: SSP: handle(0x0035), sas_addr(0x5000cca252551266), phy(25), device_name(0x5000cca252551267) [Mon Dec 9 06:17:26 2019][ 21.524294] scsi 1:0:149:0: enclosure logical id(0x5000ccab0405db00), slot(34) [Mon Dec 9 06:17:26 2019][ 21.531599] scsi 1:0:149:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.538491] scsi 1:0:149:0: serial_number( 7SHHTBVG) [Mon Dec 9 06:17:26 2019][ 21.544064] scsi 1:0:149:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 21.572298] mpt3sas_cm0: detecting: handle(0x0036), sas_address(0x5000cca252555fca), phy(26) [Mon Dec 9 06:17:26 2019][ 21.580737] mpt3sas_cm0: REPORT_LUNS: handle(0x0036), retries(0) [Mon Dec 9 06:17:26 2019][ 21.586878] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0036), lun(0) [Mon Dec 9 06:17:26 2019][ 21.620949] scsi 1:0:150:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 21.629344] scsi 1:0:150:0: SSP: handle(0x0036), sas_addr(0x5000cca252555fca), phy(26), device_name(0x5000cca252555fcb) [Mon Dec 9 06:17:26 2019][ 21.640116] scsi 1:0:150:0: enclosure logical id(0x5000ccab0405db00), slot(35) [Mon Dec 9 06:17:26 2019][ 21.647423] scsi 1:0:150:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.654315] scsi 1:0:150:0: serial_number( 7SHHYJMG) [Mon Dec 9 06:17:26 2019][ 21.659889] scsi 1:0:150:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 21.698909] mpt3sas_cm0: detecting: handle(0x0037), sas_address(0x5000cca252559f7e), phy(27) [Mon Dec 9 06:17:26 2019][ 21.707342] mpt3sas_cm0: REPORT_LUNS: handle(0x0037), retries(0) [Mon Dec 9 06:17:26 2019][ 21.713482] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0037), lun(0) [Mon Dec 9 06:17:26 2019][ 21.720293] scsi 1:0:151:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon 
Dec 9 06:17:26 2019][ 21.733357] scsi 1:0:151:0: SSP: handle(0x0037), sas_addr(0x5000cca252559f7e), phy(27), device_name(0x5000cca252559f7f) [Mon Dec 9 06:17:26 2019][ 21.744130] scsi 1:0:151:0: enclosure logical id(0x5000ccab0405db00), slot(36) [Mon Dec 9 06:17:26 2019][ 21.751434] scsi 1:0:151:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.758326] scsi 1:0:151:0: serial_number( 7SHJ2T4G) [Mon Dec 9 06:17:26 2019][ 21.763899] scsi 1:0:151:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 21.786332] mpt3sas_cm0: detecting: handle(0x0038), sas_address(0x5000cca26c244bce), phy(28) [Mon Dec 9 06:17:27 2019][ 21.794772] mpt3sas_cm0: REPORT_LUNS: handle(0x0038), retries(0) [Mon Dec 9 06:17:27 2019][ 21.800904] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0038), lun(0) [Mon Dec 9 06:17:27 2019][ 21.807711] scsi 1:0:152:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 21.816098] scsi 1:0:152:0: SSP: handle(0x0038), sas_addr(0x5000cca26c244bce), phy(28), device_name(0x5000cca26c244bcf) [Mon Dec 9 06:17:27 2019][ 21.826870] scsi 1:0:152:0: enclosure logical id(0x5000ccab0405db00), slot(37) [Mon Dec 9 06:17:27 2019][ 21.834177] scsi 1:0:152:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 21.841068] scsi 1:0:152:0: serial_number( 1DGMYU2Z) [Mon Dec 9 06:17:27 2019][ 21.846644] scsi 1:0:152:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 21.867296] mpt3sas_cm0: detecting: handle(0x0039), sas_address(0x5000cca26a2aa10e), phy(29) [Mon Dec 9 06:17:27 2019][ 21.875731] mpt3sas_cm0: REPORT_LUNS: handle(0x0039), retries(0) [Mon Dec 9 06:17:27 2019][ 21.881865] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0039), lun(0) [Mon Dec 9 06:17:27 2019][ 21.888654] scsi 1:0:153:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 21.897047] scsi 1:0:153:0: SSP: handle(0x0039), sas_addr(0x5000cca26a2aa10e), phy(29), device_name(0x5000cca26a2aa10f) [Mon Dec 9 06:17:27 2019][ 21.907819] scsi 1:0:153:0: enclosure logical id(0x5000ccab0405db00), slot(38) [Mon Dec 9 06:17:27 2019][ 21.915126] scsi 1:0:153:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 21.922020] scsi 1:0:153:0: serial_number( 2TGSET5D) [Mon Dec 9 06:17:27 2019][ 21.927591] scsi 1:0:153:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 21.950304] mpt3sas_cm0: detecting: handle(0x003a), sas_address(0x5000cca25254e236), phy(30) [Mon Dec 9 06:17:27 2019][ 21.958743] mpt3sas_cm0: REPORT_LUNS: handle(0x003a), retries(0) [Mon Dec 9 06:17:27 2019][ 21.964884] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003a), lun(0) [Mon Dec 9 06:17:27 2019][ 21.982927] scsi 1:0:154:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 21.991500] scsi 1:0:154:0: SSP: handle(0x003a), sas_addr(0x5000cca25254e236), phy(30), device_name(0x5000cca25254e237) [Mon Dec 9 06:17:27 2019][ 22.002273] scsi 1:0:154:0: enclosure logical id(0x5000ccab0405db00), slot(39) [Mon Dec 9 06:17:27 2019][ 22.009577] scsi 1:0:154:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 22.016471] scsi 1:0:154:0: serial_number( 7SHHP5BG) [Mon Dec 9 06:17:27 2019][ 22.022043] scsi 1:0:154:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 22.069313] mpt3sas_cm0: detecting: handle(0x003b), 
sas_address(0x5000cca25254df96), phy(31) [Mon Dec 9 06:17:27 2019][ 22.077749] mpt3sas_cm0: REPORT_LUNS: handle(0x003b), retries(0) [Mon Dec 9 06:17:27 2019][ 22.084709] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003b), lun(0) [Mon Dec 9 06:17:27 2019][ 22.109363] scsi 1:0:155:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 22.117809] scsi 1:0:155:0: SSP: handle(0x003b), sas_addr(0x5000cca25254df96), phy(31), device_name(0x5000cca25254df97) [Mon Dec 9 06:17:27 2019][ 22.128584] scsi 1:0:155:0: enclosure logical id(0x5000ccab0405db00), slot(40) [Mon Dec 9 06:17:27 2019][ 22.135889] scsi 1:0:155:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 22.142780] scsi 1:0:155:0: serial_number( 7SHHNZYG) [Mon Dec 9 06:17:27 2019][ 22.148353] scsi 1:0:155:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 22.183319] mpt3sas_cm0: detecting: handle(0x003c), sas_address(0x5000cca25254e9d2), phy(32) [Mon Dec 9 06:17:27 2019][ 22.191760] mpt3sas_cm0: REPORT_LUNS: handle(0x003c), retries(0) [Mon Dec 9 06:17:27 2019][ 22.197889] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003c), lun(0) [Mon Dec 9 06:17:27 2019][ 22.239975] scsi 1:0:156:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 22.248362] scsi 1:0:156:0: SSP: handle(0x003c), sas_addr(0x5000cca25254e9d2), phy(32), device_name(0x5000cca25254e9d3) [Mon Dec 9 06:17:27 2019][ 22.259132] scsi 1:0:156:0: enclosure logical id(0x5000ccab0405db00), slot(41) [Mon Dec 9 06:17:27 2019][ 22.266440] scsi 1:0:156:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 22.273330] scsi 1:0:156:0: serial_number( 7SHHPP2G) [Mon Dec 9 06:17:27 2019][ 22.278903] scsi 1:0:156:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 22.400322] mpt3sas_cm0: detecting: handle(0x003d), sas_address(0x5000cca26a24008a), phy(33) [Mon Dec 9 06:17:27 2019][ 22.408763] mpt3sas_cm0: REPORT_LUNS: handle(0x003d), retries(0) [Mon Dec 9 06:17:27 2019][ 22.414903] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003d), lun(0) [Mon Dec 9 06:17:27 2019][ 22.421525] scsi 1:0:157:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 22.429905] scsi 1:0:157:0: SSP: handle(0x003d), sas_addr(0x5000cca26a24008a), phy(33), device_name(0x5000cca26a24008b) [Mon Dec 9 06:17:27 2019][ 22.440679] scsi 1:0:157:0: enclosure logical id(0x5000ccab0405db00), slot(42) [Mon Dec 9 06:17:27 2019][ 22.447983] scsi 1:0:157:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 22.454877] scsi 1:0:157:0: serial_number( 2TGMTTPD) [Mon Dec 9 06:17:27 2019][ 22.460450] scsi 1:0:157:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 22.483317] mpt3sas_cm0: detecting: handle(0x003e), sas_address(0x5000cca26a24b9ea), phy(34) [Mon Dec 9 06:17:27 2019][ 22.491758] mpt3sas_cm0: REPORT_LUNS: handle(0x003e), retries(0) [Mon Dec 9 06:17:27 2019][ 22.497891] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003e), lun(0) [Mon Dec 9 06:17:27 2019][ 22.506984] scsi 1:0:158:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 22.515363] scsi 1:0:158:0: SSP: handle(0x003e), sas_addr(0x5000cca26a24b9ea), phy(34), device_name(0x5000cca26a24b9eb) [Mon Dec 9 06:17:27 2019][ 22.526134] scsi 1:0:158:0: enclosure logical id(0x5000ccab0405db00), slot(43) [Mon Dec 9 06:17:27 2019][ 22.533438] scsi 1:0:158:0: enclosure 
level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 22.540331] scsi 1:0:158:0: serial_number( 2TGN64DD) [Mon Dec 9 06:17:27 2019][ 22.545905] scsi 1:0:158:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 22.572373] mpt3sas_cm0: detecting: handle(0x003f), sas_address(0x5000cca26a25aed6), phy(35) [Mon Dec 9 06:17:27 2019][ 22.580812] mpt3sas_cm0: REPORT_LUNS: handle(0x003f), retries(0) [Mon Dec 9 06:17:27 2019][ 22.586962] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003f), lun(0) [Mon Dec 9 06:17:27 2019][ 22.593574] scsi 1:0:159:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 22.601988] scsi 1:0:159:0: SSP: handle(0x003f), sas_addr(0x5000cca26a25aed6), phy(35), device_name(0x5000cca26a25aed7) [Mon Dec 9 06:17:27 2019][ 22.612759] scsi 1:0:159:0: enclosure logical id(0x5000ccab0405db00), slot(44) [Mon Dec 9 06:17:27 2019][ 22.620066] scsi 1:0:159:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 22.626956] scsi 1:0:159:0: serial_number( 2TGNRG1D) [Mon Dec 9 06:17:27 2019][ 22.632530] scsi 1:0:159:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 22.652320] mpt3sas_cm0: detecting: handle(0x0040), sas_address(0x5000cca266d32b6a), phy(36) [Mon Dec 9 06:17:27 2019][ 22.660761] mpt3sas_cm0: REPORT_LUNS: handle(0x0040), retries(0) [Mon Dec 9 06:17:27 2019][ 22.666902] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0040), lun(0) [Mon Dec 9 06:17:27 2019][ 22.674761] scsi 1:0:160:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 22.683145] scsi 1:0:160:0: SSP: handle(0x0040), sas_addr(0x5000cca266d32b6a), phy(36), device_name(0x5000cca266d32b6b) [Mon Dec 9 06:17:27 2019][ 22.693916] scsi 1:0:160:0: enclosure logical id(0x5000ccab0405db00), slot(45) [Mon Dec 9 06:17:27 2019][ 22.701222] scsi 1:0:160:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 22.708113] scsi 1:0:160:0: serial_number( 7JKS46JK) [Mon Dec 9 06:17:27 2019][ 22.713687] scsi 1:0:160:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 22.736429] mpt3sas_cm0: detecting: handle(0x0041), sas_address(0x5000cca26b9bf886), phy(37) [Mon Dec 9 06:17:27 2019][ 22.744866] mpt3sas_cm0: REPORT_LUNS: handle(0x0041), retries(0) [Mon Dec 9 06:17:27 2019][ 22.751007] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0041), lun(0) [Mon Dec 9 06:17:27 2019][ 22.757858] scsi 1:0:161:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 22.766243] scsi 1:0:161:0: SSP: handle(0x0041), sas_addr(0x5000cca26b9bf886), phy(37), device_name(0x5000cca26b9bf887) [Mon Dec 9 06:17:27 2019][ 22.777013] scsi 1:0:161:0: enclosure logical id(0x5000ccab0405db00), slot(46) [Mon Dec 9 06:17:27 2019][ 22.784318] scsi 1:0:161:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 22.791211] scsi 1:0:161:0: serial_number( 1SJST42Z) [Mon Dec 9 06:17:28 2019][ 22.796785] scsi 1:0:161:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 22.819333] mpt3sas_cm0: detecting: handle(0x0042), sas_address(0x5000cca26b9b24ca), phy(38) [Mon Dec 9 06:17:28 2019][ 22.827773] mpt3sas_cm0: REPORT_LUNS: handle(0x0042), retries(0) [Mon Dec 9 06:17:28 2019][ 22.833946] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0042), lun(0) [Mon Dec 9 06:17:28 2019][ 22.840765] scsi 1:0:162:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon 
Dec 9 06:17:28 2019][ 22.849150] scsi 1:0:162:0: SSP: handle(0x0042), sas_addr(0x5000cca26b9b24ca), phy(38), device_name(0x5000cca26b9b24cb) [Mon Dec 9 06:17:28 2019][ 22.859921] scsi 1:0:162:0: enclosure logical id(0x5000ccab0405db00), slot(47) [Mon Dec 9 06:17:28 2019][ 22.867225] scsi 1:0:162:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 22.874119] scsi 1:0:162:0: serial_number( 1SJSA0YZ) [Mon Dec 9 06:17:28 2019][ 22.879693] scsi 1:0:162:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 22.902325] mpt3sas_cm0: detecting: handle(0x0043), sas_address(0x5000cca26a21d742), phy(39) [Mon Dec 9 06:17:28 2019][ 22.910768] mpt3sas_cm0: REPORT_LUNS: handle(0x0043), retries(0) [Mon Dec 9 06:17:28 2019][ 22.916934] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0043), lun(0) [Mon Dec 9 06:17:28 2019][ 22.923724] scsi 1:0:163:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 22.932114] scsi 1:0:163:0: SSP: handle(0x0043), sas_addr(0x5000cca26a21d742), phy(39), device_name(0x5000cca26a21d743) [Mon Dec 9 06:17:28 2019][ 22.942890] scsi 1:0:163:0: enclosure logical id(0x5000ccab0405db00), slot(48) [Mon Dec 9 06:17:28 2019][ 22.950194] scsi 1:0:163:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 22.957087] scsi 1:0:163:0: serial_number( 2TGLLYED) [Mon Dec 9 06:17:28 2019][ 22.962661] scsi 1:0:163:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 22.985328] mpt3sas_cm0: detecting: handle(0x0044), sas_address(0x5000cca26a27af5e), phy(40) [Mon Dec 9 06:17:28 2019][ 22.993771] mpt3sas_cm0: REPORT_LUNS: handle(0x0044), retries(0) [Mon Dec 9 06:17:28 2019][ 22.999938] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0044), lun(0) [Mon Dec 9 06:17:28 2019][ 23.006708] scsi 1:0:164:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 23.015105] scsi 1:0:164:0: SSP: handle(0x0044), sas_addr(0x5000cca26a27af5e), phy(40), device_name(0x5000cca26a27af5f) [Mon Dec 9 06:17:28 2019][ 23.025875] scsi 1:0:164:0: enclosure logical id(0x5000ccab0405db00), slot(49) [Mon Dec 9 06:17:28 2019][ 23.033179] scsi 1:0:164:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 23.040072] scsi 1:0:164:0: serial_number( 2TGPUL5D) [Mon Dec 9 06:17:28 2019][ 23.045645] scsi 1:0:164:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 23.068333] mpt3sas_cm0: detecting: handle(0x0045), sas_address(0x5000cca2525552e6), phy(41) [Mon Dec 9 06:17:28 2019][ 23.076773] mpt3sas_cm0: REPORT_LUNS: handle(0x0045), retries(0) [Mon Dec 9 06:17:28 2019][ 23.082912] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0045), lun(0) [Mon Dec 9 06:17:28 2019][ 23.142655] scsi 1:0:165:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 23.151033] scsi 1:0:165:0: SSP: handle(0x0045), sas_addr(0x5000cca2525552e6), phy(41), device_name(0x5000cca2525552e7) [Mon Dec 9 06:17:28 2019][ 23.161806] scsi 1:0:165:0: enclosure logical id(0x5000ccab0405db00), slot(50) [Mon Dec 9 06:17:28 2019][ 23.169111] scsi 1:0:165:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 23.176004] scsi 1:0:165:0: serial_number( 7SHHXP0G) [Mon Dec 9 06:17:28 2019][ 23.181578] scsi 1:0:165:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 23.227333] mpt3sas_cm0: detecting: handle(0x0046), 
sas_address(0x5000cca26a26dff2), phy(42) [Mon Dec 9 06:17:28 2019][ 23.235774] mpt3sas_cm0: REPORT_LUNS: handle(0x0046), retries(0) [Mon Dec 9 06:17:28 2019][ 23.241931] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0046), lun(0) [Mon Dec 9 06:17:28 2019][ 23.248548] scsi 1:0:166:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 23.256933] scsi 1:0:166:0: SSP: handle(0x0046), sas_addr(0x5000cca26a26dff2), phy(42), device_name(0x5000cca26a26dff3) [Mon Dec 9 06:17:28 2019][ 23.267707] scsi 1:0:166:0: enclosure logical id(0x5000ccab0405db00), slot(51) [Mon Dec 9 06:17:28 2019][ 23.275011] scsi 1:0:166:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 23.281905] scsi 1:0:166:0: serial_number( 2TGPBSYD) [Mon Dec 9 06:17:28 2019][ 23.287479] scsi 1:0:166:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 23.307329] mpt3sas_cm0: detecting: handle(0x0047), sas_address(0x5000cca26b9c5d52), phy(43) [Mon Dec 9 06:17:28 2019][ 23.315772] mpt3sas_cm0: REPORT_LUNS: handle(0x0047), retries(0) [Mon Dec 9 06:17:28 2019][ 23.321910] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0047), lun(0) [Mon Dec 9 06:17:28 2019][ 23.328636] scsi 1:0:167:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 23.337013] scsi 1:0:167:0: SSP: handle(0x0047), sas_addr(0x5000cca26b9c5d52), phy(43), device_name(0x5000cca26b9c5d53) [Mon Dec 9 06:17:28 2019][ 23.347789] scsi 1:0:167:0: enclosure logical id(0x5000ccab0405db00), slot(52) [Mon Dec 9 06:17:28 2019][ 23.355093] scsi 1:0:167:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 23.361987] scsi 1:0:167:0: serial_number( 1SJSZV5Z) [Mon Dec 9 06:17:28 2019][ 23.367561] scsi 1:0:167:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 23.390963] mpt3sas_cm0: detecting: handle(0x0048), sas_address(0x5000cca26b9602c6), phy(44) [Mon Dec 9 06:17:28 2019][ 23.399405] mpt3sas_cm0: REPORT_LUNS: handle(0x0048), retries(0) [Mon Dec 9 06:17:28 2019][ 23.405550] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0048), lun(0) [Mon Dec 9 06:17:28 2019][ 23.421839] scsi 1:0:168:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 23.430254] scsi 1:0:168:0: SSP: handle(0x0048), sas_addr(0x5000cca26b9602c6), phy(44), device_name(0x5000cca26b9602c7) [Mon Dec 9 06:17:28 2019][ 23.441026] scsi 1:0:168:0: enclosure logical id(0x5000ccab0405db00), slot(53) [Mon Dec 9 06:17:28 2019][ 23.448333] scsi 1:0:168:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 23.455227] scsi 1:0:168:0: serial_number( 1SJNHJ4Z) [Mon Dec 9 06:17:28 2019][ 23.460798] scsi 1:0:168:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 23.483345] mpt3sas_cm0: detecting: handle(0x0049), sas_address(0x5000cca252544a02), phy(45) [Mon Dec 9 06:17:28 2019][ 23.491788] mpt3sas_cm0: REPORT_LUNS: handle(0x0049), retries(0) [Mon Dec 9 06:17:28 2019][ 23.497955] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0049), lun(0) [Mon Dec 9 06:17:28 2019][ 23.521496] scsi 1:0:169:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 23.529884] scsi 1:0:169:0: SSP: handle(0x0049), sas_addr(0x5000cca252544a02), phy(45), device_name(0x5000cca252544a03) [Mon Dec 9 06:17:28 2019][ 23.540660] scsi 1:0:169:0: enclosure logical id(0x5000ccab0405db00), slot(54) [Mon Dec 9 06:17:28 2019][ 23.547968] scsi 1:0:169:0: enclosure 
level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 23.554862] scsi 1:0:169:0: serial_number( 7SHHB14G) [Mon Dec 9 06:17:28 2019][ 23.560434] scsi 1:0:169:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 23.580344] mpt3sas_cm0: detecting: handle(0x004a), sas_address(0x5000cca252559f9e), phy(46) [Mon Dec 9 06:17:28 2019][ 23.588787] mpt3sas_cm0: REPORT_LUNS: handle(0x004a), retries(0) [Mon Dec 9 06:17:28 2019][ 23.594926] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004a), lun(0) [Mon Dec 9 06:17:28 2019][ 23.609780] scsi 1:0:170:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 23.618175] scsi 1:0:170:0: SSP: handle(0x004a), sas_addr(0x5000cca252559f9e), phy(46), device_name(0x5000cca252559f9f) [Mon Dec 9 06:17:28 2019][ 23.628950] scsi 1:0:170:0: enclosure logical id(0x5000ccab0405db00), slot(55) [Mon Dec 9 06:17:28 2019][ 23.636257] scsi 1:0:170:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 23.643151] scsi 1:0:170:0: serial_number( 7SHJ2TDG) [Mon Dec 9 06:17:28 2019][ 23.648723] scsi 1:0:170:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 23.678375] mpt3sas_cm0: detecting: handle(0x004b), sas_address(0x5000cca25255571e), phy(47) [Mon Dec 9 06:17:28 2019][ 23.686816] mpt3sas_cm0: REPORT_LUNS: handle(0x004b), retries(0) [Mon Dec 9 06:17:28 2019][ 23.692956] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004b), lun(0) [Mon Dec 9 06:17:28 2019][ 23.707569] scsi 1:0:171:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 23.715945] scsi 1:0:171:0: SSP: handle(0x004b), sas_addr(0x5000cca25255571e), phy(47), device_name(0x5000cca25255571f) [Mon Dec 9 06:17:28 2019][ 23.726723] scsi 1:0:171:0: enclosure logical id(0x5000ccab0405db00), slot(56) [Mon Dec 9 06:17:28 2019][ 23.734028] scsi 1:0:171:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 23.740921] scsi 1:0:171:0: serial_number( 7SHHXYRG) [Mon Dec 9 06:17:28 2019][ 23.746494] scsi 1:0:171:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 23.766347] mpt3sas_cm0: detecting: handle(0x004c), sas_address(0x5000cca26b9bf57e), phy(48) [Mon Dec 9 06:17:28 2019][ 23.774786] mpt3sas_cm0: REPORT_LUNS: handle(0x004c), retries(0) [Mon Dec 9 06:17:29 2019][ 23.780923] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004c), lun(0) [Mon Dec 9 06:17:29 2019][ 23.805687] scsi 1:0:172:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:29 2019][ 23.814070] scsi 1:0:172:0: SSP: handle(0x004c), sas_addr(0x5000cca26b9bf57e), phy(48), device_name(0x5000cca26b9bf57f) [Mon Dec 9 06:17:29 2019][ 23.824840] scsi 1:0:172:0: enclosure logical id(0x5000ccab0405db00), slot(57) [Mon Dec 9 06:17:29 2019][ 23.832147] scsi 1:0:172:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:29 2019][ 23.839039] scsi 1:0:172:0: serial_number( 1SJSSXUZ) [Mon Dec 9 06:17:29 2019][ 23.844613] scsi 1:0:172:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:29 2019][ 23.867347] mpt3sas_cm0: detecting: handle(0x004d), sas_address(0x5000cca252555372), phy(49) [Mon Dec 9 06:17:29 2019][ 23.875788] mpt3sas_cm0: REPORT_LUNS: handle(0x004d), retries(0) [Mon Dec 9 06:17:29 2019][ 23.881952] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004d), lun(0) [Mon Dec 9 06:17:29 2019][ 23.888580] scsi 1:0:173:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon 
Dec 9 06:17:29 2019][ 23.896952] scsi 1:0:173:0: SSP: handle(0x004d), sas_addr(0x5000cca252555372), phy(49), device_name(0x5000cca252555373) [Mon Dec 9 06:17:29 2019][ 23.907722] scsi 1:0:173:0: enclosure logical id(0x5000ccab0405db00), slot(58) [Mon Dec 9 06:17:29 2019][ 23.915026] scsi 1:0:173:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:29 2019][ 23.921920] scsi 1:0:173:0: serial_number( 7SHHXR4G) [Mon Dec 9 06:17:29 2019][ 23.927494] scsi 1:0:173:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:29 2019][ 23.947349] mpt3sas_cm0: detecting: handle(0x004e), sas_address(0x5000cca25253eefe), phy(50) [Mon Dec 9 06:17:29 2019][ 23.955786] mpt3sas_cm0: REPORT_LUNS: handle(0x004e), retries(0) [Mon Dec 9 06:17:29 2019][ 23.961956] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004e), lun(0) [Mon Dec 9 06:17:29 2019][ 23.968592] scsi 1:0:174:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:29 2019][ 23.976965] scsi 1:0:174:0: SSP: handle(0x004e), sas_addr(0x5000cca25253eefe), phy(50), device_name(0x5000cca25253eeff) [Mon Dec 9 06:17:29 2019][ 23.987734] scsi 1:0:174:0: enclosure logical id(0x5000ccab0405db00), slot(59) [Mon Dec 9 06:17:29 2019][ 23.995039] scsi 1:0:174:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:29 2019][ 24.001933] scsi 1:0:174:0: serial_number( 7SHH4Z7G) [Mon Dec 9 06:17:29 2019][ 24.007506] scsi 1:0:174:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:29 2019][ 24.029845] mpt3sas_cm0: expander_add: handle(0x001a), parent(0x0017), sas_addr(0x5000ccab0405db7b), phys(68) [Mon Dec 9 06:17:29 2019][ 24.050568] mpt3sas_cm0: detecting: handle(0x004f), sas_address(0x5000cca26b9cbb06), phy(42) [Mon Dec 9 06:17:29 2019][ 24.059024] mpt3sas_cm0: REPORT_LUNS: handle(0x004f), retries(0) [Mon Dec 9 06:17:29 2019][ 24.065149] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004f), lun(0) [Mon Dec 9 06:17:29 2019][ 24.071795] scsi 1:0:175:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:29 2019][ 24.080191] scsi 1:0:175:0: SSP: handle(0x004f), sas_addr(0x5000cca26b9cbb06), phy(42), device_name(0x5000cca26b9cbb07) [Mon Dec 9 06:17:29 2019][ 24.090967] scsi 1:0:175:0: enclosure logical id(0x5000ccab0405db00), slot(1) [Mon Dec 9 06:17:29 2019][ 24.098184] scsi 1:0:175:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:29 2019][ 24.105078] scsi 1:0:175:0: serial_number( 1SJT62MZ) [Mon Dec 9 06:17:29 2019][ 24.110649] scsi 1:0:175:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:29 2019][ 24.147349] mpt3sas_cm0: detecting: handle(0x0050), sas_address(0x5000cca252544476), phy(43) [Mon Dec 9 06:17:29 2019][ 24.155805] mpt3sas_cm0: REPORT_LUNS: handle(0x0050), retries(0) [Mon Dec 9 06:17:29 2019][ 24.161950] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0050), lun(0) [Mon Dec 9 06:17:29 2019][ 24.168678] scsi 1:0:176:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:29 2019][ 24.184932] scsi 1:0:176:0: SSP: handle(0x0050), sas_addr(0x5000cca252544476), phy(43), device_name(0x5000cca252544477) [Mon Dec 9 06:17:29 2019][ 24.195704] scsi 1:0:176:0: enclosure logical id(0x5000ccab0405db00), slot(3) [Mon Dec 9 06:17:29 2019][ 24.202924] scsi 1:0:176:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:29 2019][ 24.209815] scsi 1:0:176:0: serial_number( 7SHHANPG) [Mon Dec 9 06:17:29 2019][ 24.215389] scsi 1:0:176:0: qdepth(254), tagged(1), simple(0), 
ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:29 2019][ 24.235948] mpt3sas_cm0: detecting: handle(0x0051), sas_address(0x5000cca26a26173e), phy(44) [Mon Dec 9 06:17:29 2019][ 24.244384] mpt3sas_cm0: REPORT_LUNS: handle(0x0051), retries(0) [Mon Dec 9 06:17:29 2019][ 24.250543] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0051), lun(0) [Mon Dec 9 06:17:29 2019][ 24.275589] scsi 1:0:177:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:29 2019][ 24.283977] scsi 1:0:177:0: SSP: handle(0x0051), sas_addr(0x5000cca26a26173e), phy(44), device_name(0x5000cca26a26173f) [Mon Dec 9 06:17:29 2019][ 24.294751] scsi 1:0:177:0: enclosure logical id(0x5000ccab0405db00), slot(4) [Mon Dec 9 06:17:29 2019][ 24.301968] scsi 1:0:177:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:29 2019][ 24.308861] scsi 1:0:177:0: serial_number( 2TGNYDLD) [Mon Dec 9 06:17:29 2019][ 24.314435] scsi 1:0:177:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:29 2019][ 24.334358] mpt3sas_cm0: detecting: handle(0x0052), sas_address(0x5000cca252544cb6), phy(45) [Mon Dec 9 06:17:29 2019][ 24.342796] mpt3sas_cm0: REPORT_LUNS: handle(0x0052), retries(0) [Mon Dec 9 06:17:29 2019][ 24.348954] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0052), lun(0) [Mon Dec 9 06:17:29 2019][ 24.355768] scsi 1:0:178:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:29 2019][ 24.364145] scsi 1:0:178:0: SSP: handle(0x0052), sas_addr(0x5000cca252544cb6), phy(45), device_name(0x5000cca252544cb7) [Mon Dec 9 06:17:29 2019][ 24.374919] scsi 1:0:178:0: enclosure logical id(0x5000ccab0405db00), slot(5) [Mon Dec 9 06:17:29 2019][ 24.382136] scsi 1:0:178:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:29 2019][ 24.389029] scsi 1:0:178:0: serial_number( 7SHHB6RG) [Mon Dec 9 06:17:29 2019][ 24.394602] scsi 1:0:178:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:29 2019][ 24.446367] mpt3sas_cm0: detecting: handle(0x0053), sas_address(0x5000cca26c238692), phy(46) [Mon Dec 9 06:17:29 2019][ 24.454805] mpt3sas_cm0: REPORT_LUNS: handle(0x0053), retries(0) [Mon Dec 9 06:17:29 2019][ 24.460952] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0053), lun(0) [Mon Dec 9 06:17:29 2019][ 24.706352] scsi 1:0:179:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:29 2019][ 24.714726] scsi 1:0:179:0: SSP: handle(0x0053), sas_addr(0x5000cca26c238692), phy(46), device_name(0x5000cca26c238693) [Mon Dec 9 06:17:29 2019][ 24.725502] scsi 1:0:179:0: enclosure logical id(0x5000ccab0405db00), slot(6) [Mon Dec 9 06:17:29 2019][ 24.732722] scsi 1:0:179:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:29 2019][ 24.739615] scsi 1:0:179:0: serial_number( 1DGMJNWZ) [Mon Dec 9 06:17:29 2019][ 24.745188] scsi 1:0:179:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:30 2019][ 24.931367] mpt3sas_cm0: detecting: handle(0x0054), sas_address(0x5000cca26a2ac96a), phy(47) [Mon Dec 9 06:17:30 2019][ 24.939820] mpt3sas_cm0: REPORT_LUNS: handle(0x0054), retries(0) [Mon Dec 9 06:17:30 2019][ 24.945945] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0054), lun(0) [Mon Dec 9 06:17:30 2019][ 24.999733] scsi 1:0:180:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:30 2019][ 25.008148] scsi 1:0:180:0: SSP: handle(0x0054), sas_addr(0x5000cca26a2ac96a), phy(47), device_name(0x5000cca26a2ac96b) [Mon Dec 9 06:17:30 2019][ 25.018921] scsi 1:0:180:0: enclosure 
logical id(0x5000ccab0405db00), slot(7) [Mon Dec 9 06:17:30 2019][ 25.026138] scsi 1:0:180:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:30 2019][ 25.033030] scsi 1:0:180:0: serial_number( 2TGSJGHD) [Mon Dec 9 06:17:30 2019][ 25.038603] scsi 1:0:180:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:30 2019][ 25.155379] mpt3sas_cm0: detecting: handle(0x0055), sas_address(0x5000cca25253e61a), phy(48) [Mon Dec 9 06:17:30 2019][ 25.163819] mpt3sas_cm0: REPORT_LUNS: handle(0x0055), retries(0) [Mon Dec 9 06:17:30 2019][ 25.169993] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0055), lun(0) [Mon Dec 9 06:17:30 2019][ 25.176816] scsi 1:0:181:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:30 2019][ 25.185205] scsi 1:0:181:0: SSP: handle(0x0055), sas_addr(0x5000cca25253e61a), phy(48), device_name(0x5000cca25253e61b) [Mon Dec 9 06:17:30 2019][ 25.195976] scsi 1:0:181:0: enclosure logical id(0x5000ccab0405db00), slot(8) [Mon Dec 9 06:17:30 2019][ 25.203195] scsi 1:0:181:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:30 2019][ 25.210094] scsi 1:0:181:0: serial_number( 7SHH4BWG) [Mon Dec 9 06:17:30 2019][ 25.215670] scsi 1:0:181:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:30 2019][ 25.238382] mpt3sas_cm0: detecting: handle(0x0056), sas_address(0x5000cca252542cfe), phy(49) [Mon Dec 9 06:17:30 2019][ 25.246821] mpt3sas_cm0: REPORT_LUNS: handle(0x0056), retries(0) [Mon Dec 9 06:17:30 2019][ 25.252993] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0056), lun(0) [Mon Dec 9 06:17:30 2019][ 25.259851] scsi 1:0:182:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:30 2019][ 25.268284] scsi 1:0:182:0: SSP: handle(0x0056), sas_addr(0x5000cca252542cfe), phy(49), device_name(0x5000cca252542cff) [Mon Dec 9 06:17:30 2019][ 25.279056] scsi 1:0:182:0: enclosure logical id(0x5000ccab0405db00), slot(9) [Mon Dec 9 06:17:30 2019][ 25.286276] scsi 1:0:182:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:30 2019][ 25.293168] scsi 1:0:182:0: serial_number( 7SHH937G) [Mon Dec 9 06:17:30 2019][ 25.298744] scsi 1:0:182:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:30 2019][ 25.321384] mpt3sas_cm0: detecting: handle(0x0057), sas_address(0x5000cca26a3181fe), phy(50) [Mon Dec 9 06:17:30 2019][ 25.329825] mpt3sas_cm0: REPORT_LUNS: handle(0x0057), retries(0) [Mon Dec 9 06:17:30 2019][ 25.335991] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0057), lun(0) [Mon Dec 9 06:17:30 2019][ 25.342817] scsi 1:0:183:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:30 2019][ 25.351209] scsi 1:0:183:0: SSP: handle(0x0057), sas_addr(0x5000cca26a3181fe), phy(50), device_name(0x5000cca26a3181ff) [Mon Dec 9 06:17:30 2019][ 25.361981] scsi 1:0:183:0: enclosure logical id(0x5000ccab0405db00), slot(10) [Mon Dec 9 06:17:30 2019][ 25.369287] scsi 1:0:183:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:30 2019][ 25.376181] scsi 1:0:183:0: serial_number( 2TGW71ND) [Mon Dec 9 06:17:30 2019][ 25.381753] scsi 1:0:183:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:30 2019][ 25.406892] mpt3sas_cm0: expander_add: handle(0x0099), parent(0x000a), sas_addr(0x5000ccab0405db3d), phys(49) [Mon Dec 9 06:17:30 2019][ 25.428327] mpt3sas_cm0: detecting: handle(0x009d), sas_address(0x5000ccab0405db3c), phy(48) [Mon Dec 9 06:17:30 2019][ 25.436763] mpt3sas_cm0: 
REPORT_LUNS: handle(0x009d), retries(0) [Mon Dec 9 06:17:30 2019][ 25.444026] mpt3sas_cm0: TEST_UNIT_READY: handle(0x009d), lun(0) [Mon Dec 9 06:17:30 2019][ 25.450935] scsi 1:0:184:0: Enclosure HGST H4060-J 2033 PQ: 0 ANSI: 6 [Mon Dec 9 06:17:30 2019][ 25.459522] scsi 1:0:184:0: set ignore_delay_remove for handle(0x009d) [Mon Dec 9 06:17:30 2019][ 25.466046] scsi 1:0:184:0: SES: handle(0x009d), sas_addr(0x5000ccab0405db3c), phy(48), device_name(0x0000000000000000) [Mon Dec 9 06:17:30 2019][ 25.476817] scsi 1:0:184:0: enclosure logical id(0x5000ccab0405db00), slot(60) [Mon Dec 9 06:17:30 2019][ 25.484122] scsi 1:0:184:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:30 2019][ 25.491014] scsi 1:0:184:0: serial_number(USWSJ03918EZ0069 ) [Mon Dec 9 06:17:30 2019][ 25.496936] scsi 1:0:184:0: qdepth(1), tagged(0), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:30 2019][ 25.521756] mpt3sas_cm0: expander_add: handle(0x009b), parent(0x0099), sas_addr(0x5000ccab0405db3f), phys(68) [Mon Dec 9 06:17:30 2019][ 25.543977] mpt3sas_cm0: detecting: handle(0x009e), sas_address(0x5000cca252550a75), phy(0) [Mon Dec 9 06:17:30 2019][ 25.552334] mpt3sas_cm0: REPORT_LUNS: handle(0x009e), retries(0) [Mon Dec 9 06:17:30 2019][ 25.558455] mpt3sas_cm0: TEST_UNIT_READY: handle(0x009e), lun(0) [Mon Dec 9 06:17:30 2019][ 25.565084] scsi 1:0:185:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:30 2019][ 25.573487] scsi 1:0:185:0: SSP: handle(0x009e), sas_addr(0x5000cca252550a75), phy(0), device_name(0x5000cca252550a77) [Mon Dec 9 06:17:30 2019][ 25.584174] scsi 1:0:185:0: enclosure logical id(0x5000ccab0405db00), slot(0) [Mon Dec 9 06:17:30 2019][ 25.591393] scsi 1:0:185:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:30 2019][ 25.598287] scsi 1:0:185:0: serial_number( 7SHHSVGG) [Mon Dec 9 06:17:30 2019][ 25.603860] scsi 1:0:185:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:30 2019][ 25.626393] mpt3sas_cm0: detecting: handle(0x009f), sas_address(0x5000cca25253eb31), phy(1) [Mon Dec 9 06:17:30 2019][ 25.634742] mpt3sas_cm0: REPORT_LUNS: handle(0x009f), retries(0) [Mon Dec 9 06:17:30 2019][ 25.640883] mpt3sas_cm0: TEST_UNIT_READY: handle(0x009f), lun(0) [Mon Dec 9 06:17:30 2019][ 25.647533] scsi 1:0:186:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:30 2019][ 25.655910] scsi 1:0:186:0: SSP: handle(0x009f), sas_addr(0x5000cca25253eb31), phy(1), device_name(0x5000cca25253eb33) [Mon Dec 9 06:17:30 2019][ 25.666596] scsi 1:0:186:0: enclosure logical id(0x5000ccab0405db00), slot(2) [Mon Dec 9 06:17:30 2019][ 25.673816] scsi 1:0:186:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:30 2019][ 25.680709] scsi 1:0:186:0: serial_number( 7SHH4RDG) [Mon Dec 9 06:17:30 2019][ 25.686280] scsi 1:0:186:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:30 2019][ 25.706392] mpt3sas_cm0: detecting: handle(0x00a0), sas_address(0x5000cca26b950bb5), phy(2) [Mon Dec 9 06:17:30 2019][ 25.714746] mpt3sas_cm0: REPORT_LUNS: handle(0x00a0), retries(0) [Mon Dec 9 06:17:30 2019][ 25.720892] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a0), lun(0) [Mon Dec 9 06:17:30 2019][ 25.764588] scsi 1:0:187:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:30 2019][ 25.772971] scsi 1:0:187:0: SSP: handle(0x00a0), sas_addr(0x5000cca26b950bb5), phy(2), device_name(0x5000cca26b950bb7) [Mon Dec 9 06:17:30 2019][ 25.783659] scsi 
1:0:187:0: enclosure logical id(0x5000ccab0405db00), slot(11) [Mon Dec 9 06:17:31 2019][ 25.790967] scsi 1:0:187:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 25.797860] scsi 1:0:187:0: serial_number( 1SJMZ22Z) [Mon Dec 9 06:17:31 2019][ 25.803431] scsi 1:0:187:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 25.832397] mpt3sas_cm0: detecting: handle(0x00a1), sas_address(0x5000cca25253f3bd), phy(3) [Mon Dec 9 06:17:31 2019][ 25.840745] mpt3sas_cm0: REPORT_LUNS: handle(0x00a1), retries(0) [Mon Dec 9 06:17:31 2019][ 25.846885] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a1), lun(0) [Mon Dec 9 06:17:31 2019][ 25.853530] scsi 1:0:188:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 25.861915] scsi 1:0:188:0: SSP: handle(0x00a1), sas_addr(0x5000cca25253f3bd), phy(3), device_name(0x5000cca25253f3bf) [Mon Dec 9 06:17:31 2019][ 25.872598] scsi 1:0:188:0: enclosure logical id(0x5000ccab0405db00), slot(12) [Mon Dec 9 06:17:31 2019][ 25.879905] scsi 1:0:188:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 25.886798] scsi 1:0:188:0: serial_number( 7SHH591G) [Mon Dec 9 06:17:31 2019][ 25.892372] scsi 1:0:188:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 25.912398] mpt3sas_cm0: detecting: handle(0x00a2), sas_address(0x5000cca26a2ac3d9), phy(4) [Mon Dec 9 06:17:31 2019][ 25.920749] mpt3sas_cm0: REPORT_LUNS: handle(0x00a2), retries(0) [Mon Dec 9 06:17:31 2019][ 25.926922] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a2), lun(0) [Mon Dec 9 06:17:31 2019][ 25.933712] scsi 1:0:189:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 25.942091] scsi 1:0:189:0: SSP: handle(0x00a2), sas_addr(0x5000cca26a2ac3d9), phy(4), device_name(0x5000cca26a2ac3db) [Mon Dec 9 06:17:31 2019][ 25.952776] scsi 1:0:189:0: enclosure logical id(0x5000ccab0405db00), slot(13) [Mon Dec 9 06:17:31 2019][ 25.960084] scsi 1:0:189:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 25.966974] scsi 1:0:189:0: serial_number( 2TGSJ30D) [Mon Dec 9 06:17:31 2019][ 25.972548] scsi 1:0:189:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 25.992400] mpt3sas_cm0: detecting: handle(0x00a3), sas_address(0x5000cca252541029), phy(5) [Mon Dec 9 06:17:31 2019][ 26.000753] mpt3sas_cm0: REPORT_LUNS: handle(0x00a3), retries(0) [Mon Dec 9 06:17:31 2019][ 26.006920] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a3), lun(0) [Mon Dec 9 06:17:31 2019][ 26.013633] scsi 1:0:190:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.022009] scsi 1:0:190:0: SSP: handle(0x00a3), sas_addr(0x5000cca252541029), phy(5), device_name(0x5000cca25254102b) [Mon Dec 9 06:17:31 2019][ 26.032695] scsi 1:0:190:0: enclosure logical id(0x5000ccab0405db00), slot(14) [Mon Dec 9 06:17:31 2019][ 26.040000] scsi 1:0:190:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.046892] scsi 1:0:190:0: serial_number( 7SHH75RG) [Mon Dec 9 06:17:31 2019][ 26.052465] scsi 1:0:190:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.072401] mpt3sas_cm0: detecting: handle(0x00a4), sas_address(0x5000cca252545349), phy(6) [Mon Dec 9 06:17:31 2019][ 26.080749] mpt3sas_cm0: REPORT_LUNS: handle(0x00a4), retries(0) [Mon Dec 9 06:17:31 2019][ 26.086889] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a4), 
lun(0) [Mon Dec 9 06:17:31 2019][ 26.093539] scsi 1:0:191:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.101914] scsi 1:0:191:0: SSP: handle(0x00a4), sas_addr(0x5000cca252545349), phy(6), device_name(0x5000cca25254534b) [Mon Dec 9 06:17:31 2019][ 26.112602] scsi 1:0:191:0: enclosure logical id(0x5000ccab0405db00), slot(15) [Mon Dec 9 06:17:31 2019][ 26.119909] scsi 1:0:191:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.126801] scsi 1:0:191:0: serial_number( 7SHHBN9G) [Mon Dec 9 06:17:31 2019][ 26.132376] scsi 1:0:191:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.152404] mpt3sas_cm0: detecting: handle(0x00a5), sas_address(0x5000cca2525430c5), phy(7) [Mon Dec 9 06:17:31 2019][ 26.160752] mpt3sas_cm0: REPORT_LUNS: handle(0x00a5), retries(0) [Mon Dec 9 06:17:31 2019][ 26.167075] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a5), lun(0) [Mon Dec 9 06:17:31 2019][ 26.178924] scsi 1:0:192:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.187308] scsi 1:0:192:0: SSP: handle(0x00a5), sas_addr(0x5000cca2525430c5), phy(7), device_name(0x5000cca2525430c7) [Mon Dec 9 06:17:31 2019][ 26.197997] scsi 1:0:192:0: enclosure logical id(0x5000ccab0405db00), slot(16) [Mon Dec 9 06:17:31 2019][ 26.205304] scsi 1:0:192:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.212197] scsi 1:0:192:0: serial_number( 7SHH9B1G) [Mon Dec 9 06:17:31 2019][ 26.217770] scsi 1:0:192:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.240414] mpt3sas_cm0: detecting: handle(0x00a6), sas_address(0x5000cca25254385d), phy(8) [Mon Dec 9 06:17:31 2019][ 26.248764] mpt3sas_cm0: REPORT_LUNS: handle(0x00a6), retries(0) [Mon Dec 9 06:17:31 2019][ 26.254897] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a6), lun(0) [Mon Dec 9 06:17:31 2019][ 26.266038] scsi 1:0:193:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.274419] scsi 1:0:193:0: SSP: handle(0x00a6), sas_addr(0x5000cca25254385d), phy(8), device_name(0x5000cca25254385f) [Mon Dec 9 06:17:31 2019][ 26.285107] scsi 1:0:193:0: enclosure logical id(0x5000ccab0405db00), slot(17) [Mon Dec 9 06:17:31 2019][ 26.292414] scsi 1:0:193:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.299306] scsi 1:0:193:0: serial_number( 7SHH9VRG) [Mon Dec 9 06:17:31 2019][ 26.304879] scsi 1:0:193:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.333425] mpt3sas_cm0: detecting: handle(0x00a7), sas_address(0x5000cca25253f30d), phy(9) [Mon Dec 9 06:17:31 2019][ 26.341777] mpt3sas_cm0: REPORT_LUNS: handle(0x00a7), retries(0) [Mon Dec 9 06:17:31 2019][ 26.347919] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a7), lun(0) [Mon Dec 9 06:17:31 2019][ 26.354730] scsi 1:0:194:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.363122] scsi 1:0:194:0: SSP: handle(0x00a7), sas_addr(0x5000cca25253f30d), phy(9), device_name(0x5000cca25253f30f) [Mon Dec 9 06:17:31 2019][ 26.373805] scsi 1:0:194:0: enclosure logical id(0x5000ccab0405db00), slot(18) [Mon Dec 9 06:17:31 2019][ 26.381110] scsi 1:0:194:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.388004] scsi 1:0:194:0: serial_number( 7SHH57MG) [Mon Dec 9 06:17:31 2019][ 26.393578] scsi 1:0:194:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), 
cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.413444] mpt3sas_cm0: detecting: handle(0x00a8), sas_address(0x5000cca252545f65), phy(10) [Mon Dec 9 06:17:31 2019][ 26.421884] mpt3sas_cm0: REPORT_LUNS: handle(0x00a8), retries(0) [Mon Dec 9 06:17:31 2019][ 26.428044] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a8), lun(0) [Mon Dec 9 06:17:31 2019][ 26.434872] scsi 1:0:195:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.443275] scsi 1:0:195:0: SSP: handle(0x00a8), sas_addr(0x5000cca252545f65), phy(10), device_name(0x5000cca252545f67) [Mon Dec 9 06:17:31 2019][ 26.454044] scsi 1:0:195:0: enclosure logical id(0x5000ccab0405db00), slot(19) [Mon Dec 9 06:17:31 2019][ 26.461350] scsi 1:0:195:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.468240] scsi 1:0:195:0: serial_number( 7SHHDG9G) [Mon Dec 9 06:17:31 2019][ 26.473815] scsi 1:0:195:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.496412] mpt3sas_cm0: detecting: handle(0x00a9), sas_address(0x5000cca266daa4e5), phy(11) [Mon Dec 9 06:17:31 2019][ 26.504852] mpt3sas_cm0: REPORT_LUNS: handle(0x00a9), retries(0) [Mon Dec 9 06:17:31 2019][ 26.510986] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a9), lun(0) [Mon Dec 9 06:17:31 2019][ 26.525513] scsi 1:0:196:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.533900] scsi 1:0:196:0: SSP: handle(0x00a9), sas_addr(0x5000cca266daa4e5), phy(11), device_name(0x5000cca266daa4e7) [Mon Dec 9 06:17:31 2019][ 26.544673] scsi 1:0:196:0: enclosure logical id(0x5000ccab0405db00), slot(20) [Mon Dec 9 06:17:31 2019][ 26.551978] scsi 1:0:196:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.558872] scsi 1:0:196:0: serial_number( 7JKW7MYK) [Mon Dec 9 06:17:31 2019][ 26.564450] scsi 1:0:196:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.594415] mpt3sas_cm0: detecting: handle(0x00aa), sas_address(0x5000cca26a25167d), phy(12) [Mon Dec 9 06:17:31 2019][ 26.602849] mpt3sas_cm0: REPORT_LUNS: handle(0x00aa), retries(0) [Mon Dec 9 06:17:31 2019][ 26.609007] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00aa), lun(0) [Mon Dec 9 06:17:31 2019][ 26.615817] scsi 1:0:197:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.624208] scsi 1:0:197:0: SSP: handle(0x00aa), sas_addr(0x5000cca26a25167d), phy(12), device_name(0x5000cca26a25167f) [Mon Dec 9 06:17:31 2019][ 26.634981] scsi 1:0:197:0: enclosure logical id(0x5000ccab0405db00), slot(21) [Mon Dec 9 06:17:31 2019][ 26.642288] scsi 1:0:197:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.649182] scsi 1:0:197:0: serial_number( 2TGND9JD) [Mon Dec 9 06:17:31 2019][ 26.654752] scsi 1:0:197:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.677415] mpt3sas_cm0: detecting: handle(0x00ab), sas_address(0x5000cca25253eda9), phy(13) [Mon Dec 9 06:17:31 2019][ 26.685852] mpt3sas_cm0: REPORT_LUNS: handle(0x00ab), retries(0) [Mon Dec 9 06:17:31 2019][ 26.691985] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ab), lun(0) [Mon Dec 9 06:17:31 2019][ 26.701296] scsi 1:0:198:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.709679] scsi 1:0:198:0: SSP: handle(0x00ab), sas_addr(0x5000cca25253eda9), phy(13), device_name(0x5000cca25253edab) [Mon Dec 9 06:17:31 2019][ 26.720456] scsi 1:0:198:0: enclosure logical 
id(0x5000ccab0405db00), slot(22) [Mon Dec 9 06:17:31 2019][ 26.727760] scsi 1:0:198:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.734652] scsi 1:0:198:0: serial_number( 7SHH4WHG) [Mon Dec 9 06:17:31 2019][ 26.740225] scsi 1:0:198:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.763416] mpt3sas_cm0: detecting: handle(0x00ac), sas_address(0x5000cca266d491a1), phy(14) [Mon Dec 9 06:17:31 2019][ 26.771853] mpt3sas_cm0: REPORT_LUNS: handle(0x00ac), retries(0) [Mon Dec 9 06:17:31 2019][ 26.777986] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ac), lun(0) [Mon Dec 9 06:17:31 2019][ 26.784642] scsi 1:0:199:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 26.793028] scsi 1:0:199:0: SSP: handle(0x00ac), sas_addr(0x5000cca266d491a1), phy(14), device_name(0x5000cca266d491a3) [Mon Dec 9 06:17:32 2019][ 26.803805] scsi 1:0:199:0: enclosure logical id(0x5000ccab0405db00), slot(23) [Mon Dec 9 06:17:32 2019][ 26.811109] scsi 1:0:199:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 26.818001] scsi 1:0:199:0: serial_number( 7JKSX22K) [Mon Dec 9 06:17:32 2019][ 26.823577] scsi 1:0:199:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 26.843419] mpt3sas_cm0: detecting: handle(0x00ad), sas_address(0x5000cca26b9a7099), phy(15) [Mon Dec 9 06:17:32 2019][ 26.851857] mpt3sas_cm0: REPORT_LUNS: handle(0x00ad), retries(0) [Mon Dec 9 06:17:32 2019][ 26.858031] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ad), lun(0) [Mon Dec 9 06:17:32 2019][ 26.874100] scsi 1:0:200:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 26.882478] scsi 1:0:200:0: SSP: handle(0x00ad), sas_addr(0x5000cca26b9a7099), phy(15), device_name(0x5000cca26b9a709b) [Mon Dec 9 06:17:32 2019][ 26.893254] scsi 1:0:200:0: enclosure logical id(0x5000ccab0405db00), slot(24) [Mon Dec 9 06:17:32 2019][ 26.900561] scsi 1:0:200:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 26.907454] scsi 1:0:200:0: serial_number( 1SJRY0YZ) [Mon Dec 9 06:17:32 2019][ 26.913026] scsi 1:0:200:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 26.935424] mpt3sas_cm0: detecting: handle(0x00ae), sas_address(0x5000cca25253f831), phy(16) [Mon Dec 9 06:17:32 2019][ 26.943866] mpt3sas_cm0: REPORT_LUNS: handle(0x00ae), retries(0) [Mon Dec 9 06:17:32 2019][ 26.957487] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ae), lun(0) [Mon Dec 9 06:17:32 2019][ 26.965975] scsi 1:0:201:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 26.974368] scsi 1:0:201:0: SSP: handle(0x00ae), sas_addr(0x5000cca25253f831), phy(16), device_name(0x5000cca25253f833) [Mon Dec 9 06:17:32 2019][ 26.985141] scsi 1:0:201:0: enclosure logical id(0x5000ccab0405db00), slot(25) [Mon Dec 9 06:17:32 2019][ 26.992448] scsi 1:0:201:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 26.999341] scsi 1:0:201:0: serial_number( 7SHH5L7G) [Mon Dec 9 06:17:32 2019][ 27.004912] scsi 1:0:201:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 27.035435] mpt3sas_cm0: detecting: handle(0x00af), sas_address(0x5000cca26a2ab23d), phy(17) [Mon Dec 9 06:17:32 2019][ 27.043873] mpt3sas_cm0: REPORT_LUNS: handle(0x00af), retries(0) [Mon Dec 9 06:17:32 2019][ 27.050033] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00af), lun(0) [Mon Dec 9 
06:17:32 2019][ 27.056681] scsi 1:0:202:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 27.065071] scsi 1:0:202:0: SSP: handle(0x00af), sas_addr(0x5000cca26a2ab23d), phy(17), device_name(0x5000cca26a2ab23f) [Mon Dec 9 06:17:32 2019][ 27.075839] scsi 1:0:202:0: enclosure logical id(0x5000ccab0405db00), slot(26) [Mon Dec 9 06:17:32 2019][ 27.083146] scsi 1:0:202:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 27.090037] scsi 1:0:202:0: serial_number( 2TGSGXND) [Mon Dec 9 06:17:32 2019][ 27.095615] scsi 1:0:202:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 27.116475] mpt3sas_cm0: detecting: handle(0x00b0), sas_address(0x5000cca26b9b9695), phy(18) [Mon Dec 9 06:17:32 2019][ 27.124907] mpt3sas_cm0: REPORT_LUNS: handle(0x00b0), retries(0) [Mon Dec 9 06:17:32 2019][ 27.131074] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b0), lun(0) [Mon Dec 9 06:17:32 2019][ 27.137769] scsi 1:0:203:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 27.146201] scsi 1:0:203:0: SSP: handle(0x00b0), sas_addr(0x5000cca26b9b9695), phy(18), device_name(0x5000cca26b9b9697) [Mon Dec 9 06:17:32 2019][ 27.156970] scsi 1:0:203:0: enclosure logical id(0x5000ccab0405db00), slot(27) [Mon Dec 9 06:17:32 2019][ 27.164278] scsi 1:0:203:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 27.171169] scsi 1:0:203:0: serial_number( 1SJSKLWZ) [Mon Dec 9 06:17:32 2019][ 27.176743] scsi 1:0:203:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 27.197430] mpt3sas_cm0: detecting: handle(0x00b1), sas_address(0x5000cca252559471), phy(19) [Mon Dec 9 06:17:32 2019][ 27.205864] mpt3sas_cm0: REPORT_LUNS: handle(0x00b1), retries(0) [Mon Dec 9 06:17:32 2019][ 27.212023] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b1), lun(0) [Mon Dec 9 06:17:32 2019][ 27.218682] scsi 1:0:204:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 27.227066] scsi 1:0:204:0: SSP: handle(0x00b1), sas_addr(0x5000cca252559471), phy(19), device_name(0x5000cca252559473) [Mon Dec 9 06:17:32 2019][ 27.237843] scsi 1:0:204:0: enclosure logical id(0x5000ccab0405db00), slot(28) [Mon Dec 9 06:17:32 2019][ 27.245146] scsi 1:0:204:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 27.252038] scsi 1:0:204:0: serial_number( 7SHJ21AG) [Mon Dec 9 06:17:32 2019][ 27.257614] scsi 1:0:204:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 27.312439] mpt3sas_cm0: detecting: handle(0x00b2), sas_address(0x5000cca25253f94d), phy(20) [Mon Dec 9 06:17:32 2019][ 27.320874] mpt3sas_cm0: REPORT_LUNS: handle(0x00b2), retries(0) [Mon Dec 9 06:17:32 2019][ 27.327010] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b2), lun(0) [Mon Dec 9 06:17:32 2019][ 27.345304] scsi 1:0:205:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 27.353687] scsi 1:0:205:0: SSP: handle(0x00b2), sas_addr(0x5000cca25253f94d), phy(20), device_name(0x5000cca25253f94f) [Mon Dec 9 06:17:32 2019][ 27.364456] scsi 1:0:205:0: enclosure logical id(0x5000ccab0405db00), slot(29) [Mon Dec 9 06:17:32 2019][ 27.371762] scsi 1:0:205:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 27.378640] scsi 1:0:205:0: serial_number( 7SHH5NJG) [Mon Dec 9 06:17:32 2019][ 27.384210] scsi 1:0:205:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) 
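Each drive in the stream above is reported with the same per-device pattern: an mpt3sas_cm0 detecting / REPORT_LUNS / TEST_UNIT_READY sequence for the firmware handle, then the SCSI inquiry result and the SSP, enclosure logical id/slot, connector name, serial_number and qdepth attributes for the newly assigned scsi H:C:T:L. The same serial numbers and slot numbers show up once behind connector C1 and again behind C0, i.e. each physical drive in enclosure 0x5000ccab0405db00 is being discovered through both expander paths. The sketch below is a minimal, illustrative parser that folds such a log into one row per scsi ID and then groups rows by serial to expose the dual-path pairs; it assumes the console log has been saved to a plain text file, and the regexes and the collect() helper are assumptions of this sketch, not part of mpt3sas or any vendor tool.

    #!/usr/bin/env python3
    """Sketch: fold mpt3sas discovery records into one row per drive.

    Assumptions (not from the original log): the console output is in a
    plain text file and only the four per-device attribute lines shown in
    this boot (SSP, enclosure logical id/slot, connector name,
    serial_number) need to be recognized.
    """
    import re
    import sys
    from collections import defaultdict

    # e.g. "scsi 1:0:234:0: SSP: handle(0x00cf), sas_addr(0x5000cca252555371), ..."
    RE_SSP    = re.compile(r'scsi (\d+:\d+:\d+:\d+): SSP: handle\((0x[0-9a-f]+)\), sas_addr\((0x[0-9a-f]+)\)')
    # e.g. "scsi 1:0:234:0: enclosure logical id(0x5000ccab0405db00), slot(58)"
    RE_SLOT   = re.compile(r'scsi (\d+:\d+:\d+:\d+): enclosure logical id\((0x[0-9a-f]+)\), slot\((\d+)\)')
    # e.g. "scsi 1:0:234:0: enclosure level(0x0000), connector name( C0 )"
    RE_CONN   = re.compile(r'scsi (\d+:\d+:\d+:\d+): enclosure level\(0x[0-9a-f]+\), connector name\(\s*(\S+)\s*\)')
    # e.g. "scsi 1:0:234:0: serial_number( 7SHHXR4G)"
    RE_SERIAL = re.compile(r'scsi (\d+:\d+:\d+:\d+): serial_number\(\s*(\S+)\s*\)')

    def collect(lines):
        """Merge the attribute lines for each scsi H:C:T:L into one dict."""
        devs = defaultdict(dict)                     # "1:0:234:0" -> merged fields
        for line in lines:
            for regex, key in ((RE_SSP, 'sas_addr'), (RE_SLOT, 'slot'),
                               (RE_CONN, 'connector'), (RE_SERIAL, 'serial')):
                # finditer copes with lines that carry several records.
                for m in regex.finditer(line):
                    devs[m.group(1)][key] = m.group(m.lastindex)
        return devs

    if __name__ == '__main__':
        devs = collect(open(sys.argv[1], errors='replace'))
        # Group by serial: a dual-ported drive appears once per connector.
        by_serial = defaultdict(list)
        for scsi_id, info in devs.items():
            if 'serial' in info:
                by_serial[info['serial']].append(
                    (scsi_id, info.get('connector'), info.get('slot')))
        for serial, paths in sorted(by_serial.items()):
            print(serial, paths)

Run against a boot log like this one, the sketch would be expected to list each drive serial twice, once per connector (C0 and C1) with the same slot number, which is the normal dual-ported view of the enclosure; a serial that appears only once would suggest a missing SAS path worth investigating.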
[Mon Dec 9 06:17:32 2019][ 27.426450] mpt3sas_cm0: detecting: handle(0x00b3), sas_address(0x5000cca25253e699), phy(21) [Mon Dec 9 06:17:32 2019][ 27.434887] mpt3sas_cm0: REPORT_LUNS: handle(0x00b3), retries(0) [Mon Dec 9 06:17:32 2019][ 27.441017] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b3), lun(0) [Mon Dec 9 06:17:32 2019][ 27.464744] scsi 1:0:206:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 27.473116] scsi 1:0:206:0: SSP: handle(0x00b3), sas_addr(0x5000cca25253e699), phy(21), device_name(0x5000cca25253e69b) [Mon Dec 9 06:17:32 2019][ 27.483887] scsi 1:0:206:0: enclosure logical id(0x5000ccab0405db00), slot(30) [Mon Dec 9 06:17:32 2019][ 27.491191] scsi 1:0:206:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 27.498068] scsi 1:0:206:0: serial_number( 7SHH4DXG) [Mon Dec 9 06:17:32 2019][ 27.503641] scsi 1:0:206:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 27.525026] mpt3sas_cm0: detecting: handle(0x00b4), sas_address(0x5000cca252543cc1), phy(22) [Mon Dec 9 06:17:32 2019][ 27.533466] mpt3sas_cm0: REPORT_LUNS: handle(0x00b4), retries(0) [Mon Dec 9 06:17:32 2019][ 27.539607] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b4), lun(0) [Mon Dec 9 06:17:32 2019][ 27.546220] scsi 1:0:207:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 27.554616] scsi 1:0:207:0: SSP: handle(0x00b4), sas_addr(0x5000cca252543cc1), phy(22), device_name(0x5000cca252543cc3) [Mon Dec 9 06:17:32 2019][ 27.565390] scsi 1:0:207:0: enclosure logical id(0x5000ccab0405db00), slot(31) [Mon Dec 9 06:17:32 2019][ 27.572695] scsi 1:0:207:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 27.579590] scsi 1:0:207:0: serial_number( 7SHHA4TG) [Mon Dec 9 06:17:32 2019][ 27.585163] scsi 1:0:207:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 27.605437] mpt3sas_cm0: detecting: handle(0x00b5), sas_address(0x5000cca26a24fcdd), phy(23) [Mon Dec 9 06:17:32 2019][ 27.613875] mpt3sas_cm0: REPORT_LUNS: handle(0x00b5), retries(0) [Mon Dec 9 06:17:32 2019][ 27.620034] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b5), lun(0) [Mon Dec 9 06:17:32 2019][ 27.626760] scsi 1:0:208:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 27.635194] scsi 1:0:208:0: SSP: handle(0x00b5), sas_addr(0x5000cca26a24fcdd), phy(23), device_name(0x5000cca26a24fcdf) [Mon Dec 9 06:17:32 2019][ 27.645965] scsi 1:0:208:0: enclosure logical id(0x5000ccab0405db00), slot(32) [Mon Dec 9 06:17:32 2019][ 27.653272] scsi 1:0:208:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 27.660161] scsi 1:0:208:0: serial_number( 2TGNALMD) [Mon Dec 9 06:17:32 2019][ 27.665737] scsi 1:0:208:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 27.688439] mpt3sas_cm0: detecting: handle(0x00b6), sas_address(0x5000cca252543bcd), phy(24) [Mon Dec 9 06:17:32 2019][ 27.696880] mpt3sas_cm0: REPORT_LUNS: handle(0x00b6), retries(0) [Mon Dec 9 06:17:32 2019][ 27.703020] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b6), lun(0) [Mon Dec 9 06:17:32 2019][ 27.709721] scsi 1:0:209:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 27.718108] scsi 1:0:209:0: SSP: handle(0x00b6), sas_addr(0x5000cca252543bcd), phy(24), device_name(0x5000cca252543bcf) [Mon Dec 9 06:17:32 2019][ 27.728882] scsi 1:0:209:0: enclosure logical id(0x5000ccab0405db00), 
slot(33) [Mon Dec 9 06:17:32 2019][ 27.736186] scsi 1:0:209:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 27.743079] scsi 1:0:209:0: serial_number( 7SHHA2UG) [Mon Dec 9 06:17:32 2019][ 27.748652] scsi 1:0:209:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 27.777438] mpt3sas_cm0: detecting: handle(0x00b7), sas_address(0x5000cca252551265), phy(25) [Mon Dec 9 06:17:33 2019][ 27.785879] mpt3sas_cm0: REPORT_LUNS: handle(0x00b7), retries(0) [Mon Dec 9 06:17:33 2019][ 27.792011] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b7), lun(0) [Mon Dec 9 06:17:33 2019][ 27.798652] scsi 1:0:210:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 27.807035] scsi 1:0:210:0: SSP: handle(0x00b7), sas_addr(0x5000cca252551265), phy(25), device_name(0x5000cca252551267) [Mon Dec 9 06:17:33 2019][ 27.817812] scsi 1:0:210:0: enclosure logical id(0x5000ccab0405db00), slot(34) [Mon Dec 9 06:17:33 2019][ 27.825119] scsi 1:0:210:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 27.832007] scsi 1:0:210:0: serial_number( 7SHHTBVG) [Mon Dec 9 06:17:33 2019][ 27.837583] scsi 1:0:210:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:33 2019][ 27.857442] mpt3sas_cm0: detecting: handle(0x00b8), sas_address(0x5000cca252555fc9), phy(26) [Mon Dec 9 06:17:33 2019][ 27.865882] mpt3sas_cm0: REPORT_LUNS: handle(0x00b8), retries(0) [Mon Dec 9 06:17:33 2019][ 27.872017] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b8), lun(0) [Mon Dec 9 06:17:33 2019][ 27.878631] scsi 1:0:211:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 27.887014] scsi 1:0:211:0: SSP: handle(0x00b8), sas_addr(0x5000cca252555fc9), phy(26), device_name(0x5000cca252555fcb) [Mon Dec 9 06:17:33 2019][ 27.897789] scsi 1:0:211:0: enclosure logical id(0x5000ccab0405db00), slot(35) [Mon Dec 9 06:17:33 2019][ 27.905096] scsi 1:0:211:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 27.911988] scsi 1:0:211:0: serial_number( 7SHHYJMG) [Mon Dec 9 06:17:33 2019][ 27.917561] scsi 1:0:211:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:33 2019][ 27.937443] mpt3sas_cm0: detecting: handle(0x00b9), sas_address(0x5000cca252559f7d), phy(27) [Mon Dec 9 06:17:33 2019][ 27.945878] mpt3sas_cm0: REPORT_LUNS: handle(0x00b9), retries(0) [Mon Dec 9 06:17:33 2019][ 27.952037] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b9), lun(0) [Mon Dec 9 06:17:33 2019][ 27.966904] scsi 1:0:212:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 27.975275] scsi 1:0:212:0: SSP: handle(0x00b9), sas_addr(0x5000cca252559f7d), phy(27), device_name(0x5000cca252559f7f) [Mon Dec 9 06:17:33 2019][ 27.986044] scsi 1:0:212:0: enclosure logical id(0x5000ccab0405db00), slot(36) [Mon Dec 9 06:17:33 2019][ 27.993351] scsi 1:0:212:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 28.000229] scsi 1:0:212:0: serial_number( 7SHJ2T4G) [Mon Dec 9 06:17:33 2019][ 28.005810] scsi 1:0:212:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:33 2019][ 28.028446] mpt3sas_cm0: detecting: handle(0x00ba), sas_address(0x5000cca26c244bcd), phy(28) [Mon Dec 9 06:17:33 2019][ 28.036881] mpt3sas_cm0: REPORT_LUNS: handle(0x00ba), retries(0) [Mon Dec 9 06:17:33 2019][ 28.043022] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ba), lun(0) [Mon Dec 9 06:17:33 2019][ 28.049655] 
scsi 1:0:213:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 28.058039] scsi 1:0:213:0: SSP: handle(0x00ba), sas_addr(0x5000cca26c244bcd), phy(28), device_name(0x5000cca26c244bcf) [Mon Dec 9 06:17:33 2019][ 28.068812] scsi 1:0:213:0: enclosure logical id(0x5000ccab0405db00), slot(37) [Mon Dec 9 06:17:33 2019][ 28.076120] scsi 1:0:213:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 28.083010] scsi 1:0:213:0: serial_number( 1DGMYU2Z) [Mon Dec 9 06:17:33 2019][ 28.088586] scsi 1:0:213:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:33 2019][ 28.121449] mpt3sas_cm0: detecting: handle(0x00bb), sas_address(0x5000cca26a2aa10d), phy(29) [Mon Dec 9 06:17:33 2019][ 28.129884] mpt3sas_cm0: REPORT_LUNS: handle(0x00bb), retries(0) [Mon Dec 9 06:17:33 2019][ 28.136025] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00bb), lun(0) [Mon Dec 9 06:17:33 2019][ 28.233702] scsi 1:0:214:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 28.242088] scsi 1:0:214:0: SSP: handle(0x00bb), sas_addr(0x5000cca26a2aa10d), phy(29), device_name(0x5000cca26a2aa10f) [Mon Dec 9 06:17:33 2019][ 28.252863] scsi 1:0:214:0: enclosure logical id(0x5000ccab0405db00), slot(38) [Mon Dec 9 06:17:33 2019][ 28.260170] scsi 1:0:214:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 28.267060] scsi 1:0:214:0: serial_number( 2TGSET5D) [Mon Dec 9 06:17:33 2019][ 28.272635] scsi 1:0:214:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:33 2019][ 28.298454] mpt3sas_cm0: detecting: handle(0x00bc), sas_address(0x5000cca25254e235), phy(30) [Mon Dec 9 06:17:33 2019][ 28.306889] mpt3sas_cm0: REPORT_LUNS: handle(0x00bc), retries(0) [Mon Dec 9 06:17:33 2019][ 28.313051] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00bc), lun(0) [Mon Dec 9 06:17:33 2019][ 28.319670] scsi 1:0:215:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 28.328055] scsi 1:0:215:0: SSP: handle(0x00bc), sas_addr(0x5000cca25254e235), phy(30), device_name(0x5000cca25254e237) [Mon Dec 9 06:17:33 2019][ 28.338830] scsi 1:0:215:0: enclosure logical id(0x5000ccab0405db00), slot(39) [Mon Dec 9 06:17:33 2019][ 28.346138] scsi 1:0:215:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 28.353028] scsi 1:0:215:0: serial_number( 7SHHP5BG) [Mon Dec 9 06:17:33 2019][ 28.358602] scsi 1:0:215:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:33 2019][ 28.379459] mpt3sas_cm0: detecting: handle(0x00bd), sas_address(0x5000cca25254df95), phy(31) [Mon Dec 9 06:17:33 2019][ 28.387899] mpt3sas_cm0: REPORT_LUNS: handle(0x00bd), retries(0) [Mon Dec 9 06:17:33 2019][ 28.394037] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00bd), lun(0) [Mon Dec 9 06:17:33 2019][ 28.439541] scsi 1:0:216:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 28.447922] scsi 1:0:216:0: SSP: handle(0x00bd), sas_addr(0x5000cca25254df95), phy(31), device_name(0x5000cca25254df97) [Mon Dec 9 06:17:33 2019][ 28.458694] scsi 1:0:216:0: enclosure logical id(0x5000ccab0405db00), slot(40) [Mon Dec 9 06:17:33 2019][ 28.466001] scsi 1:0:216:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 28.472890] scsi 1:0:216:0: serial_number( 7SHHNZYG) [Mon Dec 9 06:17:33 2019][ 28.478466] scsi 1:0:216:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:33 2019][ 
28.568112] mpt3sas_cm0: detecting: handle(0x00be), sas_address(0x5000cca25254e9d1), phy(32) [Mon Dec 9 06:17:33 2019][ 28.576557] mpt3sas_cm0: REPORT_LUNS: handle(0x00be), retries(0) [Mon Dec 9 06:17:33 2019][ 28.582697] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00be), lun(0) [Mon Dec 9 06:17:33 2019][ 28.594947] scsi 1:0:217:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 28.603638] scsi 1:0:217:0: SSP: handle(0x00be), sas_addr(0x5000cca25254e9d1), phy(32), device_name(0x5000cca25254e9d3) [Mon Dec 9 06:17:33 2019][ 28.614413] scsi 1:0:217:0: enclosure logical id(0x5000ccab0405db00), slot(41) [Mon Dec 9 06:17:33 2019][ 28.621717] scsi 1:0:217:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 28.628596] scsi 1:0:217:0: serial_number( 7SHHPP2G) [Mon Dec 9 06:17:33 2019][ 28.634174] scsi 1:0:217:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:33 2019][ 28.663469] mpt3sas_cm0: detecting: handle(0x00bf), sas_address(0x5000cca26a240089), phy(33) [Mon Dec 9 06:17:33 2019][ 28.671904] mpt3sas_cm0: REPORT_LUNS: handle(0x00bf), retries(0) [Mon Dec 9 06:17:33 2019][ 28.678034] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00bf), lun(0) [Mon Dec 9 06:17:33 2019][ 28.722599] scsi 1:0:218:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 28.730985] scsi 1:0:218:0: SSP: handle(0x00bf), sas_addr(0x5000cca26a240089), phy(33), device_name(0x5000cca26a24008b) [Mon Dec 9 06:17:33 2019][ 28.741754] scsi 1:0:218:0: enclosure logical id(0x5000ccab0405db00), slot(42) [Mon Dec 9 06:17:33 2019][ 28.749059] scsi 1:0:218:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 28.755937] scsi 1:0:218:0: serial_number( 2TGMTTPD) [Mon Dec 9 06:17:34 2019][ 28.761509] scsi 1:0:218:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 28.787473] mpt3sas_cm0: detecting: handle(0x00c0), sas_address(0x5000cca26a24b9e9), phy(34) [Mon Dec 9 06:17:34 2019][ 28.795909] mpt3sas_cm0: REPORT_LUNS: handle(0x00c0), retries(0) [Mon Dec 9 06:17:34 2019][ 28.802042] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c0), lun(0) [Mon Dec 9 06:17:34 2019][ 28.813348] scsi 1:0:219:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 28.821732] scsi 1:0:219:0: SSP: handle(0x00c0), sas_addr(0x5000cca26a24b9e9), phy(34), device_name(0x5000cca26a24b9eb) [Mon Dec 9 06:17:34 2019][ 28.832506] scsi 1:0:219:0: enclosure logical id(0x5000ccab0405db00), slot(43) [Mon Dec 9 06:17:34 2019][ 28.839810] scsi 1:0:219:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 28.846701] scsi 1:0:219:0: serial_number( 2TGN64DD) [Mon Dec 9 06:17:34 2019][ 28.852277] scsi 1:0:219:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 28.872472] mpt3sas_cm0: detecting: handle(0x00c1), sas_address(0x5000cca26a25aed5), phy(35) [Mon Dec 9 06:17:34 2019][ 28.880904] mpt3sas_cm0: REPORT_LUNS: handle(0x00c1), retries(0) [Mon Dec 9 06:17:34 2019][ 28.887037] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c1), lun(0) [Mon Dec 9 06:17:34 2019][ 28.912813] scsi 1:0:220:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 28.921185] scsi 1:0:220:0: SSP: handle(0x00c1), sas_addr(0x5000cca26a25aed5), phy(35), device_name(0x5000cca26a25aed7) [Mon Dec 9 06:17:34 2019][ 28.931956] scsi 1:0:220:0: enclosure logical id(0x5000ccab0405db00), slot(44) [Mon Dec 9 06:17:34 
2019][ 28.939263] scsi 1:0:220:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 28.946142] scsi 1:0:220:0: serial_number( 2TGNRG1D) [Mon Dec 9 06:17:34 2019][ 28.951720] scsi 1:0:220:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 28.972469] mpt3sas_cm0: detecting: handle(0x00c2), sas_address(0x5000cca266d32b69), phy(36) [Mon Dec 9 06:17:34 2019][ 28.980905] mpt3sas_cm0: REPORT_LUNS: handle(0x00c2), retries(0) [Mon Dec 9 06:17:34 2019][ 28.987076] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c2), lun(0) [Mon Dec 9 06:17:34 2019][ 28.993719] scsi 1:0:221:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.002108] scsi 1:0:221:0: SSP: handle(0x00c2), sas_addr(0x5000cca266d32b69), phy(36), device_name(0x5000cca266d32b6b) [Mon Dec 9 06:17:34 2019][ 29.012880] scsi 1:0:221:0: enclosure logical id(0x5000ccab0405db00), slot(45) [Mon Dec 9 06:17:34 2019][ 29.020187] scsi 1:0:221:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.027080] scsi 1:0:221:0: serial_number( 7JKS46JK) [Mon Dec 9 06:17:34 2019][ 29.032654] scsi 1:0:221:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 29.052474] mpt3sas_cm0: detecting: handle(0x00c3), sas_address(0x5000cca26b9bf885), phy(37) [Mon Dec 9 06:17:34 2019][ 29.060915] mpt3sas_cm0: REPORT_LUNS: handle(0x00c3), retries(0) [Mon Dec 9 06:17:34 2019][ 29.067055] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c3), lun(0) [Mon Dec 9 06:17:34 2019][ 29.073696] scsi 1:0:222:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.082084] scsi 1:0:222:0: SSP: handle(0x00c3), sas_addr(0x5000cca26b9bf885), phy(37), device_name(0x5000cca26b9bf887) [Mon Dec 9 06:17:34 2019][ 29.092858] scsi 1:0:222:0: enclosure logical id(0x5000ccab0405db00), slot(46) [Mon Dec 9 06:17:34 2019][ 29.100165] scsi 1:0:222:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.107056] scsi 1:0:222:0: serial_number( 1SJST42Z) [Mon Dec 9 06:17:34 2019][ 29.112630] scsi 1:0:222:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 29.188509] mpt3sas_cm0: detecting: handle(0x00c4), sas_address(0x5000cca26b9b24c9), phy(38) [Mon Dec 9 06:17:34 2019][ 29.196945] mpt3sas_cm0: REPORT_LUNS: handle(0x00c4), retries(0) [Mon Dec 9 06:17:34 2019][ 29.203080] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c4), lun(0) [Mon Dec 9 06:17:34 2019][ 29.212591] scsi 1:0:223:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.220974] scsi 1:0:223:0: SSP: handle(0x00c4), sas_addr(0x5000cca26b9b24c9), phy(38), device_name(0x5000cca26b9b24cb) [Mon Dec 9 06:17:34 2019][ 29.231746] scsi 1:0:223:0: enclosure logical id(0x5000ccab0405db00), slot(47) [Mon Dec 9 06:17:34 2019][ 29.239050] scsi 1:0:223:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.245928] scsi 1:0:223:0: serial_number( 1SJSA0YZ) [Mon Dec 9 06:17:34 2019][ 29.251499] scsi 1:0:223:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 29.271480] mpt3sas_cm0: detecting: handle(0x00c5), sas_address(0x5000cca26a21d741), phy(39) [Mon Dec 9 06:17:34 2019][ 29.279923] mpt3sas_cm0: REPORT_LUNS: handle(0x00c5), retries(0) [Mon Dec 9 06:17:34 2019][ 29.286086] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c5), lun(0) [Mon Dec 9 06:17:34 2019][ 29.292881] scsi 1:0:224:0: Direct-Access 
HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.301275] scsi 1:0:224:0: SSP: handle(0x00c5), sas_addr(0x5000cca26a21d741), phy(39), device_name(0x5000cca26a21d743) [Mon Dec 9 06:17:34 2019][ 29.312043] scsi 1:0:224:0: enclosure logical id(0x5000ccab0405db00), slot(48) [Mon Dec 9 06:17:34 2019][ 29.319350] scsi 1:0:224:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.326240] scsi 1:0:224:0: serial_number( 2TGLLYED) [Mon Dec 9 06:17:34 2019][ 29.331818] scsi 1:0:224:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 29.352480] mpt3sas_cm0: detecting: handle(0x00c6), sas_address(0x5000cca26a27af5d), phy(40) [Mon Dec 9 06:17:34 2019][ 29.360920] mpt3sas_cm0: REPORT_LUNS: handle(0x00c6), retries(0) [Mon Dec 9 06:17:34 2019][ 29.367084] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c6), lun(0) [Mon Dec 9 06:17:34 2019][ 29.373724] scsi 1:0:225:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.382112] scsi 1:0:225:0: SSP: handle(0x00c6), sas_addr(0x5000cca26a27af5d), phy(40), device_name(0x5000cca26a27af5f) [Mon Dec 9 06:17:34 2019][ 29.392889] scsi 1:0:225:0: enclosure logical id(0x5000ccab0405db00), slot(49) [Mon Dec 9 06:17:34 2019][ 29.400196] scsi 1:0:225:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.407087] scsi 1:0:225:0: serial_number( 2TGPUL5D) [Mon Dec 9 06:17:34 2019][ 29.412660] scsi 1:0:225:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 29.432479] mpt3sas_cm0: detecting: handle(0x00c7), sas_address(0x5000cca2525552e5), phy(41) [Mon Dec 9 06:17:34 2019][ 29.440917] mpt3sas_cm0: REPORT_LUNS: handle(0x00c7), retries(0) [Mon Dec 9 06:17:34 2019][ 29.447061] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c7), lun(0) [Mon Dec 9 06:17:34 2019][ 29.453709] scsi 1:0:226:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.462105] scsi 1:0:226:0: SSP: handle(0x00c7), sas_addr(0x5000cca2525552e5), phy(41), device_name(0x5000cca2525552e7) [Mon Dec 9 06:17:34 2019][ 29.472876] scsi 1:0:226:0: enclosure logical id(0x5000ccab0405db00), slot(50) [Mon Dec 9 06:17:34 2019][ 29.480180] scsi 1:0:226:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.487073] scsi 1:0:226:0: serial_number( 7SHHXP0G) [Mon Dec 9 06:17:34 2019][ 29.492648] scsi 1:0:226:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 29.512486] mpt3sas_cm0: detecting: handle(0x00c8), sas_address(0x5000cca26a26dff1), phy(42) [Mon Dec 9 06:17:34 2019][ 29.520923] mpt3sas_cm0: REPORT_LUNS: handle(0x00c8), retries(0) [Mon Dec 9 06:17:34 2019][ 29.527070] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c8), lun(0) [Mon Dec 9 06:17:34 2019][ 29.533745] scsi 1:0:227:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.542134] scsi 1:0:227:0: SSP: handle(0x00c8), sas_addr(0x5000cca26a26dff1), phy(42), device_name(0x5000cca26a26dff3) [Mon Dec 9 06:17:34 2019][ 29.552904] scsi 1:0:227:0: enclosure logical id(0x5000ccab0405db00), slot(51) [Mon Dec 9 06:17:34 2019][ 29.560211] scsi 1:0:227:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.567108] scsi 1:0:227:0: serial_number( 2TGPBSYD) [Mon Dec 9 06:17:34 2019][ 29.572687] scsi 1:0:227:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 29.595491] mpt3sas_cm0: 
detecting: handle(0x00c9), sas_address(0x5000cca26b9c5d51), phy(43) [Mon Dec 9 06:17:34 2019][ 29.603934] mpt3sas_cm0: REPORT_LUNS: handle(0x00c9), retries(0) [Mon Dec 9 06:17:34 2019][ 29.610741] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c9), lun(0) [Mon Dec 9 06:17:34 2019][ 29.636259] scsi 1:0:228:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.644642] scsi 1:0:228:0: SSP: handle(0x00c9), sas_addr(0x5000cca26b9c5d51), phy(43), device_name(0x5000cca26b9c5d53) [Mon Dec 9 06:17:34 2019][ 29.655417] scsi 1:0:228:0: enclosure logical id(0x5000ccab0405db00), slot(52) [Mon Dec 9 06:17:34 2019][ 29.662724] scsi 1:0:228:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.669599] scsi 1:0:228:0: serial_number( 1SJSZV5Z) [Mon Dec 9 06:17:34 2019][ 29.675172] scsi 1:0:228:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 29.700086] mpt3sas_cm0: detecting: handle(0x00ca), sas_address(0x5000cca26b9602c5), phy(44) [Mon Dec 9 06:17:34 2019][ 29.708538] mpt3sas_cm0: REPORT_LUNS: handle(0x00ca), retries(0) [Mon Dec 9 06:17:34 2019][ 29.714655] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ca), lun(0) [Mon Dec 9 06:17:34 2019][ 29.726616] scsi 1:0:229:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.736415] scsi 1:0:229:0: SSP: handle(0x00ca), sas_addr(0x5000cca26b9602c5), phy(44), device_name(0x5000cca26b9602c7) [Mon Dec 9 06:17:34 2019][ 29.747190] scsi 1:0:229:0: enclosure logical id(0x5000ccab0405db00), slot(53) [Mon Dec 9 06:17:34 2019][ 29.754497] scsi 1:0:229:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.761390] scsi 1:0:229:0: serial_number( 1SJNHJ4Z) [Mon Dec 9 06:17:35 2019][ 29.766961] scsi 1:0:229:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 29.810615] mpt3sas_cm0: detecting: handle(0x00cb), sas_address(0x5000cca252544a01), phy(45) [Mon Dec 9 06:17:35 2019][ 29.819053] mpt3sas_cm0: REPORT_LUNS: handle(0x00cb), retries(0) [Mon Dec 9 06:17:35 2019][ 29.825193] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00cb), lun(0) [Mon Dec 9 06:17:35 2019][ 29.831829] scsi 1:0:230:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 29.840220] scsi 1:0:230:0: SSP: handle(0x00cb), sas_addr(0x5000cca252544a01), phy(45), device_name(0x5000cca252544a03) [Mon Dec 9 06:17:35 2019][ 29.850994] scsi 1:0:230:0: enclosure logical id(0x5000ccab0405db00), slot(54) [Mon Dec 9 06:17:35 2019][ 29.858301] scsi 1:0:230:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 29.865191] scsi 1:0:230:0: serial_number( 7SHHB14G) [Mon Dec 9 06:17:35 2019][ 29.870764] scsi 1:0:230:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 29.893527] mpt3sas_cm0: detecting: handle(0x00cc), sas_address(0x5000cca252559f9d), phy(46) [Mon Dec 9 06:17:35 2019][ 29.901968] mpt3sas_cm0: REPORT_LUNS: handle(0x00cc), retries(0) [Mon Dec 9 06:17:35 2019][ 29.908099] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00cc), lun(0) [Mon Dec 9 06:17:35 2019][ 29.980081] scsi 1:0:231:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 29.988467] scsi 1:0:231:0: SSP: handle(0x00cc), sas_addr(0x5000cca252559f9d), phy(46), device_name(0x5000cca252559f9f) [Mon Dec 9 06:17:35 2019][ 29.999241] scsi 1:0:231:0: enclosure logical id(0x5000ccab0405db00), slot(55) [Mon Dec 9 06:17:35 2019][ 30.006546] scsi 
1:0:231:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.013424] scsi 1:0:231:0: serial_number( 7SHJ2TDG) [Mon Dec 9 06:17:35 2019][ 30.018995] scsi 1:0:231:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 30.039495] mpt3sas_cm0: detecting: handle(0x00cd), sas_address(0x5000cca25255571d), phy(47) [Mon Dec 9 06:17:35 2019][ 30.047935] mpt3sas_cm0: REPORT_LUNS: handle(0x00cd), retries(0) [Mon Dec 9 06:17:35 2019][ 30.054075] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00cd), lun(0) [Mon Dec 9 06:17:35 2019][ 30.060757] scsi 1:0:232:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 30.069152] scsi 1:0:232:0: SSP: handle(0x00cd), sas_addr(0x5000cca25255571d), phy(47), device_name(0x5000cca25255571f) [Mon Dec 9 06:17:35 2019][ 30.079921] scsi 1:0:232:0: enclosure logical id(0x5000ccab0405db00), slot(56) [Mon Dec 9 06:17:35 2019][ 30.087228] scsi 1:0:232:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.094118] scsi 1:0:232:0: serial_number( 7SHHXYRG) [Mon Dec 9 06:17:35 2019][ 30.099694] scsi 1:0:232:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 30.122499] mpt3sas_cm0: detecting: handle(0x00ce), sas_address(0x5000cca26b9bf57d), phy(48) [Mon Dec 9 06:17:35 2019][ 30.130939] mpt3sas_cm0: REPORT_LUNS: handle(0x00ce), retries(0) [Mon Dec 9 06:17:35 2019][ 30.137072] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ce), lun(0) [Mon Dec 9 06:17:35 2019][ 30.154253] scsi 1:0:233:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 30.162643] scsi 1:0:233:0: SSP: handle(0x00ce), sas_addr(0x5000cca26b9bf57d), phy(48), device_name(0x5000cca26b9bf57f) [Mon Dec 9 06:17:35 2019][ 30.173419] scsi 1:0:233:0: enclosure logical id(0x5000ccab0405db00), slot(57) [Mon Dec 9 06:17:35 2019][ 30.180726] scsi 1:0:233:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.187604] scsi 1:0:233:0: serial_number( 1SJSSXUZ) [Mon Dec 9 06:17:35 2019][ 30.193184] scsi 1:0:233:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 30.213502] mpt3sas_cm0: detecting: handle(0x00cf), sas_address(0x5000cca252555371), phy(49) [Mon Dec 9 06:17:35 2019][ 30.221944] mpt3sas_cm0: REPORT_LUNS: handle(0x00cf), retries(0) [Mon Dec 9 06:17:35 2019][ 30.228150] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00cf), lun(0) [Mon Dec 9 06:17:35 2019][ 30.235011] scsi 1:0:234:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 30.243408] scsi 1:0:234:0: SSP: handle(0x00cf), sas_addr(0x5000cca252555371), phy(49), device_name(0x5000cca252555373) [Mon Dec 9 06:17:35 2019][ 30.254177] scsi 1:0:234:0: enclosure logical id(0x5000ccab0405db00), slot(58) [Mon Dec 9 06:17:35 2019][ 30.261484] scsi 1:0:234:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.268376] scsi 1:0:234:0: serial_number( 7SHHXR4G) [Mon Dec 9 06:17:35 2019][ 30.273949] scsi 1:0:234:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 30.296498] mpt3sas_cm0: detecting: handle(0x00d0), sas_address(0x5000cca25253eefd), phy(50) [Mon Dec 9 06:17:35 2019][ 30.304936] mpt3sas_cm0: REPORT_LUNS: handle(0x00d0), retries(0) [Mon Dec 9 06:17:35 2019][ 30.311102] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d0), lun(0) [Mon Dec 9 06:17:35 2019][ 30.317742] scsi 1:0:235:0: Direct-Access HGST HUH721008AL5200 
A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 30.326128] scsi 1:0:235:0: SSP: handle(0x00d0), sas_addr(0x5000cca25253eefd), phy(50), device_name(0x5000cca25253eeff) [Mon Dec 9 06:17:35 2019][ 30.336902] scsi 1:0:235:0: enclosure logical id(0x5000ccab0405db00), slot(59) [Mon Dec 9 06:17:35 2019][ 30.344210] scsi 1:0:235:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.351101] scsi 1:0:235:0: serial_number( 7SHH4Z7G) [Mon Dec 9 06:17:35 2019][ 30.356676] scsi 1:0:235:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 30.378985] mpt3sas_cm0: expander_add: handle(0x009c), parent(0x0099), sas_addr(0x5000ccab0405db7f), phys(68) [Mon Dec 9 06:17:35 2019][ 30.400920] mpt3sas_cm0: detecting: handle(0x00d1), sas_address(0x5000cca26b9cbb05), phy(42) [Mon Dec 9 06:17:35 2019][ 30.409354] mpt3sas_cm0: REPORT_LUNS: handle(0x00d1), retries(0) [Mon Dec 9 06:17:35 2019][ 30.415472] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d1), lun(0) [Mon Dec 9 06:17:35 2019][ 30.422288] scsi 1:0:236:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 30.430687] scsi 1:0:236:0: SSP: handle(0x00d1), sas_addr(0x5000cca26b9cbb05), phy(42), device_name(0x5000cca26b9cbb07) [Mon Dec 9 06:17:35 2019][ 30.441460] scsi 1:0:236:0: enclosure logical id(0x5000ccab0405db00), slot(1) [Mon Dec 9 06:17:35 2019][ 30.448679] scsi 1:0:236:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.455570] scsi 1:0:236:0: serial_number( 1SJT62MZ) [Mon Dec 9 06:17:35 2019][ 30.461146] scsi 1:0:236:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 30.483507] mpt3sas_cm0: detecting: handle(0x00d2), sas_address(0x5000cca252544475), phy(43) [Mon Dec 9 06:17:35 2019][ 30.491940] mpt3sas_cm0: REPORT_LUNS: handle(0x00d2), retries(0) [Mon Dec 9 06:17:35 2019][ 30.498097] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d2), lun(0) [Mon Dec 9 06:17:35 2019][ 30.504891] scsi 1:0:237:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 30.518314] scsi 1:0:237:0: SSP: handle(0x00d2), sas_addr(0x5000cca252544475), phy(43), device_name(0x5000cca252544477) [Mon Dec 9 06:17:35 2019][ 30.529082] scsi 1:0:237:0: enclosure logical id(0x5000ccab0405db00), slot(3) [Mon Dec 9 06:17:35 2019][ 30.536302] scsi 1:0:237:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.543195] scsi 1:0:237:0: serial_number( 7SHHANPG) [Mon Dec 9 06:17:35 2019][ 30.548769] scsi 1:0:237:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 30.572101] mpt3sas_cm0: detecting: handle(0x00d3), sas_address(0x5000cca26a26173d), phy(44) [Mon Dec 9 06:17:35 2019][ 30.580549] mpt3sas_cm0: REPORT_LUNS: handle(0x00d3), retries(0) [Mon Dec 9 06:17:35 2019][ 30.586692] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d3), lun(0) [Mon Dec 9 06:17:35 2019][ 30.614858] scsi 1:0:238:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 30.623251] scsi 1:0:238:0: SSP: handle(0x00d3), sas_addr(0x5000cca26a26173d), phy(44), device_name(0x5000cca26a26173f) [Mon Dec 9 06:17:35 2019][ 30.634022] scsi 1:0:238:0: enclosure logical id(0x5000ccab0405db00), slot(4) [Mon Dec 9 06:17:35 2019][ 30.641240] scsi 1:0:238:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.648117] scsi 1:0:238:0: serial_number( 2TGNYDLD) [Mon Dec 9 06:17:35 2019][ 30.653688] scsi 1:0:238:0: qdepth(254), 
tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 30.691516] mpt3sas_cm0: detecting: handle(0x00d4), sas_address(0x5000cca252544cb5), phy(45) [Mon Dec 9 06:17:35 2019][ 30.699956] mpt3sas_cm0: REPORT_LUNS: handle(0x00d4), retries(0) [Mon Dec 9 06:17:35 2019][ 30.706109] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d4), lun(0) [Mon Dec 9 06:17:35 2019][ 30.752782] scsi 1:0:239:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 30.761165] scsi 1:0:239:0: SSP: handle(0x00d4), sas_addr(0x5000cca252544cb5), phy(45), device_name(0x5000cca252544cb7) [Mon Dec 9 06:17:35 2019][ 30.771937] scsi 1:0:239:0: enclosure logical id(0x5000ccab0405db00), slot(5) [Mon Dec 9 06:17:35 2019][ 30.779158] scsi 1:0:239:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.786033] scsi 1:0:239:0: serial_number( 7SHHB6RG) [Mon Dec 9 06:17:35 2019][ 30.791603] scsi 1:0:239:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:36 2019][ 30.811514] mpt3sas_cm0: detecting: handle(0x00d5), sas_address(0x5000cca26c238691), phy(46) [Mon Dec 9 06:17:36 2019][ 30.819947] mpt3sas_cm0: REPORT_LUNS: handle(0x00d5), retries(0) [Mon Dec 9 06:17:36 2019][ 30.826094] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d5), lun(0) [Mon Dec 9 06:17:36 2019][ 30.832906] scsi 1:0:240:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:36 2019][ 30.841307] scsi 1:0:240:0: SSP: handle(0x00d5), sas_addr(0x5000cca26c238691), phy(46), device_name(0x5000cca26c238693) [Mon Dec 9 06:17:36 2019][ 30.852081] scsi 1:0:240:0: enclosure logical id(0x5000ccab0405db00), slot(6) [Mon Dec 9 06:17:36 2019][ 30.859298] scsi 1:0:240:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:36 2019][ 30.866191] scsi 1:0:240:0: serial_number( 1DGMJNWZ) [Mon Dec 9 06:17:36 2019][ 30.871765] scsi 1:0:240:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:36 2019][ 30.894550] mpt3sas_cm0: detecting: handle(0x00d6), sas_address(0x5000cca26a2ac969), phy(47) [Mon Dec 9 06:17:36 2019][ 30.902984] mpt3sas_cm0: REPORT_LUNS: handle(0x00d6), retries(0) [Mon Dec 9 06:17:36 2019][ 30.909118] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d6), lun(0) [Mon Dec 9 06:17:36 2019][ 30.926099] scsi 1:0:241:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:36 2019][ 30.934473] scsi 1:0:241:0: SSP: handle(0x00d6), sas_addr(0x5000cca26a2ac969), phy(47), device_name(0x5000cca26a2ac96b) [Mon Dec 9 06:17:36 2019][ 30.945250] scsi 1:0:241:0: enclosure logical id(0x5000ccab0405db00), slot(7) [Mon Dec 9 06:17:36 2019][ 30.952469] scsi 1:0:241:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:36 2019][ 30.959344] scsi 1:0:241:0: serial_number( 2TGSJGHD) [Mon Dec 9 06:17:36 2019][ 30.964916] scsi 1:0:241:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:36 2019][ 30.985518] mpt3sas_cm0: detecting: handle(0x00d7), sas_address(0x5000cca25253e619), phy(48) [Mon Dec 9 06:17:36 2019][ 30.993952] mpt3sas_cm0: REPORT_LUNS: handle(0x00d7), retries(0) [Mon Dec 9 06:17:36 2019][ 31.000112] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d7), lun(0) [Mon Dec 9 06:17:36 2019][ 31.006862] scsi 1:0:242:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:36 2019][ 31.015252] scsi 1:0:242:0: SSP: handle(0x00d7), sas_addr(0x5000cca25253e619), phy(48), device_name(0x5000cca25253e61b) [Mon Dec 9 06:17:36 2019][ 31.026023] scsi 
1:0:242:0: enclosure logical id(0x5000ccab0405db00), slot(8) [Mon Dec 9 06:17:36 2019][ 31.033243] scsi 1:0:242:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:36 2019][ 31.040136] scsi 1:0:242:0: serial_number( 7SHH4BWG) [Mon Dec 9 06:17:36 2019][ 31.045709] scsi 1:0:242:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:36 2019][ 31.066518] mpt3sas_cm0: detecting: handle(0x00d8), sas_address(0x5000cca252542cfd), phy(49) [Mon Dec 9 06:17:36 2019][ 31.074953] mpt3sas_cm0: REPORT_LUNS: handle(0x00d8), retries(0) [Mon Dec 9 06:17:36 2019][ 31.081085] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d8), lun(0) [Mon Dec 9 06:17:36 2019][ 31.087744] scsi 1:0:243:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:36 2019][ 31.096134] scsi 1:0:243:0: SSP: handle(0x00d8), sas_addr(0x5000cca252542cfd), phy(49), device_name(0x5000cca252542cff) [Mon Dec 9 06:17:36 2019][ 31.106903] scsi 1:0:243:0: enclosure logical id(0x5000ccab0405db00), slot(9) [Mon Dec 9 06:17:36 2019][ 31.114123] scsi 1:0:243:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:36 2019][ 31.121015] scsi 1:0:243:0: serial_number( 7SHH937G) [Mon Dec 9 06:17:36 2019][ 31.126588] scsi 1:0:243:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:36 2019][ 31.146520] mpt3sas_cm0: detecting: handle(0x00d9), sas_address(0x5000cca26a3181fd), phy(50) [Mon Dec 9 06:17:36 2019][ 31.154958] mpt3sas_cm0: REPORT_LUNS: handle(0x00d9), retries(0) [Mon Dec 9 06:17:36 2019][ 31.161089] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d9), lun(0) [Mon Dec 9 06:17:36 2019][ 31.167738] scsi 1:0:244:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:36 2019][ 31.176125] scsi 1:0:244:0: SSP: handle(0x00d9), sas_addr(0x5000cca26a3181fd), phy(50), device_name(0x5000cca26a3181ff) [Mon Dec 9 06:17:36 2019][ 31.186900] scsi 1:0:244:0: enclosure logical id(0x5000ccab0405db00), slot(10) [Mon Dec 9 06:17:36 2019][ 31.194204] scsi 1:0:244:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:36 2019][ 31.201096] scsi 1:0:244:0: serial_number( 2TGW71ND) [Mon Dec 9 06:17:36 2019][ 31.206672] scsi 1:0:244:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:36 2019][ 31.234333] mpt3sas_cm0: port enable: SUCCESS [Mon Dec 9 06:17:36 2019][ 31.239493] sd 1:0:2:0: [sdb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.247353] sd 1:0:2:0: [sdb] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.252608] sd 1:0:3:0: [sdc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.252628] sd 1:0:4:0: [sdd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.252629] sd 1:0:4:0: [sdd] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.252640] sd 1:0:5:0: [sde] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.252642] sd 1:0:5:0: [sde] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.252793] sd 1:0:6:0: [sdf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.252795] sd 1:0:6:0: [sdf] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.252973] sd 1:0:8:0: [sdh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.252974] sd 1:0:8:0: [sdh] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253099] sd 1:0:10:0: [sdj] 15628053168 512-byte logical blocks: (8.00 
TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253101] sd 1:0:10:0: [sdj] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253304] sd 1:0:13:0: [sdm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253311] sd 1:0:13:0: [sdm] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253320] sd 1:0:11:0: [sdk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253321] sd 1:0:11:0: [sdk] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253557] sd 1:0:6:0: [sdf] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.253573] sd 1:0:14:0: [sdn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253574] sd 1:0:14:0: [sdn] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253654] sd 1:0:19:0: [sds] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253659] sd 1:0:19:0: [sds] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253686] sd 1:0:8:0: [sdh] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.253722] sd 1:0:20:0: [sdt] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253723] sd 1:0:20:0: [sdt] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253775] sd 1:0:18:0: [sdr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253777] sd 1:0:18:0: [sdr] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253793] sd 1:0:21:0: [sdu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253794] sd 1:0:21:0: [sdu] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253830] sd 1:0:10:0: [sdj] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.253851] sd 1:0:22:0: [sdv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253856] sd 1:0:22:0: [sdv] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.254096] sd 1:0:13:0: [sdm] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.254136] sd 1:0:8:0: [sdh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.254142] sd 1:0:23:0: [sdw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.254143] sd 1:0:23:0: [sdw] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.254170] sd 1:0:6:0: [sdf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.254294] sd 1:0:10:0: [sdj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.254351] sd 1:0:26:0: [sdz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.254353] sd 1:0:26:0: [sdz] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.254449] sd 1:0:20:0: [sdt] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.254495] sd 1:0:18:0: [sdr] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.254504] sd 1:0:31:0: [sdae] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.254505] sd 1:0:31:0: [sdae] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.254518] sd 1:0:21:0: [sdu] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.254563] sd 1:0:32:0: [sdaf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.254564] sd 1:0:32:0: [sdaf] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.254585] sd 1:0:13:0: [sdm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.254605] sd 1:0:22:0: [sdv] Write Protect is 
off [Mon Dec 9 06:17:36 2019][ 31.254632] sd 1:0:33:0: [sdag] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.254638] sd 1:0:33:0: [sdag] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.254692] sd 1:0:19:0: [sds] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.254729] sd 1:0:34:0: [sdah] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.254731] sd 1:0:34:0: [sdah] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.254763] sd 1:0:5:0: [sde] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.254898] sd 1:0:25:0: [sdy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.254900] sd 1:0:25:0: [sdy] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.254929] sd 1:0:20:0: [sdt] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.254978] sd 1:0:18:0: [sdr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.254996] sd 1:0:21:0: [sdu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.255086] sd 1:0:37:0: [sdak] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.255087] sd 1:0:37:0: [sdak] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.255096] sd 1:0:22:0: [sdv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.255105] sd 1:0:38:0: [sdal] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.255107] sd 1:0:38:0: [sdal] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.255157] sd 1:0:19:0: [sds] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.255231] sd 1:0:31:0: [sdae] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.255305] sd 1:0:32:0: [sdaf] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.255389] sd 1:0:33:0: [sdag] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.255506] sd 1:0:34:0: [sdah] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.255695] sd 1:0:31:0: [sdae] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.255775] sd 1:0:32:0: [sdaf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.255806] sd 1:0:37:0: [sdak] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.255828] sd 1:0:38:0: [sdal] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.255864] sd 1:0:33:0: [sdag] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.255983] sd 1:0:34:0: [sdah] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.256060] sd 1:0:5:0: [sde] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.256285] sd 1:0:9:0: [sdi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.256287] sd 1:0:9:0: [sdi] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.256288] sd 1:0:37:0: [sdak] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.256309] sd 1:0:38:0: [sdal] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.256727] sd 1:0:17:0: [sdq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.256729] sd 1:0:17:0: [sdq] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.256756] sd 1:0:42:0: [sdap] 15628053168 512-byte logical 
blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.256758] sd 1:0:42:0: [sdap] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.257860] sd 1:0:9:0: [sdi] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.257998] sd 1:0:29:0: [sdac] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.258001] sd 1:0:29:0: [sdac] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.258316] sd 1:0:41:0: [sdao] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.258319] sd 1:0:41:0: [sdao] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.258558] sd 1:0:42:0: [sdap] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.258664] sd 1:0:17:0: [sdq] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.258734] sd 1:0:29:0: [sdac] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.259025] sd 1:0:42:0: [sdap] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.259162] sd 1:0:17:0: [sdq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.259390] sd 1:0:2:0: [sdb] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.259560] sd 1:0:29:0: [sdac] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.259860] sd 1:0:2:0: [sdb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.260699] sd 1:0:47:0: [sdau] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.260700] sd 1:0:47:0: [sdau] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.260943] sd 1:0:30:0: [sdad] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.260945] sd 1:0:30:0: [sdad] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.261656] sd 1:0:30:0: [sdad] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.263371] sd 1:0:7:0: [sdg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.263373] sd 1:0:7:0: [sdg] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.263415] sd 1:0:11:0: [sdk] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.263569] sd 1:0:30:0: [sdad] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.264118] sd 1:0:7:0: [sdg] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.264576] sd 1:0:7:0: [sdg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.264751] sd 1:0:11:0: [sdk] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.264857] sd 1:0:26:0: [sdz] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.265450] sd 1:0:23:0: [sdw] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.265467] sd 1:0:51:0: [sday] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.265468] sd 1:0:51:0: [sday] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.265662] sd 1:0:9:0: [sdi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.265830] sd 1:0:25:0: [sdy] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.266108] sd 1:0:14:0: [sdn] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.267582] sd 1:0:12:0: [sdl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.267583] sd 1:0:12:0: [sdl] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.267847] sd 1:0:53:0: [sdba] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.267849] sd 
1:0:53:0: [sdba] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.268354] sd 1:0:54:0: [sdbb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.268355] sd 1:0:54:0: [sdbb] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.268859] sd 1:0:12:0: [sdl] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.268868] sd 1:0:24:0: [sdx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.268872] sd 1:0:24:0: [sdx] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.269299] sd 1:0:43:0: [sdaq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.269300] sd 1:0:43:0: [sdaq] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.270089] sd 1:0:43:0: [sdaq] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.270510] sd 1:0:44:0: [sdar] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.270514] sd 1:0:44:0: [sdar] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.270550] sd 1:0:43:0: [sdaq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.270562] sd 1:0:53:0: [sdba] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.270919] sd 1:0:41:0: [sdao] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.271084] sd 1:0:35:0: [sdai] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.271086] sd 1:0:35:0: [sdai] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.271181] sd 1:0:36:0: [sdaj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.271183] sd 1:0:36:0: [sdaj] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.271302] sd 1:0:53:0: [sdba] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.271385] sd 1:0:54:0: [sdbb] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.271422] sd 1:0:41:0: [sdao] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.271639] sd 1:0:44:0: [sdar] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.271669] sd 1:0:31:0: [sdae] Attached SCSI disk [Mon Dec 9 06:17:37 2019][ 31.271852] sd 1:0:54:0: [sdbb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.271968] sd 1:0:57:0: [sdbe] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.271969] sd 1:0:57:0: [sdbe] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.272112] sd 1:0:44:0: [sdar] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.272211] sd 1:0:35:0: [sdai] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.272296] sd 1:0:36:0: [sdaj] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.272602] sd 1:0:55:0: [sdbc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.272604] sd 1:0:55:0: [sdbc] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.272774] sd 1:0:35:0: [sdai] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.272867] sd 1:0:36:0: [sdaj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.272873] sd 1:0:46:0: [sdat] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.272874] sd 1:0:46:0: [sdat] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.273073] sd 1:0:52:0: [sdaz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.273074] sd 1:0:52:0: 
[sdaz] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.273145] sd 1:0:45:0: [sdas] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.273146] sd 1:0:45:0: [sdas] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.273373] sd 1:0:6:0: [sdf] Attached SCSI disk [Mon Dec 9 06:17:37 2019][ 31.273596] sd 1:0:46:0: [sdat] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.274566] sd 1:0:46:0: [sdat] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.275826] sd 1:0:49:0: [sdaw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.275828] sd 1:0:49:0: [sdaw] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.275829] sd 1:0:18:0: [sdr] Attached SCSI disk [Mon Dec 9 06:17:37 2019][ 31.277272] sd 1:0:52:0: [sdaz] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.277305] sd 1:0:45:0: [sdas] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.277314] sd 1:0:47:0: [sdau] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.277549] sd 1:0:59:0: [sdbg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.277551] sd 1:0:59:0: [sdbg] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.277846] sd 1:0:60:0: [sdbh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.277848] sd 1:0:60:0: [sdbh] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.277892] sd 1:0:26:0: [sdz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.278353] sd 1:0:47:0: [sdau] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.281905] sd 1:0:63:0: [sdbj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.281906] sd 1:0:63:0: [sdbj] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.319115] sd 1:0:99:0: [sdct] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.319117] sd 1:0:99:0: [sdct] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.319925] sd 1:0:102:0: [sdcw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.319927] sd 1:0:102:0: [sdcw] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328266] sd 1:0:48:0: [sdav] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328268] sd 1:0:48:0: [sdav] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328283] sd 1:0:58:0: [sdbf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328285] sd 1:0:58:0: [sdbf] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328434] sd 1:0:59:0: [sdbg] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.328456] sd 1:0:104:0: [sdcy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328458] sd 1:0:104:0: [sdcy] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328537] sd 1:0:90:0: [sdck] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328539] sd 1:0:90:0: [sdck] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328578] sd 1:0:103:0: [sdcx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328581] sd 1:0:103:0: [sdcx] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328650] sd 1:0:56:0: [sdbd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328653] sd 1:0:56:0: [sdbd] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 
31.328709] sd 1:0:27:0: [sdaa] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328711] sd 1:0:27:0: [sdaa] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328716] sd 1:0:51:0: [sday] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.328755] sd 1:0:24:0: [sdx] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.328769] sd 1:0:25:0: [sdy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.328803] sd 1:0:28:0: [sdab] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328804] sd 1:0:28:0: [sdab] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328819] sd 1:0:4:0: [sdd] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.328820] sd 1:0:23:0: [sdw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.328830] sd 1:0:55:0: [sdbc] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.328832] sd 1:0:57:0: [sdbe] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.328850] sd 1:0:49:0: [sdaw] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.328915] sd 1:0:94:0: [sdco] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328917] sd 1:0:94:0: [sdco] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328936] sd 1:0:70:0: [sdbq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328938] sd 1:0:66:0: [sdbm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328940] sd 1:0:70:0: [sdbq] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328942] sd 1:0:66:0: [sdbm] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328951] sd 1:0:111:0: [sddf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328953] sd 1:0:111:0: [sddf] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328954] sd 1:0:95:0: [sdcp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328956] sd 1:0:95:0: [sdcp] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328963] sd 1:0:114:0: [sddi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328966] sd 1:0:114:0: [sddi] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328976] sd 1:0:98:0: [sdcs] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328977] sd 1:0:98:0: [sdcs] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328987] sd 1:0:71:0: [sdbr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328989] sd 1:0:71:0: [sdbr] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.329005] sd 1:0:50:0: [sdax] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.329008] sd 1:0:117:0: [sddl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.329011] sd 1:0:50:0: [sdax] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.329013] sd 1:0:117:0: [sddl] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.329024] sd 1:0:82:0: [sdcc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.329025] sd 1:0:82:0: [sdcc] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.329047] sd 1:0:107:0: [sddb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.329049] sd 1:0:107:0: [sddb] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.329372] sd 1:0:91:0: [sdcl] 15628053168 512-byte logical 
blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.329374] sd 1:0:91:0: [sdcl] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.329450] sd 1:0:109:0: [sddd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.329453] sd 1:0:109:0: [sddd] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.329460] sd 1:0:60:0: [sdbh] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.329467] sd 1:0:78:0: [sdby] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.329468] sd 1:0:78:0: [sdby] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.329578] sd 1:0:29:0: [sdac] Attached SCSI disk [Mon Dec 9 06:17:38 2019][ 31.329812] sd 1:0:102:0: [sdcw] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.329919] sd 1:0:63:0: [sdbj] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.329964] sd 1:0:60:0: [sdbh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.329991] sd 1:0:56:0: [sdbd] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330112] sd 1:0:48:0: [sdav] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330184] sd 1:0:103:0: [sdcx] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330217] sd 1:0:111:0: [sddf] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330242] sd 1:0:58:0: [sdbf] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330324] sd 1:0:51:0: [sday] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.330354] sd 1:0:82:0: [sdcc] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330379] sd 1:0:94:0: [sdco] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330390] sd 1:0:107:0: [sddb] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330491] sd 1:0:117:0: [sddl] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330492] sd 1:0:50:0: [sdax] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330504] sd 1:0:70:0: [sdbq] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330530] sd 1:0:90:0: [sdck] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330531] sd 1:0:112:0: [sddg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.330533] sd 1:0:112:0: [sddg] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.330578] sd 1:0:119:0: [sddn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.330579] sd 1:0:119:0: [sddn] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.330603] sd 1:0:91:0: [sdcl] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330686] sd 1:0:69:0: [sdbp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.330689] sd 1:0:69:0: [sdbp] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.330837] sd 1:0:102:0: [sdcw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.330894] sd 1:0:79:0: [sdbz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.330895] sd 1:0:79:0: [sdbz] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.330920] sd 1:0:71:0: [sdbr] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.331064] sd 1:0:63:0: [sdbj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331090] sd 1:0:109:0: [sddd] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.331167] sd 1:0:56:0: [sdbd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331174] sd 1:0:111:0: [sddf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 
06:17:38 2019][ 31.331181] sd 1:0:66:0: [sdbm] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.331196] sd 1:0:90:0: [sdck] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331217] sd 1:0:94:0: [sdco] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331281] sd 1:0:103:0: [sdcx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331346] sd 1:0:70:0: [sdbq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331355] sd 1:0:82:0: [sdcc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331401] sd 1:0:117:0: [sddl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331401] sd 1:0:50:0: [sdax] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331452] sd 1:0:48:0: [sdav] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331489] sd 1:0:119:0: [sddn] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.331505] sd 1:0:91:0: [sdcl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331576] sd 1:0:107:0: [sddb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331717] sd 1:0:58:0: [sdbf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331744] sd 1:0:69:0: [sdbp] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.331760] sd 1:0:95:0: [sdcp] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.332100] sd 1:0:79:0: [sdbz] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.332264] sd 1:0:78:0: [sdby] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.332573] sd 1:0:79:0: [sdbz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.332664] sd 1:0:34:0: [sdah] Attached SCSI disk [Mon Dec 9 06:17:38 2019][ 31.332817] sd 1:0:109:0: [sddd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.332827] sd 1:0:118:0: [sddm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.332828] sd 1:0:118:0: [sddm] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.332988] sd 1:0:121:0: [sddp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.332990] sd 1:0:121:0: [sddp] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.333073] sd 1:0:119:0: [sddn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.333113] sd 1:0:86:0: [sdcg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.333115] sd 1:0:86:0: [sdcg] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.333157] sd 1:0:71:0: [sdbr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.333165] sd 1:0:78:0: [sdby] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.333231] sd 1:0:5:0: [sde] Attached SCSI disk [Mon Dec 9 06:17:38 2019][ 31.333386] sd 1:0:66:0: [sdbm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.333707] sd 1:0:121:0: [sddp] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.333869] sd 1:0:57:0: [sdbe] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.334008] sd 1:0:84:0: [sdce] 
15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.334009] sd 1:0:84:0: [sdce] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.334041] sd 1:0:27:0: [sdaa] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.334094] sd 1:0:110:0: [sdde] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.334096] sd 1:0:110:0: [sdde] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.334169] sd 1:0:118:0: [sddm] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.334210] sd 1:0:88:0: [sdci] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.334211] sd 1:0:88:0: [sdci] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.334436] sd 1:0:73:0: [sdbt] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.334437] sd 1:0:73:0: [sdbt] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.334528] sd 1:0:106:0: [sdda] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.334535] sd 1:0:106:0: [sdda] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.334582] sd 1:0:49:0: [sdaw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.334677] sd 1:0:86:0: [sdcg] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.334825] sd 1:0:121:0: [sddp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.335113] sd 1:0:88:0: [sdci] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.335123] sd 1:0:12:0: [sdl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.335126] sd 1:0:45:0: [sdas] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.335148] sd 1:0:72:0: [sdbs] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.335149] sd 1:0:72:0: [sdbs] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.335348] sd 1:0:27:0: [sdaa] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.335426] sd 1:0:59:0: [sdbg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.335506] sd 1:0:108:0: [sddc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.335507] sd 1:0:108:0: [sddc] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.335730] sd 1:0:116:0: [sddk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.335732] sd 1:0:116:0: [sddk] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.335771] sd 1:0:42:0: [sdap] Attached SCSI disk [Mon Dec 9 06:17:38 2019][ 31.335927] sd 1:0:67:0: [sdbn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.335929] sd 1:0:67:0: [sdbn] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.335984] sd 1:0:30:0: [sdad] Attached SCSI disk [Mon Dec 9 06:17:38 2019][ 31.335995] sd 1:0:84:0: [sdce] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.335998] sd 1:0:106:0: [sdda] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.336012] sd 1:0:110:0: [sdde] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.336039] sd 1:0:120:0: [sddo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.336041] sd 1:0:120:0: [sddo] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.336053] sd 1:0:114:0: [sddi] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.336280] sd 1:0:83:0: [sdcd] 15628053168 512-byte logical blocks: (8.00 
TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.336282] sd 1:0:83:0: [sdcd] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.336562] sd 1:0:55:0: [sdbc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.336664] sd 1:0:88:0: [sdci] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.336742] sd 1:0:67:0: [sdbn] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.336793] sd 1:0:96:0: [sdcq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.336795] sd 1:0:96:0: [sdcq] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.336870] sd 1:0:116:0: [sddk] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.336899] sd 1:0:86:0: [sdcg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.337049] sd 1:0:85:0: [sdcf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.337050] sd 1:0:85:0: [sdcf] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.337204] sd 1:0:67:0: [sdbn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.337235] sd 1:0:21:0: [sdu] Attached SCSI disk [Mon Dec 9 06:17:38 2019][ 31.337283] sd 1:0:108:0: [sddc] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.337587] sd 1:0:114:0: [sddi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.337641] sd 1:0:106:0: [sdda] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.338015] sd 1:0:2:0: [sdb] Attached SCSI disk [Mon Dec 9 06:17:38 2019][ 31.338094] sd 1:0:84:0: [sdce] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.338147] sd 1:0:83:0: [sdcd] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.338262] sd 1:0:85:0: [sdcf] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.338380] sd 1:0:80:0: [sdca] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.338382] sd 1:0:80:0: [sdca] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.338394] sd 1:0:24:0: [sdx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.339023] sd 1:0:120:0: [sddo] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.339062] sd 1:0:116:0: [sddk] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.339316] sd 1:0:110:0: [sdde] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.339481] sd 1:0:83:0: [sdcd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.339694] sd 1:0:104:0: [sdcy] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.339727] sd 1:0:85:0: [sdcf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.339813] sd 1:0:73:0: [sdbt] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.339963] sd 1:0:115:0: [sddj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.339964] sd 1:0:115:0: [sddj] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.340086] sd 1:0:95:0: [sdcp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.340157] sd 1:0:97:0: [sdcr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.340159] sd 1:0:97:0: [sdcr] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.340548] sd 1:0:10:0: [sdj] Attached SCSI disk [Mon Dec 9 
06:17:38 2019][ 31.340678] sd 1:0:112:0: [sddg] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.340716] sd 1:0:96:0: [sdcq] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.340977] sd 1:0:98:0: [sdcs] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.341308] sd 1:0:22:0: [sdv] Attached SCSI disk [Mon Dec 9 06:17:38 2019][ 31.341475] sd 1:0:98:0: [sdcs] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.342032] sd 1:0:74:0: [sdbu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.342034] sd 1:0:74:0: [sdbu] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.342111] sd 1:0:73:0: [sdbt] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.342541] sd 1:0:112:0: [sddg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.342556] sd 1:0:99:0: [sdct] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.342592] sd 1:0:118:0: [sddm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.343012] sd 1:0:124:0: [sddr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.343014] sd 1:0:124:0: [sddr] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.343673] sd 1:0:120:0: [sddo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.344546] sd 1:0:74:0: [sdbu] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.344810] sd 1:0:53:0: [sdba] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.345294] sd 1:0:105:0: [sdcz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.345296] sd 1:0:105:0: [sdcz] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.345466] sd 1:0:87:0: [sdch] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.345467] sd 1:0:87:0: [sdch] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.345694] sd 1:0:80:0: [sdca] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.345706] sd 1:0:69:0: [sdbp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.345731] sd 1:0:108:0: [sddc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.345821] sd 1:0:9:0: [sdi] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.345855] sd 1:0:17:0: [sdq] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.346350] sd 1:0:64:0: [sdbk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.346352] sd 1:0:64:0: [sdbk] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.346360] sd 1:0:97:0: [sdcr] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.346751] sd 1:0:80:0: [sdca] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.347549] sd 1:0:96:0: [sdcq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.347697] sd 1:0:115:0: [sddj] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.347959] sd 1:0:105:0: [sdcz] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.348027] sd 1:0:99:0: [sdct] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.348617] sd 1:0:75:0: [sdbv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.348619] sd 1:0:75:0: [sdbv] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.349075] sd 1:0:37:0: [sdak] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.349144] sd 1:0:14:0: 
[sdn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.349806] sd 1:0:46:0: [sdat] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.350115] sd 1:0:126:0: [sddt] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.350117] sd 1:0:126:0: [sddt] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.350432] sd 1:0:75:0: [sdbv] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.350503] sd 1:0:72:0: [sdbs] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.351771] sd 1:0:72:0: [sdbs] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.351909] sd 1:0:127:0: [sddu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.351911] sd 1:0:127:0: [sddu] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.352074] sd 1:0:124:0: [sddr] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.352639] sd 1:0:75:0: [sdbv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.352647] sd 1:0:71:0: [sdbr] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.353090] sd 1:0:104:0: [sdcy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.353451] sd 1:0:128:0: [sddv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.353453] sd 1:0:128:0: [sddv] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.353505] sd 1:0:82:0: [sdcc] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.353969] sd 1:0:79:0: [sdbz] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.354962] sd 1:0:16:0: [sdp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.354964] sd 1:0:16:0: [sdp] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.355037] sd 1:0:64:0: [sdbk] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.355175] sd 1:0:38:0: [sdal] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.355278] sd 1:0:125:0: [sdds] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.355280] sd 1:0:125:0: [sdds] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.355427] sd 1:0:77:0: [sdbx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.355435] sd 1:0:77:0: [sdbx] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.355679] sd 1:0:97:0: [sdcr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.355827] sd 1:0:126:0: [sddt] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.356458] sd 1:0:130:0: [sddx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.356461] sd 1:0:130:0: [sddx] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.356710] sd 1:0:40:0: [sdan] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.356712] sd 1:0:40:0: [sdan] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.357847] sd 1:0:65:0: [sdbl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.357852] sd 1:0:65:0: [sdbl] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.357979] sd 1:0:101:0: [sdcv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.357989] sd 1:0:101:0: [sdcv] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.358131] sd 1:0:74:0: [sdbu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.358536] sd 1:0:64:0: [sdbk] Write cache: enabled, read cache: enabled, 
supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.358558] sd 1:0:40:0: [sdan] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.358726] sd 1:0:98:0: [sdcs] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.359194] sd 1:0:101:0: [sdcv] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.359415] sd 1:0:40:0: [sdan] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.359441] sd 1:0:115:0: [sddj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.360224] sd 1:0:125:0: [sdds] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.360701] sd 1:0:125:0: [sdds] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.360804] sd 1:0:101:0: [sdcv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.361072] sd 1:0:111:0: [sddf] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.362743] sd 1:0:70:0: [sdbq] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.362757] sd 1:0:13:0: [sdm] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.363084] sd 1:0:126:0: [sddt] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.363202] sd 1:0:127:0: [sddu] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.363983] sd 1:0:16:0: [sdp] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.364096] sd 1:0:77:0: [sdbx] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.364224] sd 1:0:50:0: [sdax] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.364563] sd 1:0:87:0: [sdch] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.364756] sd 1:0:135:0: [sdec] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.364758] sd 1:0:135:0: [sdec] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.364888] sd 1:0:131:0: [sddy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.364890] sd 1:0:131:0: [sddy] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.364930] sd 1:0:130:0: [sddx] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.365049] sd 1:0:87:0: [sdch] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.365091] sd 1:0:77:0: [sdbx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.365259] sd 1:0:114:0: [sddi] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.365321] sd 1:0:16:0: [sdp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.365393] sd 1:0:105:0: [sdcz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.365608] sd 1:0:131:0: [sddy] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.365762] sd 1:0:4:0: [sdd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.366074] sd 1:0:65:0: [sdbl] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.366234] sd 1:0:68:0: [sdbo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.366237] sd 1:0:68:0: [sdbo] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.366625] sd 1:0:129:0: [sddw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.366627] sd 1:0:129:0: [sddw] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.366637] sd 1:0:131:0: [sddy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.368114] sd 1:0:127:0: [sddu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 
06:17:39 2019][ 31.368291] sd 1:0:65:0: [sdbl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.368849] sd 1:0:15:0: [sdo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.368851] sd 1:0:15:0: [sdo] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.370291] sd 1:0:14:0: [sdn] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.370723] sd 1:0:130:0: [sddx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.370787] sd 1:0:132:0: [sddz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.370789] sd 1:0:132:0: [sddz] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.371544] sd 1:0:132:0: [sddz] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.371863] sd 1:0:92:0: [sdcm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.371865] sd 1:0:92:0: [sdcm] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.372065] sd 1:0:138:0: [sdef] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.372066] sd 1:0:138:0: [sdef] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.372273] sd 1:0:124:0: [sddr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.372600] sd 1:0:132:0: [sddz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.372747] sd 1:0:135:0: [sdec] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.373529] sd 1:0:41:0: [sdao] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.373541] sd 1:0:11:0: [sdk] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.374688] sd 1:0:136:0: [sded] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.374690] sd 1:0:136:0: [sded] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.374821] sd 1:0:33:0: [sdag] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.374907] sd 1:0:139:0: [sdeg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.374909] sd 1:0:139:0: [sdeg] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.375093] sd 1:0:75:0: [sdbv] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.375476] sd 1:0:39:0: [sdam] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.375477] sd 1:0:39:0: [sdam] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.375664] sd 1:0:94:0: [sdco] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.375960] sd 1:0:76:0: [sdbw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.375975] sd 1:0:76:0: [sdbw] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.376306] sd 1:0:129:0: [sddw] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.376355] sd 1:0:15:0: [sdo] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.376464] sd 1:0:61:0: [sdbi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.376466] sd 1:0:61:0: [sdbi] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.376602] sd 1:0:100:0: [sdcu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.376604] sd 1:0:100:0: [sdcu] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.376780] sd 1:0:129:0: [sddw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.377123] sd 1:0:15:0: [sdo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.377306] sd 1:0:39:0: [sdam] Write 
Protect is off [Mon Dec 9 06:17:39 2019][ 31.377591] sd 1:0:128:0: [sddv] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.377709] sd 1:0:100:0: [sdcu] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.377821] sd 1:0:102:0: [sdcw] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.378073] sd 1:0:128:0: [sddv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.379159] sd 1:0:72:0: [sdbs] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.379900] sd 1:0:137:0: [sdee] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.379902] sd 1:0:137:0: [sdee] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.380548] sd 1:0:28:0: [sdab] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.381405] sd 1:0:140:0: [sdeh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.381414] sd 1:0:140:0: [sdeh] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.381769] sd 1:0:135:0: [sdec] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.383578] sd 1:0:24:0: [sdx] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.384119] sd 1:0:99:0: [sdct] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.384375] sd 1:0:107:0: [sddb] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.384437] sd 1:0:85:0: [sdcf] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.384979] sd 1:0:138:0: [sdef] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.385077] sd 1:0:136:0: [sded] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.385454] sd 1:0:138:0: [sdef] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.385682] sd 1:0:12:0: [sdl] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.386005] sd 1:0:83:0: [sdcd] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.387022] sd 1:0:54:0: [sdbb] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.387032] sd 1:0:44:0: [sdar] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.387611] sd 1:0:95:0: [sdcp] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.388033] sd 1:0:43:0: [sdaq] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.388229] sd 1:0:93:0: [sdcn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.388231] sd 1:0:93:0: [sdcn] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.388600] sd 1:0:73:0: [sdbt] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.389173] sd 1:0:122:0: [sddq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.389175] sd 1:0:122:0: [sddq] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.389581] sd 1:0:35:0: [sdai] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.390099] sd 1:0:52:0: [sdaz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.390197] sd 1:0:76:0: [sdbw] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.390241] sd 1:0:48:0: [sdav] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.390820] sd 1:0:137:0: [sdee] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.391298] sd 1:0:137:0: [sdee] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.392274] sd 1:0:106:0: [sdda] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.392435] sd 1:0:142:0: [sdej] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.392437] sd 1:0:142:0: [sdej] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.392865] sd 1:0:141:0: [sdei] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 
2019][ 31.392866] sd 1:0:141:0: [sdei] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.392949] sd 1:0:139:0: [sdeg] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.394364] sd 1:0:56:0: [sdbd] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.395287] sd 1:0:134:0: [sdeb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.395289] sd 1:0:134:0: [sdeb] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.395693] sd 1:0:45:0: [sdas] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.395894] sd 1:0:47:0: [sdau] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.396270] sd 1:0:19:0: [sds] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.398141] sd 1:0:133:0: [sdea] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.398143] sd 1:0:133:0: [sdea] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.398151] sd 1:0:117:0: [sddl] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.398549] sd 1:0:140:0: [sdeh] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.398601] sd 1:0:139:0: [sdeg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.398737] sd 1:0:142:0: [sdej] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.399026] sd 1:0:140:0: [sdeh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.399049] sd 1:0:104:0: [sdcy] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.399680] sd 1:0:26:0: [sdz] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.399700] sd 1:0:109:0: [sddd] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.400122] sd 1:0:134:0: [sdeb] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.400781] sd 1:0:55:0: [sdbc] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.400785] sd 1:0:143:0: [sdek] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.400787] sd 1:0:143:0: [sdek] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.401579] sd 1:0:143:0: [sdek] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.402565] sd 1:0:68:0: [sdbo] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.403091] sd 1:0:87:0: [sdch] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.403228] sd 1:0:136:0: [sded] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.403422] sd 1:0:119:0: [sddn] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.403737] sd 1:0:96:0: [sdcq] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.404281] sd 1:0:58:0: [sdbf] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.404343] sd 1:0:59:0: [sdbg] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.404345] sd 1:0:68:0: [sdbo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.404716] sd 1:0:80:0: [sdca] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.405477] sd 1:0:141:0: [sdei] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.405801] sd 1:0:134:0: [sdeb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.405854] sd 1:0:74:0: [sdbu] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.405895] sd 1:0:89:0: [sdcj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.405896] sd 1:0:89:0: [sdcj] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.405932] sd 1:0:141:0: [sdei] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.406792] sd 1:0:89:0: [sdcj] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.407018] sd 1:0:28:0: [sdab] Write cache: enabled, 
read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.408360] sd 1:0:89:0: [sdcj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.408978] sd 1:0:126:0: [sddt] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.409127] sd 1:0:125:0: [sdds] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.409454] sd 1:0:49:0: [sdaw] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.409930] sd 1:0:142:0: [sdej] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.410423] sd 1:0:115:0: [sddj] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.411610] sd 1:0:25:0: [sdy] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.412304] sd 1:0:116:0: [sddk] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.412471] sd 1:0:110:0: [sdde] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.413860] sd 1:0:113:0: [sddh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.413862] sd 1:0:113:0: [sddh] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.413921] sd 1:0:7:0: [sdg] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.415027] sd 1:0:86:0: [sdcg] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.415927] sd 1:0:60:0: [sdbh] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.418926] sd 1:0:124:0: [sddr] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.418940] sd 1:0:93:0: [sdcn] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.419767] sd 1:0:130:0: [sddx] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.419809] sd 1:0:108:0: [sddc] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.420474] sd 1:0:93:0: [sdcn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.422160] sd 1:0:127:0: [sddu] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.422775] sd 1:0:129:0: [sddw] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.422788] sd 1:0:36:0: [sdaj] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.422905] sd 1:0:145:0: [sdem] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.422906] sd 1:0:145:0: [sdem] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.423164] sd 1:0:120:0: [sddo] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.423643] sd 1:0:23:0: [sdw] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.425946] sd 1:0:32:0: [sdaf] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.425973] sd 1:0:90:0: [sdck] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.426331] sd 1:0:136:0: [sded] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.426811] sd 1:0:105:0: [sdcz] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.426843] sd 1:0:97:0: [sdcr] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.427286] sd 1:0:61:0: [sdbi] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.427356] sd 1:0:143:0: [sdek] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.427722] sd 1:0:39:0: [sdam] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.428516] sd 1:0:100:0: [sdcu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.430554] sd 1:0:138:0: [sdef] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.430788] sd 1:0:145:0: [sdem] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.431092] sd 1:0:121:0: [sddp] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.434480] sd 1:0:84:0: [sdce] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.434925] sd 1:0:57:0: [sdbe] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.435025] sd 
1:0:113:0: [sddh] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.435048] sd 1:0:64:0: [sdbk] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.435629] sd 1:0:128:0: [sddv] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.435692] sd 1:0:135:0: [sdec] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.436055] sd 1:0:144:0: [sdel] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.436056] sd 1:0:144:0: [sdel] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.436439] sd 1:0:133:0: [sdea] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.436681] sd 1:0:118:0: [sddm] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.437585] sd 1:0:122:0: [sddq] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.438531] sd 1:0:146:0: [sden] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.438533] sd 1:0:146:0: [sden] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.438872] sd 1:0:92:0: [sdcm] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.440131] sd 1:0:137:0: [sdee] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.443487] sd 1:0:4:0: [sdd] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.444658] sd 1:0:145:0: [sdem] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.445854] sd 1:0:81:0: [sdcb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.445855] sd 1:0:81:0: [sdcb] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.446167] sd 1:0:61:0: [sdbi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.446452] sd 1:0:139:0: [sdeg] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.446465] sd 1:0:122:0: [sddq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.447110] sd 1:0:133:0: [sdea] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.447213] sd 1:0:146:0: [sden] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.447632] sd 1:0:20:0: [sdt] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.452168] sd 1:0:147:0: [sdeo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.452170] sd 1:0:147:0: [sdeo] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.452899] sd 1:0:8:0: [sdh] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.452901] sd 1:0:147:0: [sdeo] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.453369] sd 1:0:147:0: [sdeo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.454472] sd 1:0:141:0: [sdei] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.455955] sd 1:0:76:0: [sdbw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.457261] sd 1:0:113:0: [sddh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.459607] sd 1:0:146:0: [sden] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.465876] sd 1:0:144:0: [sdel] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.467035] sd 1:0:148:0: [sdep] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.467036] sd 1:0:148:0: [sdep] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.467749] sd 1:0:148:0: [sdep] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.468208] sd 1:0:148:0: [sdep] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.474997] sd 1:0:140:0: [sdeh] Attached SCSI disk [Mon 
Dec 9 06:17:40 2019][ 31.477250] sd 1:0:149:0: [sdeq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.477252] sd 1:0:149:0: [sdeq] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.477263] sd 1:0:93:0: [sdcn] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.477303] sd 1:0:134:0: [sdeb] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.480094] sd 1:0:103:0: [sdcx] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.482543] sd 1:0:91:0: [sdcl] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.484367] sd 1:0:92:0: [sdcm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.485181] sd 1:0:150:0: [sder] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.485183] sd 1:0:150:0: [sder] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.489257] sd 1:0:81:0: [sdcb] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.490095] sd 1:0:149:0: [sdeq] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.490858] sd 1:0:67:0: [sdbn] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.491291] sd 1:0:27:0: [sdaa] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.492160] sd 1:0:88:0: [sdci] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.493430] sd 1:0:28:0: [sdab] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.495996] sd 1:0:147:0: [sdeo] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.497534] sd 1:0:69:0: [sdbp] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.497651] sd 1:0:151:0: [sdes] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.497652] sd 1:0:151:0: [sdes] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.498366] sd 1:0:151:0: [sdes] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.498609] sd 1:0:78:0: [sdby] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.498663] sd 1:0:51:0: [sday] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.498687] sd 1:0:150:0: [sder] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.498840] sd 1:0:151:0: [sdes] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.499167] sd 1:0:150:0: [sder] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.500318] sd 1:0:149:0: [sdeq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.503253] sd 1:0:89:0: [sdcj] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.505600] sd 1:0:144:0: [sdel] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.507270] sd 1:0:63:0: [sdbj] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.508035] sd 1:0:133:0: [sdea] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.509931] sd 1:0:65:0: [sdbl] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.511466] sd 1:0:112:0: [sddg] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.511937] sd 1:0:146:0: [sden] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.515360] sd 1:0:39:0: [sdam] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.516709] sd 1:0:152:0: [sdet] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.516711] sd 1:0:152:0: [sdet] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.516905] sd 1:0:92:0: [sdcm] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.517416] sd 1:0:152:0: [sdet] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.517883] sd 1:0:152:0: [sdet] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.523403] sd 1:0:81:0: [sdcb] Write cache: 
enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.523598] sd 1:0:16:0: [sdp] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.524561] sd 1:0:77:0: [sdbx] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.524895] sd 1:0:66:0: [sdbm] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.526000] sd 1:0:100:0: [sdcu] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.532011] sd 1:0:148:0: [sdep] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.533355] sd 1:0:122:0: [sddq] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.533396] sd 1:0:145:0: [sdem] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.533652] sd 1:0:142:0: [sdej] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.536070] sd 1:0:153:0: [sdeu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.536072] sd 1:0:153:0: [sdeu] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.536792] sd 1:0:153:0: [sdeu] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.537264] sd 1:0:153:0: [sdeu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.538383] sd 1:0:52:0: [sdaz] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.540923] sd 1:0:149:0: [sdeq] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.541726] sd 1:0:154:0: [sdev] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.541727] sd 1:0:154:0: [sdev] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.541736] sd 1:0:113:0: [sddh] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.541833] sd 1:0:61:0: [sdbi] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.546924] sd 1:0:154:0: [sdev] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.547802] sd 1:0:155:0: [sdew] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.547804] sd 1:0:155:0: [sdew] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.548779] sd 1:0:131:0: [sddy] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.550506] sd 1:0:132:0: [sddz] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.550631] sd 1:0:154:0: [sdev] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.553812] sd 1:0:81:0: [sdcb] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.556737] sd 1:0:155:0: [sdew] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.556772] sd 1:0:68:0: [sdbo] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.563645] sd 1:0:152:0: [sdet] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.569059] sd 1:0:156:0: [sdex] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.569061] sd 1:0:156:0: [sdex] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.574729] sd 1:0:158:0: [sdez] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.574732] sd 1:0:158:0: [sdez] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.575458] sd 1:0:158:0: [sdez] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.575942] sd 1:0:158:0: [sdez] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.577140] sd 1:0:151:0: [sdes] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.577349] sd 1:0:150:0: [sder] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.579122] sd 1:0:156:0: [sdex] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.581150] sd 1:0:153:0: [sdeu] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.582323] sd 1:0:159:0: [sdfa] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.582325] sd 1:0:159:0: [sdfa] 
4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.583024] sd 1:0:159:0: [sdfa] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.583469] sd 1:0:159:0: [sdfa] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.585980] sd 1:0:160:0: [sdfb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.585982] sd 1:0:160:0: [sdfb] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.586702] sd 1:0:160:0: [sdfb] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.586801] sd 1:0:157:0: [sdey] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.586803] sd 1:0:157:0: [sdey] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.587184] sd 1:0:160:0: [sdfb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.587379] sd 1:0:154:0: [sdev] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.593130] sd 1:0:157:0: [sdey] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.593539] sd 1:0:161:0: [sdfc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.593541] sd 1:0:161:0: [sdfc] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.594287] sd 1:0:161:0: [sdfc] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.594762] sd 1:0:161:0: [sdfc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.598132] sd 1:0:162:0: [sdfd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.598133] sd 1:0:162:0: [sdfd] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.600843] sd 1:0:163:0: [sdfe] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.600844] sd 1:0:163:0: [sdfe] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.605308] sd 1:0:164:0: [sdff] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.605309] sd 1:0:164:0: [sdff] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.605831] sd 1:0:162:0: [sdfd] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.606040] sd 1:0:164:0: [sdff] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.606519] sd 1:0:164:0: [sdff] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.608864] sd 1:0:155:0: [sdew] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.609682] sd 1:0:156:0: [sdex] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.611626] sd 1:0:163:0: [sdfe] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.611907] sd 1:0:166:0: [sdfh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.611908] sd 1:0:166:0: [sdfh] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.612458] sd 1:0:157:0: [sdey] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.612622] sd 1:0:166:0: [sdfh] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.613089] sd 1:0:166:0: [sdfh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.613531] sd 1:0:101:0: [sdcv] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.614391] sd 1:0:40:0: [sdan] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.614450] sd 1:0:167:0: [sdfi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.614451] sd 1:0:167:0: [sdfi] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.614575] sd 1:0:163:0: [sdfe] Write cache: 
enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.614953] sd 1:0:159:0: [sdfa] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.616728] sd 1:0:165:0: [sdfg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.616730] sd 1:0:165:0: [sdfg] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.617641] sd 1:0:169:0: [sdfk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.617643] sd 1:0:169:0: [sdfk] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.617721] sd 1:0:162:0: [sdfd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.620471] sd 1:0:173:0: [sdfo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.620472] sd 1:0:173:0: [sdfo] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.620986] sd 1:0:175:0: [sdfq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.620988] sd 1:0:175:0: [sdfq] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.621717] sd 1:0:175:0: [sdfq] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.622060] sd 1:0:165:0: [sdfg] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.622178] sd 1:0:175:0: [sdfq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.622535] sd 1:0:165:0: [sdfg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.622589] sd 1:0:160:0: [sdfb] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.623098] sd 1:0:177:0: [sdfs] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.623099] sd 1:0:177:0: [sdfs] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.624165] sd 1:0:170:0: [sdfl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.624167] sd 1:0:170:0: [sdfl] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.624195] sd 1:0:178:0: [sdft] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.624196] sd 1:0:178:0: [sdft] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.624240] sd 1:0:167:0: [sdfi] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.624636] sd 1:0:174:0: [sdfp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.624637] sd 1:0:174:0: [sdfp] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.624704] sd 1:0:167:0: [sdfi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.625390] sd 1:0:179:0: [sdfu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.625392] sd 1:0:179:0: [sdfu] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.626642] sd 1:0:180:0: [sdfv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.626644] sd 1:0:180:0: [sdfv] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.626743] sd 1:0:169:0: [sdfk] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.627373] sd 1:0:180:0: [sdfv] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.628271] sd 1:0:180:0: [sdfv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.628669] sd 1:0:176:0: [sdfr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.628671] sd 1:0:176:0: [sdfr] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.628763] sd 1:0:181:0: [sdfw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 
06:17:41 2019][ 31.628764] sd 1:0:181:0: [sdfw] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.629083] sd 1:0:172:0: [sdfn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.629085] sd 1:0:172:0: [sdfn] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.630233] sd 1:0:15:0: [sdo] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.631173] sd 1:0:173:0: [sdfo] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.631787] sd 1:0:168:0: [sdfj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.631789] sd 1:0:168:0: [sdfj] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.631827] sd 1:0:171:0: [sdfm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.631828] sd 1:0:171:0: [sdfm] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.632986] sd 1:0:174:0: [sdfp] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.632999] sd 1:0:183:0: [sdfy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.633001] sd 1:0:183:0: [sdfy] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.633464] sd 1:0:174:0: [sdfp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.633714] sd 1:0:183:0: [sdfy] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.634184] sd 1:0:183:0: [sdfy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.634476] sd 1:0:179:0: [sdfu] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.635235] sd 1:0:168:0: [sdfj] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.636051] sd 1:0:185:0: [sdfz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.636053] sd 1:0:185:0: [sdfz] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.636443] sd 1:0:177:0: [sdfs] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.636611] sd 1:0:181:0: [sdfw] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.636759] sd 1:0:185:0: [sdfz] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.637091] sd 1:0:177:0: [sdfs] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.637194] sd 1:0:170:0: [sdfl] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.637211] sd 1:0:185:0: [sdfz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.637304] sd 1:0:186:0: [sdga] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.637306] sd 1:0:186:0: [sdga] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.637654] sd 1:0:170:0: [sdfl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.637934] sd 1:0:178:0: [sdft] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.638317] sd 1:0:182:0: [sdfx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.638319] sd 1:0:182:0: [sdfx] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.638354] sd 1:0:169:0: [sdfk] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.638658] sd 1:0:187:0: [sdgb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.638660] sd 1:0:187:0: [sdgb] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.639367] sd 1:0:187:0: [sdgb] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.639602] sd 1:0:188:0: [sdgc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.639603] sd 1:0:188:0: [sdgc] 4096-byte 
physical blocks [Mon Dec 9 06:17:42 2019][ 31.639839] sd 1:0:187:0: [sdgb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.640288] sd 1:0:172:0: [sdfn] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.640568] sd 1:0:189:0: [sdgd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.640570] sd 1:0:189:0: [sdgd] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.640762] sd 1:0:172:0: [sdfn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.641279] sd 1:0:189:0: [sdgd] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.641316] sd 1:0:190:0: [sdge] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.641318] sd 1:0:190:0: [sdge] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.641570] sd 1:0:182:0: [sdfx] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.641758] sd 1:0:189:0: [sdgd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.642041] sd 1:0:182:0: [sdfx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.642297] sd 1:0:171:0: [sdfm] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.642436] sd 1:0:164:0: [sdff] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.642527] sd 1:0:176:0: [sdfr] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.642775] sd 1:0:171:0: [sdfm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.642799] sd 1:0:191:0: [sdgf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.642800] sd 1:0:191:0: [sdgf] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.643172] sd 1:0:179:0: [sdfu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.643615] sd 1:0:166:0: [sdfh] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.644213] sd 1:0:173:0: [sdfo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.644364] sd 1:0:178:0: [sdft] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.645335] sd 1:0:181:0: [sdfw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.648762] sd 1:0:194:0: [sdgi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.648764] sd 1:0:194:0: [sdgi] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.649970] sd 1:0:186:0: [sdga] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.650445] sd 1:0:186:0: [sdga] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.650771] sd 1:0:191:0: [sdgf] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.651249] sd 1:0:191:0: [sdgf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.651364] sd 1:0:196:0: [sdgk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.651367] sd 1:0:196:0: [sdgk] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.651616] sd 1:0:176:0: [sdfr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.652092] sd 1:0:196:0: [sdgk] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.652368] sd 1:0:197:0: [sdgl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.652369] sd 1:0:197:0: [sdgl] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.652563] sd 1:0:196:0: [sdgk] Write 
cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.653100] sd 1:0:197:0: [sdgl] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.653568] sd 1:0:197:0: [sdgl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.654264] sd 1:0:198:0: [sdgm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.654265] sd 1:0:175:0: [sdfq] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.654266] sd 1:0:198:0: [sdgm] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.654780] sd 1:0:190:0: [sdge] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.654977] sd 1:0:198:0: [sdgm] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.655356] sd 1:0:190:0: [sdge] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.655444] sd 1:0:198:0: [sdgm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.655928] sd 1:0:199:0: [sdgn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.655930] sd 1:0:199:0: [sdgn] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.656656] sd 1:0:199:0: [sdgn] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.657135] sd 1:0:199:0: [sdgn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.657620] sd 1:0:194:0: [sdgi] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.658685] sd 1:0:188:0: [sdgc] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.659506] sd 1:0:194:0: [sdgi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.659904] sd 1:0:201:0: [sdgp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.659906] sd 1:0:201:0: [sdgp] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.661091] sd 1:0:180:0: [sdfv] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.661873] sd 1:0:202:0: [sdgq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.661875] sd 1:0:202:0: [sdgq] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.662599] sd 1:0:202:0: [sdgq] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.663063] sd 1:0:202:0: [sdgq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.663449] sd 1:0:182:0: [sdfx] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.663610] sd 1:0:203:0: [sdgr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.663612] sd 1:0:203:0: [sdgr] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.664319] sd 1:0:203:0: [sdgr] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.664390] sd 1:0:191:0: [sdgf] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.664788] sd 1:0:203:0: [sdgr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.665250] sd 1:0:196:0: [sdgk] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.666033] sd 1:0:172:0: [sdfn] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.666333] sd 1:0:197:0: [sdgl] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.667641] sd 1:0:200:0: [sdgo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.667643] sd 1:0:200:0: [sdgo] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.668320] sd 1:0:179:0: [sdfu] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.668361] sd 1:0:200:0: [sdgo] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.668591] sd 1:0:157:0: [sdey] Attached SCSI disk [Mon Dec 9 
06:17:42 2019][ 31.668828] sd 1:0:200:0: [sdgo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.669704] sd 1:0:206:0: [sdgu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.669706] sd 1:0:206:0: [sdgu] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.670125] sd 1:0:170:0: [sdfl] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.670471] sd 1:0:168:0: [sdfj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.671632] sd 1:0:201:0: [sdgp] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.672092] sd 1:0:201:0: [sdgp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.672351] sd 1:0:188:0: [sdgc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.674242] sd 1:0:165:0: [sdfg] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.674254] sd 1:0:163:0: [sdfe] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.676884] sd 1:0:161:0: [sdfc] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.676922] sd 1:0:208:0: [sdgw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.676924] sd 1:0:208:0: [sdgw] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.677021] sd 1:0:186:0: [sdga] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.677621] sd 1:0:208:0: [sdgw] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.677744] sd 1:0:202:0: [sdgq] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.678065] sd 1:0:208:0: [sdgw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.678292] sd 1:0:185:0: [sdfz] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.678640] sd 1:0:183:0: [sdfy] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.678758] sd 1:0:203:0: [sdgr] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.679754] sd 1:0:209:0: [sdgx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.679756] sd 1:0:209:0: [sdgx] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.679913] sd 1:0:173:0: [sdfo] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.680445] sd 1:0:189:0: [sdgd] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.680464] sd 1:0:209:0: [sdgx] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.681840] sd 1:0:181:0: [sdfw] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.682232] sd 1:0:200:0: [sdgo] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.682533] sd 1:0:210:0: [sdgy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.682534] sd 1:0:210:0: [sdgy] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.682697] sd 1:0:207:0: [sdgv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.682698] sd 1:0:207:0: [sdgv] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.683278] sd 1:0:210:0: [sdgy] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.683389] sd 1:0:169:0: [sdfk] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.683758] sd 1:0:210:0: [sdgy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.684023] sd 1:0:174:0: [sdfp] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.684805] sd 1:0:211:0: [sdgz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.684808] sd 1:0:211:0: [sdgz] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.685019] sd 1:0:171:0: [sdfm] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.685414] sd 1:0:190:0: [sdge] 
Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.685519] sd 1:0:211:0: [sdgz] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.685972] sd 1:0:211:0: [sdgz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.687622] sd 1:0:212:0: [sdha] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.687624] sd 1:0:212:0: [sdha] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.687625] sd 1:0:193:0: [sdgh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.687627] sd 1:0:193:0: [sdgh] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.687666] sd 1:0:195:0: [sdgj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.687668] sd 1:0:195:0: [sdgj] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.688337] sd 1:0:212:0: [sdha] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.688392] sd 1:0:195:0: [sdgj] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.688809] sd 1:0:212:0: [sdha] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.688872] sd 1:0:195:0: [sdgj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.688892] sd 1:0:206:0: [sdgu] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.688962] sd 1:0:207:0: [sdgv] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.689347] sd 1:0:176:0: [sdfr] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.689715] sd 1:0:213:0: [sdhb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.689716] sd 1:0:213:0: [sdhb] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.690447] sd 1:0:213:0: [sdhb] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.690509] sd 1:0:192:0: [sdgg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.690511] sd 1:0:192:0: [sdgg] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.690920] sd 1:0:213:0: [sdhb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.691189] sd 1:0:162:0: [sdfd] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.691722] sd 1:0:208:0: [sdgw] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.691898] sd 1:0:214:0: [sdhc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.691899] sd 1:0:214:0: [sdhc] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.692591] sd 1:0:214:0: [sdhc] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.693060] sd 1:0:214:0: [sdhc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.693666] sd 1:0:206:0: [sdgu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.695225] sd 1:0:215:0: [sdhd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.695227] sd 1:0:215:0: [sdhd] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.695917] sd 1:0:215:0: [sdhd] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.696213] sd 1:0:158:0: [sdez] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.696360] sd 1:0:215:0: [sdhd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.696600] sd 1:0:198:0: [sdgm] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.702201] sd 1:0:188:0: [sdgc] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.702303] sd 1:0:207:0: [sdgv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.702375] sd 
1:0:209:0: [sdgx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.703051] sd 1:0:187:0: [sdgb] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.706269] sd 1:0:201:0: [sdgp] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.707388] sd 1:0:218:0: [sdhg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.707390] sd 1:0:218:0: [sdhg] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.708096] sd 1:0:218:0: [sdhg] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.708199] sd 1:0:205:0: [sdgt] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.708201] sd 1:0:205:0: [sdgt] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.708286] sd 1:0:195:0: [sdgj] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.708567] sd 1:0:218:0: [sdhg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.708715] sd 1:0:214:0: [sdhc] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.710282] sd 1:0:215:0: [sdhd] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.711578] sd 1:0:194:0: [sdgi] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.711657] sd 1:0:219:0: [sdhh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.711658] sd 1:0:219:0: [sdhh] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.712364] sd 1:0:219:0: [sdhh] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.712828] sd 1:0:219:0: [sdhh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.714037] sd 1:0:220:0: [sdhi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.714038] sd 1:0:220:0: [sdhi] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.714755] sd 1:0:220:0: [sdhi] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.715199] sd 1:0:220:0: [sdhi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.716309] sd 1:0:209:0: [sdgx] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.716949] sd 1:0:221:0: [sdhj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.716951] sd 1:0:221:0: [sdhj] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.717668] sd 1:0:221:0: [sdhj] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.718141] sd 1:0:221:0: [sdhj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.719137] sd 1:0:204:0: [sdgs] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.719139] sd 1:0:204:0: [sdgs] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.720644] sd 1:0:222:0: [sdhk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.720646] sd 1:0:222:0: [sdhk] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.721349] sd 1:0:222:0: [sdhk] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.721821] sd 1:0:222:0: [sdhk] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.721928] sd 1:0:211:0: [sdgz] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.723019] sd 1:0:193:0: [sdgh] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.723198] sd 1:0:192:0: [sdgg] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.723441] sd 1:0:207:0: [sdgv] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.725122] sd 1:0:223:0: [sdhl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.725124] sd 1:0:223:0: [sdhl] 4096-byte physical 
blocks [Mon Dec 9 06:17:43 2019][ 31.725320] sd 1:0:199:0: [sdgn] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.725816] sd 1:0:205:0: [sdgt] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.725862] sd 1:0:223:0: [sdhl] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.726354] sd 1:0:223:0: [sdhl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.728607] sd 1:0:224:0: [sdhm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.728610] sd 1:0:224:0: [sdhm] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.729316] sd 1:0:224:0: [sdhm] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.729785] sd 1:0:224:0: [sdhm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.730816] sd 1:0:220:0: [sdhi] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.733715] sd 1:0:225:0: [sdhn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.733716] sd 1:0:225:0: [sdhn] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.734096] sd 1:0:205:0: [sdgt] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.734458] sd 1:0:225:0: [sdhn] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.734923] sd 1:0:225:0: [sdhn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.735305] sd 1:0:210:0: [sdgy] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.735749] sd 1:0:206:0: [sdgu] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.736066] sd 1:0:226:0: [sdho] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.736069] sd 1:0:226:0: [sdho] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.736826] sd 1:0:226:0: [sdho] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.737288] sd 1:0:226:0: [sdho] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.737746] sd 1:0:222:0: [sdhk] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.739470] sd 1:0:218:0: [sdhg] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.739810] sd 1:0:227:0: [sdhp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.739811] sd 1:0:227:0: [sdhp] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.739909] sd 1:0:204:0: [sdgs] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.740366] sd 1:0:223:0: [sdhl] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.740532] sd 1:0:227:0: [sdhp] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.740978] sd 1:0:227:0: [sdhp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.742090] sd 1:0:212:0: [sdha] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.742092] sd 1:0:221:0: [sdhj] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.743788] sd 1:0:204:0: [sdgs] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.743932] sd 1:0:224:0: [sdhm] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.749052] sd 1:0:225:0: [sdhn] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.749252] sd 1:0:213:0: [sdhb] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.750510] sd 1:0:192:0: [sdgg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.750683] sd 1:0:143:0: [sdek] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.751003] sd 1:0:216:0: [sdhe] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.751005] sd 1:0:216:0: [sdhe] 4096-byte 
physical blocks [Mon Dec 9 06:17:43 2019][ 31.751826] sd 1:0:226:0: [sdho] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.754532] sd 1:0:219:0: [sdhh] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.754892] sd 1:0:227:0: [sdhp] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.755256] sd 1:0:76:0: [sdbw] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.756126] sd 1:0:231:0: [sdht] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.756127] sd 1:0:231:0: [sdht] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.756637] sd 1:0:193:0: [sdgh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.756837] sd 1:0:231:0: [sdht] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.757308] sd 1:0:231:0: [sdht] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.760958] sd 1:0:232:0: [sdhu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.760960] sd 1:0:232:0: [sdhu] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.761687] sd 1:0:232:0: [sdhu] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.762173] sd 1:0:232:0: [sdhu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.763937] sd 1:0:233:0: [sdhv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.763939] sd 1:0:233:0: [sdhv] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.764160] sd 1:0:230:0: [sdhs] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.764162] sd 1:0:230:0: [sdhs] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.764656] sd 1:0:233:0: [sdhv] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.764764] sd 1:0:204:0: [sdgs] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.764894] sd 1:0:230:0: [sdhs] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.765111] sd 1:0:233:0: [sdhv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.765421] sd 1:0:230:0: [sdhs] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.767089] sd 1:0:234:0: [sdhw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.767091] sd 1:0:234:0: [sdhw] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.767359] sd 1:0:228:0: [sdhq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.767361] sd 1:0:228:0: [sdhq] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.767812] sd 1:0:234:0: [sdhw] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.768847] sd 1:0:234:0: [sdhw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.768909] sd 1:0:217:0: [sdhf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.768914] sd 1:0:217:0: [sdhf] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.769023] sd 1:0:228:0: [sdhq] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.770648] sd 1:0:193:0: [sdgh] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.770690] sd 1:0:231:0: [sdht] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.774144] sd 1:0:236:0: [sdhy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.774146] sd 1:0:236:0: [sdhy] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.776016] sd 1:0:237:0: [sdhz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.776018] sd 1:0:237:0: [sdhz] 
4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.776762] sd 1:0:237:0: [sdhz] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.777227] sd 1:0:237:0: [sdhz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.777838] sd 1:0:144:0: [sdel] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.777963] sd 1:0:228:0: [sdhq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.778636] sd 1:0:230:0: [sdhs] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.782302] sd 1:0:229:0: [sdhr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.782304] sd 1:0:229:0: [sdhr] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.783920] sd 1:0:240:0: [sdic] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.783922] sd 1:0:240:0: [sdic] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.784449] sd 1:0:236:0: [sdhy] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.784695] sd 1:0:240:0: [sdic] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.784699] sd 1:0:216:0: [sdhe] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.784908] sd 1:0:236:0: [sdhy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.785185] sd 1:0:240:0: [sdic] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.786479] sd 1:0:241:0: [sdid] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.786481] sd 1:0:241:0: [sdid] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.787211] sd 1:0:241:0: [sdid] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.787691] sd 1:0:241:0: [sdid] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.788313] sd 1:0:235:0: [sdhx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.788315] sd 1:0:235:0: [sdhx] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.788373] sd 1:0:242:0: [sdie] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.788375] sd 1:0:242:0: [sdie] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.789143] sd 1:0:242:0: [sdie] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.789166] sd 1:0:217:0: [sdhf] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.789611] sd 1:0:242:0: [sdie] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.789860] sd 1:0:243:0: [sdif] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.789862] sd 1:0:243:0: [sdif] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.790582] sd 1:0:243:0: [sdif] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.790793] sd 1:0:237:0: [sdhz] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.791055] sd 1:0:243:0: [sdif] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.800363] sd 1:0:244:0: [sdig] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.800364] sd 1:0:244:0: [sdig] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.801019] sd 1:0:192:0: [sdgg] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.801076] sd 1:0:244:0: [sdig] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.801534] sd 1:0:244:0: [sdig] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.802027] sd 1:0:229:0: [sdhr] Write Protect is off [Mon Dec 9 06:17:43 2019][ 
31.802915] sd 1:0:235:0: [sdhx] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.806075] sd 1:0:235:0: [sdhx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.812896] sd 1:0:217:0: [sdhf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.813506] sd 1:0:241:0: [sdid] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.813541] sd 1:0:244:0: [sdig] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.813603] sd 1:0:232:0: [sdhu] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.814205] sd 1:0:242:0: [sdie] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.814940] sd 1:0:240:0: [sdic] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.817604] sd 1:0:229:0: [sdhr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.817730] sd 1:0:233:0: [sdhv] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.819102] sd 1:0:234:0: [sdhw] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.820073] sd 1:0:216:0: [sdhe] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.822026] sd 1:0:243:0: [sdif] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.827862] sd 1:0:236:0: [sdhy] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.831404] sd 1:0:235:0: [sdhx] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 31.869136] sd 1:0:238:0: [sdia] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:44 2019][ 31.869138] sd 1:0:238:0: [sdia] 4096-byte physical blocks [Mon Dec 9 06:17:44 2019][ 31.873268] sd 1:0:167:0: [sdfi] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 31.881364] sd 1:0:239:0: [sdib] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:44 2019][ 31.881365] sd 1:0:239:0: [sdib] 4096-byte physical blocks [Mon Dec 9 06:17:44 2019][ 31.886900] sd 1:0:239:0: [sdib] Write Protect is off [Mon Dec 9 06:17:44 2019][ 31.891023] sd 1:0:228:0: [sdhq] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 31.894258] sd 1:0:205:0: [sdgt] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 31.899945] sd 1:0:168:0: [sdfj] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 31.909223] sd 1:0:239:0: [sdib] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:44 2019][ 31.937424] sd 1:0:238:0: [sdia] Write Protect is off [Mon Dec 9 06:17:44 2019][ 31.938245] sd 1:0:155:0: [sdew] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 31.983172] sd 1:0:216:0: [sdhe] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 32.045746] sd 1:0:238:0: [sdia] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:44 2019][ 32.061830] sd 1:0:229:0: [sdhr] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 32.120234] sd 1:0:156:0: [sdex] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 32.170339] sd 1:0:178:0: [sdft] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 32.459480] sd 1:0:217:0: [sdhf] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 32.538296] sd 1:0:239:0: [sdib] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 32.625166] sd 1:0:177:0: [sdfs] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 32.879708] sd 1:0:238:0: [sdia] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 38.910007] sd 1:0:3:0: [sdc] 4096-byte physical blocks [Mon Dec 9 06:17:44 2019][ 38.915954] sd 1:0:3:0: [sdc] Write Protect is off [Mon Dec 9 06:17:44 2019][ 38.921199] sd 1:0:3:0: [sdc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:44 2019][ 39.042872] sd 1:0:3:0: [sdc] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ OK ] Found device 
PERC_H330_Mini os. [Mon Dec 9 06:17:44 2019][ OK ] Started dracut initqueue hook. [Mon Dec 9 06:17:44 2019][ OK ] Reached target Remote File Systems (Pre). [Mon Dec 9 06:17:44 2019][ OK ] Reached target Remote File Systems. [Mon Dec 9 06:17:44 2019] Starting File System Check on /dev/...4-e7db-49b7-baed-d6c7905c5cdc... [Mon Dec 9 06:17:44 2019][ OK ] Started File System Check on /dev/d...4c4-e7db-49b7-baed-d6c7905c5cdc. [Mon Dec 9 06:17:44 2019] Mounting /sysroot... [Mon Dec 9 06:17:44 2019][ 39.131528] EXT4-fs (sda2): mounted filesystem with ordered data mode. Opts: (null) [Mon Dec 9 06:17:44 2019][ OK ] Mounted /sysroot. [Mon Dec 9 06:17:44 2019][ OK ] Reached target Initrd Root File System. [Mon Dec 9 06:17:44 2019] Starting Reload Configuration from the Real Root... [Mon Dec 9 06:17:44 2019][ OK ] Started Reload Configuration from the Real Root. [Mon Dec 9 06:17:44 2019][ OK ] Reached target Initrd File Systems. [Mon Dec 9 06:17:44 2019][ OK ] Reached target Initrd Default Target. [Mon Dec 9 06:17:44 2019] Starting dracut pre-pivot and cleanup hook... [Mon Dec 9 06:17:44 2019][ OK ] Started dracut pre-pivot and cleanup hook. [Mon Dec 9 06:17:44 2019] Starting Cleaning Up and Shutting Down Daemons... [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Timers. [Mon Dec 9 06:17:44 2019] Starting Plymouth switch root service... [Mon Dec 9 06:17:44 2019][ OK ] Stopped Cleaning Up and Shutting Down Daemons. [Mon Dec 9 06:17:44 2019][ OK ] Stopped dracut pre-pivot and cleanup hook. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Remote File Systems. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Remote File Systems (Pre). [Mon Dec 9 06:17:44 2019][ OK ] Stopped dracut initqueue hook. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Initrd Default Target. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Basic System. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target System Initialization. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Local File Systems.
[Mon Dec 9 06:17:44 2019] Stopping udev Kernel Device Manager... [Mon Dec 9 06:17:44 2019][ 39.438534] systemd-journald[365]: Received SIGTERM from PID 1 (systemd). [Mon Dec 9 06:17:44 2019][ OK ] Stopped Apply Kernel Variables. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Swap. [Mon Dec 9 06:17:44 2019][ OK ] Stopped udev Coldplug all Devices. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Slices. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Paths. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Sockets. [Mon Dec 9 06:17:44 2019][ OK ] Stopped udev Kernel Device Manager.
[Mon Dec 9 06:17:44 2019][ OK ] Stopped Create Static Device Nodes in /dev. [Mon Dec 9 06:17:44 2019][ 39.475859] SELinux: Disabled at runtime. [Mon Dec 9 06:17:44 2019][ OK ] Stopped Create list of required sta...ce nodes for the current kernel. [Mon Dec 9 06:17:44 2019][ OK ] Stopped dracut pre-udev hook. [Mon Dec 9 06:17:44 2019][ OK ] Stopped dracut cmdline hook. [Mon Dec 9 06:17:44 2019][ OK ] Closed udev Kernel Socket. [Mon Dec 9 06:17:44 2019][ OK ] Closed udev Control Socket. [Mon Dec 9 06:17:44 2019] Starting Cleanup udevd DB... [Mon Dec 9 06:17:44 2019][ OK ] Started Plymouth switch root service. [Mon Dec 9 06:17:44 2019][ OK ] Started Cleanup udevd DB.
[Mon Dec 9 06:17:44 2019][ 39.521533] type=1404 audit(1575901064.012:2): selinux=0 auid=4294967295 ses=4294967295 [Mon Dec 9 06:17:44 2019][ OK ] Reached target Switch Root. [Mon Dec 9 06:17:44 2019] Starting Switch Root...
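The SCSI probe messages above follow a fixed pattern per device: a capacity record ("15628053168 512-byte logical blocks"), the write-protect and write-cache flags, and finally "Attached SCSI disk". That regularity makes the flood easy to summarize mechanically. The Python sketch below is an illustration added alongside the log, not part of the capture itself: the regular expressions assume exactly the message format shown above, and the default file name console.log is only a placeholder for wherever this capture is saved.

#!/usr/bin/env python3
"""Summarize 'Attached SCSI disk' messages from a console/dmesg capture."""
import re
import sys

# Capacity record, e.g. "sd 1:0:137:0: [sdee] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB)"
SIZE_RE = re.compile(r"sd [\d:]+: \[(sd\w+)\] (\d+) 512-byte logical blocks")
# Final attach confirmation, e.g. "sd 1:0:137:0: [sdee] Attached SCSI disk"
ATTACH_RE = re.compile(r"sd [\d:]+: \[(sd\w+)\] Attached SCSI disk")

def summarize(path="console.log"):
    sizes = {}          # device name -> reported size in bytes
    attached = set()    # devices that logged "Attached SCSI disk"
    with open(path, errors="replace") as fh:
        for line in fh:
            # Each physical line of the capture can hold many records,
            # so scan for every match rather than assuming one per line.
            for m in SIZE_RE.finditer(line):
                sizes[m.group(1)] = int(m.group(2)) * 512
            for m in ATTACH_RE.finditer(line):
                attached.add(m.group(1))
    total = sum(sizes.get(dev, 0) for dev in attached)
    print(f"{len(attached)} disks attached, "
          f"{total / 1e12:.2f} TB reported capacity")

if __name__ == "__main__":
    summarize(sys.argv[1] if len(sys.argv) > 1 else "console.log")

Dividing by 1e12 reports decimal terabytes, matching the kernel's own "(8.00 TB/7.27 TiB)" convention: for the drives above, 15628053168 blocks of 512 bytes is about 8.00 TB per device.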
[Mon Dec 9 06:17:44 2019][ 39.552759] ip_tables: (C) 2000-2006 Netfilter Core Team
[Mon Dec 9 06:17:44 2019][ 39.558776] systemd[1]: Inserted module 'ip_tables'
[Mon Dec 9 06:17:44 2019]
[Mon Dec 9 06:17:44 2019]Welcome to CentOS Linux 7 (Core)!
[Mon Dec 9 06:17:44 2019]
[Mon Dec 9 06:17:44 2019][ OK ] Stopped Switch Root.
[Mon Dec 9 06:17:44 2019][ OK ] Stopped Journal Service.
[Mon Dec 9 06:17:44 2019] Starting Journal Service...
[Mon Dec 9 06:17:44 2019] Starting Create list of required st... nodes for the current kernel...
[Mon Dec 9 06:17:44 2019][ 39.666045] EXT4-fs (sda2): re-mounted. Opts: (null)
[Mon Dec 9 06:17:44 2019][ OK ] Started Forward Password Requests to Wall Directory Watch.
[Mon Dec 9 06:17:44 2019][ OK ] Set up automount Arbitrary Executab...ats File System Automount Point.
[Mon Dec 9 06:17:44 2019][ 39.684129] systemd-journald[5778]: Received request to flush runtime journal from PID 1
[Mon Dec 9 06:17:44 2019][ OK ] Created slice User and Session Slice.
[Mon Dec 9 06:17:44 2019][ OK ] Reached target Local Encrypted Volumes.
[Mon Dec 9 06:17:44 2019] Mounting Huge Pages File System...
[Mon Dec 9 06:17:44 2019] Mounting POSIX Message Queue File System...
[Mon Dec 9 06:17:44 2019][ OK ] Listening on Delayed Shutdown Socket.
[Mon Dec 9 06:17:44 2019][ OK ] Reached target Paths.
[Mon Dec 9 06:17:44 2019] Starting Collect Read-Ahead Data...
[Mon Dec 9 06:17:44 2019][ OK ] Reached target Slices.
[Mon Dec 9 06:17:44 2019] Mounting Debug File System...
[Mon Dec 9 06:17:44 2019][ OK ] Reached target RPC Port Mapper.
[Mon Dec 9 06:17:44 2019] Starting Read and set NIS domainname from /etc/sysconfig/network...
[Mon Dec 9 06:17:44 2019][ OK ] Listening on udev Kernel Socket.
[Mon Dec 9 06:17:44 2019] Starting Availability of block devices...
[Mon Dec 9 06:17:44 2019][ OK ] Created slice system-serial\x2dgetty.slice.
[Mon Dec 9 06:17:44 2019][ OK ] Stopped target Switch Root.
[Mon Dec 9 06:17:44 2019][ OK ] Stopped target Initrd File Systems.
[Mon Dec 9 06:17:44 2019][ OK ] Stopped target Initrd Root File System.
[Mon Dec 9 06:17:44 2019][ OK ] Listening on /dev/initctl Compatibility Named Pipe.
[Mon Dec 9 06:17:44 2019][ 39.774789] piix4_smbus 0000:00:14.0: SMBus Host Controller at 0xb00, revision 0
[Mon Dec 9 06:17:45 2019][ 39.782438] piix4_smbus 0000:00:14.0: Using register 0x2e for SMBus port selection
[Mon Dec 9 06:17:45 2019] Starting Replay Read-Ahead Data...
[Mon Dec 9 06:17:45 2019][ OK ] Created slice system-selinux\x2dpol...grate\x2dlocal\x2dchanges.slice.
[Mon Dec 9 06:17:45 2019][ OK ] Created slice system-getty.slice.
[Mon Dec 9 06:17:45 2019][ 39.807059] ACPI Error: No handler for Region [SYSI] (ffff8913a9e7da68) [IPMI] (20130517/evregion-162)
[Mon Dec 9 06:17:45 2019][ 39.820149] ACPI Error: Region IPMI (ID=7) has no handler (20130517/exfldio-305)
[Mon Dec 9 06:17:45 2019][ OK ] Listening on udev Control Socket.
[Mon Dec 9 06:17:45 2019][ 39.829034] ACPI Error: Method parse/execution failed [\_SB_.PMI0._GHL] (Node ffff8913a9e7a5a0), AE_NOT_EXIST (20130517/psparse-536)
[Mon Dec 9 06:17:45 2019][ OK ] Started Collect Read-Ahead Data.
[Mon Dec 9 06:17:45 2019][ 39.847923] ACPI Error: Method parse/execution failed [\_SB_.PMI0._PMC] (Node ffff8913a9e7a500), AE_NOT_EXIST (20130517/psparse-536)
[Mon Dec 9 06:17:45 2019][ OK ] Started Create list of required sta...ce nodes for the current kernel.
[Mon Dec 9 06:17:45 2019][ 39.866864] ACPI Exception: AE_NOT_EXIST, Evaluating _PMC (20130517/power_meter-753)
[Mon Dec 9 06:17:45 2019][ OK ] Started Availability of block devices.
[Mon Dec 9 06:17:45 2019][ 39.880616] ipmi message handler version 39.2
[Mon Dec 9 06:17:45 2019][ 39.880745] ccp 0000:02:00.2: 3 command queues available
[Mon Dec 9 06:17:45 2019][ OK ] Started Replay Read-Ahead Data.
[Mon Dec 9 06:17:45 2019][ 39.880831] ccp 0000:02:00.2: Queue 2 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.880833] ccp 0000:02:00.2: Queue 3 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.880835] ccp 0000:02:00.2: Queue 4 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.880836] ccp 0000:02:00.2: Queue 0 gets LSB 4
[Mon Dec 9 06:17:45 2019] Starting Remount Root and Kernel File Systems...
[Mon Dec 9 06:17:45 2019][ 39.880837] ccp 0000:02:00.2: Queue 1 gets LSB 5
[Mon Dec 9 06:17:45 2019][ 39.880838] ccp 0000:02:00.2: Queue 2 gets LSB 6
[Mon Dec 9 06:17:45 2019][ 39.881173] ccp 0000:02:00.2: enabled
[Mon Dec 9 06:17:45 2019][ 39.881289] ccp 0000:03:00.1: 5 command queues available
[Mon Dec 9 06:17:45 2019][ 39.881348] ccp 0000:03:00.1: Queue 0 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019] Starting Create Static Device Nodes in /dev...
[Mon Dec 9 06:17:45 2019][ 39.881349] ccp 0000:03:00.1: Queue 1 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.881351] ccp 0000:03:00.1: Queue 2 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.881353] ccp 0000:03:00.1: Queue 3 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.881354] ccp 0000:03:00.1: Queue 4 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019] Starting Apply Kernel Variables...
[Mon Dec 9 06:17:45 2019][ 39.881356] ccp 0000:03:00.1: Queue 0 gets LSB 1
[Mon Dec 9 06:17:45 2019][ 39.881357] ccp 0000:03:00.1: Queue 1 gets LSB 2
[Mon Dec 9 06:17:45 2019][ 39.881358] ccp 0000:03:00.1: Queue 2 gets LSB 3
[Mon Dec 9 06:17:45 2019][ OK ] Mounted Huge Pages File System.
[Mon Dec 9 06:17:45 2019][ 39.881359] ccp 0000:03:00.1: Queue 3 gets LSB 4
[Mon Dec 9 06:17:45 2019][ 39.881360] ccp 0000:03:00.1: Queue 4 gets LSB 5
[Mon Dec 9 06:17:45 2019][ 39.881791] ccp 0000:03:00.1: enabled
[Mon Dec 9 06:17:45 2019][ 39.881985] ccp 0000:41:00.2: 3 command queues available
[Mon Dec 9 06:17:45 2019][ OK ] Mounted Debug File System.
[Mon Dec 9 06:17:45 2019][ 39.882090] ccp 0000:41:00.2: Queue 2 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.882092] ccp 0000:41:00.2: Queue 3 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ OK ] Mounted POSIX Message Queue File System.
[Mon Dec 9 06:17:45 2019][ 39.882094] ccp 0000:41:00.2: Queue 4 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.882096] ccp 0000:41:00.2: Queue 0 gets LSB 4
[Mon Dec 9 06:17:45 2019][ 39.882097] ccp 0000:41:00.2: Queue 1 gets LSB 5
[Mon Dec 9 06:17:45 2019][ 39.882099] ccp 0000:41:00.2: Queue 2 gets LSB 6
[Mon Dec 9 06:17:45 2019][ OK ] Started Journal Service.
[Mon Dec 9 06:17:45 2019][ 39.882415] ccp 0000:41:00.2: enabled
[Mon Dec 9 06:17:45 2019][ 39.882557] ccp 0000:42:00.1: 5 command queues available
[Mon Dec 9 06:17:45 2019][ 39.882622] ccp 0000:42:00.1: Queue 0 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ OK ] Started Read and set NIS domainname from /etc/sysconfig/network.
[Mon Dec 9 06:17:45 2019][ 39.882624] ccp 0000:42:00.1: Queue 1 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.882627] ccp 0000:42:00.1: Queue 2 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.882629] ccp 0000:42:00.1: Queue 3 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.882632] ccp 0000:42:00.1: Queue 4 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.882633] ccp 0000:42:00.1: Queue 0 gets LSB 1
[Mon Dec 9 06:17:45 2019][ 39.882634] ccp 0000:42:00.1: Queue 1 gets LSB 2
[Mon Dec 9 06:17:45 2019][ 39.882636] ccp 0000:42:00.1: Queue 2 gets LSB 3
[Mon Dec 9 06:17:45 2019][ 39.882637] ccp 0000:42:00.1: Queue 3 gets LSB 4
[Mon Dec 9 06:17:45 2019][ 39.882638] ccp 0000:42:00.1: Queue 4 gets LSB 5
[Mon Dec 9 06:17:45 2019][ 39.883043] ccp 0000:42:00.1: enabled
[Mon Dec 9 06:17:45 2019][ OK ] Started Remount Root and Kernel File Systems.
[Mon Dec 9 06:17:45 2019][ 39.883203] ccp 0000:85:00.2: 3 command queues available
[Mon Dec 9 06:17:45 2019][ 39.883291] ccp 0000:85:00.2: Queue 2 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.883293] ccp 0000:85:00.2: Queue 3 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.883295] ccp 0000:85:00.2: Queue 4 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.883297] ccp 0000:85:00.2: Queue 0 gets LSB 4
[Mon Dec 9 06:17:45 2019][ 39.883298] ccp 0000:85:00.2: Queue 1 gets LSB 5
[Mon Dec 9 06:17:45 2019][ 39.883299] ccp 0000:85:00.2: Queue 2 gets LSB 6
[Mon Dec 9 06:17:45 2019] Starting udev Coldplug all Devices...
[Mon Dec 9 06:17:45 2019][ 39.883691] ccp 0000:85:00.2: enabled
[Mon Dec 9 06:17:45 2019][ 39.883796] ccp 0000:86:00.1: 5 command queues available
[Mon Dec 9 06:17:45 2019][ 39.883860] ccp 0000:86:00.1: Queue 0 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.883862] ccp 0000:86:00.1: Queue 1 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.883865] ccp 0000:86:00.1: Queue 2 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.883867] ccp 0000:86:00.1: Queue 3 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019] Starting Configure read-only root support...
[Mon Dec 9 06:17:45 2019][ 39.883869] ccp 0000:86:00.1: Queue 4 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.883871] ccp 0000:86:00.1: Queue 0 gets LSB 1
[Mon Dec 9 06:17:45 2019][ 39.883872] ccp 0000:86:00.1: Queue 1 gets LSB 2
[Mon Dec 9 06:17:45 2019][ 39.883874] ccp 0000:86:00.1: Queue 2 gets LSB 3
[Mon Dec 9 06:17:45 2019][ 39.883875] ccp 0000:86:00.1: Queue 3 gets LSB 4
[Mon Dec 9 06:17:45 2019][ 39.883876] ccp 0000:86:00.1: Queue 4 gets LSB 5
[Mon Dec 9 06:17:45 2019][ 39.884336] ccp 0000:86:00.1: enabled
[Mon Dec 9 06:17:45 2019][ 39.884514] ccp 0000:c2:00.2: 3 command queues available
[Mon Dec 9 06:17:45 2019] Starting Flush Journal to Persistent Storage...
[Mon Dec 9 06:17:45 2019][ 39.884609] ccp 0000:c2:00.2: Queue 2 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.884611] ccp 0000:c2:00.2: Queue 3 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.884613] ccp 0000:c2:00.2: Queue 4 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.884614] ccp 0000:c2:00.2: Queue 0 gets LSB 4
[Mon Dec 9 06:17:45 2019][ 39.884616] ccp 0000:c2:00.2: Queue 1 gets LSB 5
[Mon Dec 9 06:17:45 2019][ 39.884617] ccp 0000:c2:00.2: Queue 2 gets LSB 6
[Mon Dec 9 06:17:45 2019][ 39.884927] ccp 0000:c2:00.2: enabled
[Mon Dec 9 06:17:45 2019][ 39.885035] ccp 0000:c3:00.1: 5 command queues available
[Mon Dec 9 06:17:45 2019][ OK ] Started Apply Kernel Variables.
[Mon Dec 9 06:17:45 2019][ 39.885087] ccp 0000:c3:00.1: Queue 0 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.885089] ccp 0000:c3:00.1: Queue 1 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.885091] ccp 0000:c3:00.1: Queue 2 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.885092] ccp 0000:c3:00.1: Queue 3 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.885094] ccp 0000:c3:00.1: Queue 4 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.885095] ccp 0000:c3:00.1: Queue 0 gets LSB 1
[Mon Dec 9 06:17:45 2019][ OK ] Started Create Static Device Nodes in /dev.
[Mon Dec 9 06:17:45 2019][ 39.885096] ccp 0000:c3:00.1: Queue 1 gets LSB 2
[Mon Dec 9 06:17:45 2019][ 39.885097] ccp 0000:c3:00.1: Queue 2 gets LSB 3
[Mon Dec 9 06:17:45 2019][ 39.885098] ccp 0000:c3:00.1: Queue 3 gets LSB 4
[Mon Dec 9 06:17:45 2019][ 39.885099] ccp 0000:c3:00.1: Queue 4 gets LSB 5
[Mon Dec 9 06:17:45 2019][ 39.885462] ccp 0000:c3:00.1: enabled
[Mon Dec 9 06:17:45 2019][ 39.978579] sd 0:2:0:0: Attached scsi generic sg0 type 0
[Mon Dec 9 06:17:45 2019][ OK ] Reached target Local File Systems (Pre).
[Mon Dec 9 06:17:45 2019][ 39.978782] scsi 1:0:0:0: Attached scsi generic sg1 type 13
[Mon Dec 9 06:17:45 2019][ 39.979237] scsi 1:0:1:0: Attached scsi generic sg2 type 13
[Mon Dec 9 06:17:45 2019][ 39.979772] sd 1:0:2:0: Attached scsi generic sg3 type 0
[Mon Dec 9 06:17:45 2019][ 39.980475] sd 1:0:3:0: Attached scsi generic sg4 type 0
[Mon Dec 9 06:17:45 2019][ 39.981364] sd 1:0:4:0: Attached scsi generic sg5 type 0
[Mon Dec 9 06:17:45 2019][ 39.981781] sd 1:0:5:0: Attached scsi generic sg6 type 0
[Mon Dec 9 06:17:45 2019][ 39.982219] sd 1:0:6:0: Attached scsi generic sg7 type 0
[Mon Dec 9 06:17:45 2019][ 39.982748] sd 1:0:7:0: Attached scsi generic sg8 type 0
[Mon Dec 9 06:17:45 2019] Starting udev Kernel Device Manager...
[Mon Dec 9 06:17:45 2019][ 39.983085] sd 1:0:8:0: Attached scsi generic sg9 type 0
[Mon Dec 9 06:17:45 2019][ 39.983451] sd 1:0:9:0: Attached scsi generic sg10 type 0
[Mon Dec 9 06:17:45 2019][ 39.983767] sd 1:0:10:0: Attached scsi generic sg11 type 0
[Mon Dec 9 06:17:45 2019][ 39.984304] sd 1:0:11:0: Attached scsi generic sg12 type 0
[Mon Dec 9 06:17:45 2019][ 39.984869] sd 1:0:12:0: Attached scsi generic sg13 type 0
[Mon Dec 9 06:17:45 2019][ 39.985272] sd 1:0:13:0: Attached scsi generic sg14 type 0
[Mon Dec 9 06:17:45 2019][ 39.985828] sd 1:0:14:0: Attached scsi generic sg15 type 0
[Mon Dec 9 06:17:45 2019][ OK ] Started Configure read-only root support.
[Mon Dec 9 06:17:45 2019][ 39.986197] sd 1:0:15:0: Attached scsi generic sg16 type 0
[Mon Dec 9 06:17:45 2019][ 39.986459] sd 1:0:16:0: Attached scsi generic sg17 type 0
[Mon Dec 9 06:17:45 2019][ 39.986698] sd 1:0:17:0: Attached scsi generic sg18 type 0
[Mon Dec 9 06:17:45 2019][ 39.987545] sd 1:0:18:0: Attached scsi generic sg19 type 0
[Mon Dec 9 06:17:45 2019][ 39.988001] sd 1:0:19:0: Attached scsi generic sg20 type 0
[Mon Dec 9 06:17:45 2019][ 39.988360] sd 1:0:20:0: Attached scsi generic sg21 type 0
[Mon Dec 9 06:17:45 2019] Starting Load/Save Random Seed...
[Mon Dec 9 06:17:45 2019][ 39.988620] sd 1:0:21:0: Attached scsi generic sg22 type 0
[Mon Dec 9 06:17:45 2019][ 39.988875] sd 1:0:22:0: Attached scsi generic sg23 type 0
[Mon Dec 9 06:17:45 2019][ 39.989676] sd 1:0:23:0: Attached scsi generic sg24 type 0
[Mon Dec 9 06:17:45 2019][ 39.990382] sd 1:0:24:0: Attached scsi generic sg25 type 0
[Mon Dec 9 06:17:45 2019][ 39.991164] sd 1:0:25:0: Attached scsi generic sg26 type 0
[Mon Dec 9 06:17:45 2019][ 39.991680] sd 1:0:26:0: Attached scsi generic sg27 type 0
[Mon Dec 9 06:17:45 2019][ OK ] Started Flush Journal to Persistent Storage.
[Mon Dec 9 06:17:45 2019][ 39.992131] sd 1:0:27:0: Attached scsi generic sg28 type 0
[Mon Dec 9 06:17:45 2019][ 39.992596] sd 1:0:28:0: Attached scsi generic sg29 type 0
[Mon Dec 9 06:17:45 2019][ 39.992887] sd 1:0:29:0: Attached scsi generic sg30 type 0
[Mon Dec 9 06:17:45 2019][ 39.993941] sd 1:0:30:0: Attached scsi generic sg31 type 0
[Mon Dec 9 06:17:45 2019][ 39.994451] sd 1:0:31:0: Attached scsi generic sg32 type 0
[Mon Dec 9 06:17:45 2019][ 39.995036] sd 1:0:32:0: Attached scsi generic sg33 type 0
[Mon Dec 9 06:17:45 2019][ 39.995597] sd 1:0:33:0: Attached scsi generic sg34 type 0
[Mon Dec 9 06:17:45 2019][ 39.995859] sd 1:0:34:0: Attached scsi generic sg35 type 0
[Mon Dec 9 06:17:45 2019][ OK ] Started Load/Save Random Seed.
[Mon Dec 9 06:17:45 2019][ 39.997725] sd 1:0:35:0: Attached scsi generic sg36 type 0
[Mon Dec 9 06:17:45 2019][ 39.999311] sd 1:0:36:0: Attached scsi generic sg37 type 0
[Mon Dec 9 06:17:45 2019][ 40.000600] sd 1:0:37:0: Attached scsi generic sg38 type 0
[Mon Dec 9 06:17:45 2019][ 40.000952] sd 1:0:38:0: Attached scsi generic sg39 type 0
[Mon Dec 9 06:17:45 2019][ 40.001234] sd 1:0:39:0: Attached scsi generic sg40 type 0
[Mon Dec 9 06:17:45 2019][ OK ] Started udev Kernel Device Manager.
[Mon Dec 9 06:17:45 2019][ 40.001409] sd 1:0:40:0: Attached scsi generic sg41 type 0
[Mon Dec 9 06:17:45 2019][ 40.003122] sd 1:0:41:0: Attached scsi generic sg42 type 0
[Mon Dec 9 06:17:45 2019][ 40.003664] sd 1:0:42:0: Attached scsi generic sg43 type 0
[Mon Dec 9 06:17:45 2019][ 40.004289] sd 1:0:43:0: Attached scsi generic sg44 type 0
[Mon Dec 9 06:17:45 2019][ 40.008943] sd 1:0:44:0: Attached scsi generic sg45 type 0
[Mon Dec 9 06:17:45 2019][ 40.009505] sd 1:0:45:0: Attached scsi generic sg46 type 0
[Mon Dec 9 06:17:45 2019][ 40.009952] sd 1:0:46:0: Attached scsi generic sg47 type 0 [Mon Dec 9 06:17:45 2019][ 40.012087] sd 1:0:47:0: Attached scsi generic sg48 type 0 [Mon Dec 9 06:17:45 2019][ 40.013766] sd 1:0:48:0: Attached scsi generic sg49 type 0 [Mon Dec 9 06:17:45 2019][ 40.014150] sd 1:0:49:0: Attached scsi generic sg50 type 0 [Mon Dec 9 06:17:45 2019][ 40.016746] sd 1:0:50:0: Attached scsi generic sg51 type 0 [Mon Dec 9 06:17:45 2019][ 40.017210] sd 1:0:51:0: Attached scsi generic sg52 type 0 [Mon Dec 9 06:17:45 2019][ 40.017674] sd 1:0:52:0: Attached scsi generic sg53 type 0 [Mon Dec 9 06:17:45 2019][ 40.018220] sd 1:0:53:0: Attached scsi generic sg54 type 0 [Mon Dec 9 06:17:45 2019][ 40.018968] sd 1:0:54:0: Attached scsi generic sg55 type 0 [Mon Dec 9 06:17:45 2019][ 40.019521] sd 1:0:55:0: Attached scsi generic sg56 type 0 [Mon Dec 9 06:17:45 2019][ 40.021127] sd 1:0:56:0: Attached scsi generic sg57 type 0 [Mon Dec 9 06:17:45 2019][ 40.021541] sd 1:0:57:0: Attached scsi generic sg58 type 0 [Mon Dec 9 06:17:45 2019][ 40.023123] sd 1:0:58:0: Attached scsi generic sg59 type 0 [Mon Dec 9 06:17:45 2019][ 40.023550] sd 1:0:59:0: Attached scsi generic sg60 type 0 [Mon Dec 9 06:17:45 2019][ 40.025883] sd 1:0:60:0: Attached scsi generic sg61 type 0 [Mon Dec 9 06:17:45 2019][ 40.026256] sd 1:0:61:0: Attached scsi generic sg62 type 0 [Mon Dec 9 06:17:45 2019][ 40.026485] scsi 1:0:62:0: Attached scsi generic sg63 type 13 [Mon Dec 9 06:17:45 2019][ 40.026678] sd 1:0:63:0: Attached scsi generic sg64 type 0 [Mon Dec 9 06:17:45 2019][ 40.026809] sd 1:0:64:0: Attached scsi generic sg65 type 0 [Mon Dec 9 06:17:45 2019][ 40.026878] sd 1:0:65:0: Attached scsi generic sg66 type 0 [Mon Dec 9 06:17:45 2019][ 40.026919] sd 1:0:66:0: Attached scsi generic sg67 type 0 [Mon Dec 9 06:17:45 2019][ 40.026961] sd 1:0:67:0: Attached scsi generic sg68 type 0 [Mon Dec 9 06:17:46 2019][ 40.027083] sd 1:0:68:0: Attached scsi generic sg69 type 0 [Mon Dec 9 06:17:46 2019][ 40.027433] sd 1:0:69:0: Attached scsi generic sg70 type 0 [Mon Dec 9 06:17:46 2019][ 40.029594] sd 1:0:70:0: Attached scsi generic sg71 type 0 [Mon Dec 9 06:17:46 2019][ 40.029931] sd 1:0:71:0: Attached scsi generic sg72 type 0 [Mon Dec 9 06:17:46 2019][ 40.030284] sd 1:0:72:0: Attached scsi generic sg73 type 0 [Mon Dec 9 06:17:46 2019][ 40.031785] sd 1:0:73:0: Attached scsi generic sg74 type 0 [Mon Dec 9 06:17:46 2019][ 40.033119] sd 1:0:74:0: Attached scsi generic sg75 type 0 [Mon Dec 9 06:17:46 2019][ 40.035467] sd 1:0:75:0: Attached scsi generic sg76 type 0 [Mon Dec 9 06:17:46 2019][ 40.036846] sd 1:0:76:0: Attached scsi generic sg77 type 0 [Mon Dec 9 06:17:46 2019][ 40.037137] sd 1:0:77:0: Attached scsi generic sg78 type 0 [Mon Dec 9 06:17:46 2019][ 40.037345] sd 1:0:78:0: Attached scsi generic sg79 type 0 [Mon Dec 9 06:17:46 2019][ 40.037592] sd 1:0:79:0: Attached scsi generic sg80 type 0 [Mon Dec 9 06:17:46 2019][ 40.038648] sd 1:0:80:0: Attached scsi generic sg81 type 0 [Mon Dec 9 06:17:46 2019][ 40.039831] sd 1:0:81:0: Attached scsi generic sg82 type 0 [Mon Dec 9 06:17:46 2019][ 40.040675] sd 1:0:82:0: Attached scsi generic sg83 type 0 [Mon Dec 9 06:17:46 2019][ 40.041209] sd 1:0:83:0: Attached scsi generic sg84 type 0 [Mon Dec 9 06:17:46 2019][ 40.041610] sd 1:0:84:0: Attached scsi generic sg85 type 0 [Mon Dec 9 06:17:46 2019][ 40.044140] sd 1:0:85:0: Attached scsi generic sg86 type 0 [Mon Dec 9 06:17:46 2019][ 40.044524] sd 1:0:86:0: Attached scsi generic sg87 type 0 [Mon Dec 9 06:17:46 2019][ 40.046308] sd 1:0:87:0: Attached scsi 
generic sg88 type 0
[Mon Dec 9 06:17:46 2019][ 40.046660] sd 1:0:88:0: Attached scsi generic sg89 type 0
[Mon Dec 9 06:17:46 2019][ 40.047483] sd 1:0:89:0: Attached scsi generic sg90 type 0
[Mon Dec 9 06:17:46 2019][ 40.048100] sd 1:0:90:0: Attached scsi generic sg91 type 0
[Mon Dec 9 06:17:46 2019][ 40.048653] sd 1:0:91:0: Attached scsi generic sg92 type 0
[Mon Dec 9 06:17:46 2019][ 40.049326] sd 1:0:92:0: Attached scsi generic sg93 type 0
[Mon Dec 9 06:17:46 2019][ 40.050057] sd 1:0:93:0: Attached scsi generic sg94 type 0
[Mon Dec 9 06:17:46 2019][ 40.050610] sd 1:0:94:0: Attached scsi generic sg95 type 0
[Mon Dec 9 06:17:46 2019][ 40.051299] sd 1:0:95:0: Attached scsi generic sg96 type 0
[Mon Dec 9 06:17:46 2019][ 40.052011] sd 1:0:96:0: Attached scsi generic sg97 type 0
[Mon Dec 9 06:17:46 2019][ 40.052751] sd 1:0:97:0: Attached scsi generic sg98 type 0
[Mon Dec 9 06:17:46 2019][ 40.053355] sd 1:0:98:0: Attached scsi generic sg99 type 0
[Mon Dec 9 06:17:46 2019][ 40.054131] sd 1:0:99:0: Attached scsi generic sg100 type 0
[Mon Dec 9 06:17:46 2019][ 40.056040] sd 1:0:100:0: Attached scsi generic sg101 type 0
[Mon Dec 9 06:17:46 2019][ 40.058756] sd 1:0:101:0: Attached scsi generic sg102 type 0
[Mon Dec 9 06:17:46 2019][ 40.063950] sd 1:0:102:0: Attached scsi generic sg103 type 0
[Mon Dec 9 06:17:46 2019][ 40.064363] sd 1:0:103:0: Attached scsi generic sg104 type 0
[Mon Dec 9 06:17:46 2019][ 40.064663] sd 1:0:104:0: Attached scsi generic sg105 type 0
[Mon Dec 9 06:17:46 2019][ 40.066695] sd 1:0:105:0: Attached scsi generic sg106 type 0
[Mon Dec 9 06:17:46 2019][ 40.067289] sd 1:0:106:0: Attached scsi generic sg107 type 0
[Mon Dec 9 06:17:46 2019][ 40.070381] sd 1:0:107:0: Attached scsi generic sg108 type 0
[Mon Dec 9 06:17:46 2019][ 40.071065] sd 1:0:108:0: Attached scsi generic sg109 type 0
[Mon Dec 9 06:17:46 2019][ 40.071658] sd 1:0:109:0: Attached scsi generic sg110 type 0
[Mon Dec 9 06:17:46 2019][ 40.072264] sd 1:0:110:0: Attached scsi generic sg111 type 0
[Mon Dec 9 06:17:46 2019][ 41.016318] sd 1:0:111:0: Attached scsi generic sg112 type 0
[Mon Dec 9 06:17:46 2019][ OK ] Started udev Coldplug all Devices.
[Mon Dec 9 06:17:46 2019][ 41.018549] ipmi device interface
[Mon Dec 9 06:17:46 2019] Starting Device-Mapper Multipath Device Controller...
[Mon Dec 9 06:17:46 2019][ 41.030678] sd 1:0:112:0: Attached scsi generic sg113 type 0
[Mon Dec 9 06:17:46 2019][ 41.038979] sd 1:0:113:0: Attached scsi generic sg114 type 0
[Mon Dec 9 06:17:46 2019][ 41.041079] IPMI System Interface driver
[Mon Dec 9 06:17:46 2019][ 41.041125] ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS
[Mon Dec 9 06:17:46 2019][ 41.041128] ipmi_si: SMBIOS: io 0xca8 regsize 1 spacing 4 irq 10
[Mon Dec 9 06:17:46 2019][ 41.041129] ipmi_si: Adding SMBIOS-specified kcs state machine
[Mon Dec 9 06:17:46 2019][ 41.041177] ipmi_si IPI0001:00: ipmi_platform: probing via ACPI
[Mon Dec 9 06:17:46 2019][ 41.041202] ipmi_si IPI0001:00: [io 0x0ca8] regsize 1 spacing 4 irq 10
[Mon Dec 9 06:17:46 2019][ 41.041204] ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI
[Mon Dec 9 06:17:46 2019][ 41.041204] ipmi_si: Adding ACPI-specified kcs state machine
[Mon Dec 9 06:17:46 2019][ 41.041564] ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca8, slave address 0x20, irq 10
[Mon Dec 9 06:17:46 2019][ 41.066626] ipmi_si IPI0001:00: The BMC does not support setting the recv irq bit, compensating, but the BMC needs to be fixed.
[Mon Dec 9 06:17:46 2019][ 41.074679] ipmi_si IPI0001:00: Using irq 10 [Mon Dec 9 06:17:46 2019][ 41.100010] ipmi_si IPI0001:00: Found new BMC (man_id: 0x0002a2, prod_id: 0x0100, dev_id: 0x20) [Mon Dec 9 06:17:46 2019][ 41.130277] sd 1:0:114:0: Attached scsi generic sg115 type 0 [Mon Dec 9 06:17:46 2019][ 41.134196] device-mapper: uevent: version 1.0.3 [Mon Dec 9 06:17:46 2019][ 41.138049] device-mapper: ioctl: 4.37.1-ioctl (2018-04-03) initialised: dm-devel@redhat.com [Mon Dec 9 06:17:46 2019][ 41.153114] sd 1:0:115:0: Attached scsi generic sg116 type 0 [Mon Dec 9 06:17:46 2019][ 41.159138] sd 1:0:116:0: Attached scsi generic sg117 type 0 [Mon Dec 9 06:17:46 2019][ 41.166824] sd 1:0:117:0: Attached scsi generic sg118 type 0 [Mon Dec 9 06:17:46 2019][ 41.173200] sd 1:0:118:0: Attached scsi generic sg119 type 0 [Mon Dec 9 06:17:46 2019][ 41.179262] sd 1:0:119:0: Attached scsi generic sg120 type 0 [Mon Dec 9 06:17:46 2019][ 41.184534] ipmi_si IPI0001:00: IPMI kcs interface initialized [Mon Dec 9 06:17:46 2019][ OK ] Started Device-Mapper Multipath Device Controller. [Mon Dec 9 06:17:46 2019][ 41.197001] sd 1:0:120:0: Attached scsi generic sg121 type 0 [Mon Dec 9 06:17:46 2019][ 41.203602] sd 1:0:121:0: Attached scsi generic sg122 type 0 [Mon Dec 9 06:17:46 2019][ 41.210854] sd 1:0:122:0: Attached scsi generic sg123 type 0 [Mon Dec 9 06:17:46 2019][ 41.218846] scsi 1:0:123:0: Attached scsi generic sg124 type 13 [Mon Dec 9 06:17:46 2019][ 41.226222] sd 1:0:124:0: Attached scsi generic sg125 type 0 [Mon Dec 9 06:17:46 2019][ 41.232499] sd 1:0:125:0: Attached scsi generic sg126 type 0 [Mon Dec 9 06:17:46 2019][ 41.238880] sd 1:0:126:0: Attached scsi generic sg127 type 0 [Mon Dec 9 06:17:46 2019][ 41.247689] sd 1:0:127:0: Attached scsi generic sg128 type 0 [Mon Dec 9 06:17:46 2019][ 41.254079] sd 1:0:128:0: Attached scsi generic sg129 type 0 [Mon Dec 9 06:17:46 2019][ 41.260439] sd 1:0:129:0: Attached scsi generic sg130 type 0 [Mon Dec 9 06:17:46 2019][ 41.267986] sd 1:0:130:0: Attached scsi generic sg131 type 0 [Mon Dec 9 06:17:46 2019][ 41.274038] sd 1:0:131:0: Attached scsi generic sg132 type 0 [Mon Dec 9 06:17:46 2019][ 41.280543] sd 1:0:132:0: Attached scsi generic sg133 type 0 [Mon Dec 9 06:17:46 2019][ 41.286763] sd 1:0:133:0: Attached scsi generic sg134 type 0 [Mon Dec 9 06:17:46 2019][ 41.293040] sd 1:0:134:0: Attached scsi generic sg135 type 0 [Mon Dec 9 06:17:46 2019][ 41.301028] sd 1:0:135:0: Attached scsi generic sg136 type 0 [Mon Dec 9 06:17:46 2019][ 41.307132] sd 1:0:136:0: Attached scsi generic sg137 type 0 [Mon Dec 9 06:17:46 2019][ 41.314248] sd 1:0:137:0: Attached scsi generic sg138 type 0 [Mon Dec 9 06:17:46 2019][ 41.320309] sd 1:0:138:0: Attached scsi generic sg139 type 0 [Mon Dec 9 06:17:46 2019][ 41.326443] sd 1:0:139:0: Attached scsi generic sg140 type 0 [Mon Dec 9 06:17:46 2019][ 41.332765] sd 1:0:140:0: Attached scsi generic sg141 type 0 [Mon Dec 9 06:17:46 2019][ 41.338768] sd 1:0:141:0: Attached scsi generic sg142 type 0 [Mon Dec 9 06:17:46 2019][ 41.344830] sd 1:0:142:0: Attached scsi generic sg143 type 0 [Mon Dec 9 06:17:46 2019][ 41.351071] sd 1:0:143:0: Attached scsi generic sg144 type 0 [Mon Dec 9 06:17:46 2019][ 41.357222] sd 1:0:144:0: Attached scsi generic sg145 type 0 [Mon Dec 9 06:17:46 2019][ 41.363205] sd 1:0:145:0: Attached scsi generic sg146 type 0 [Mon Dec 9 06:17:46 2019][ 41.369107] sd 1:0:146:0: Attached scsi generic sg147 type 0 [Mon Dec 9 06:17:46 2019][ 41.375532] sd 1:0:147:0: Attached scsi generic sg148 type 0 [Mon Dec 9 06:17:46 2019][ 
41.381258] sd 1:0:148:0: Attached scsi generic sg149 type 0 [Mon Dec 9 06:17:46 2019][ 41.387214] sd 1:0:149:0: Attached scsi generic sg150 type 0 [Mon Dec 9 06:17:46 2019][ 41.393498] sd 1:0:150:0: Attached scsi generic sg151 type 0 [Mon Dec 9 06:17:46 2019][ 41.399730] sd 1:0:151:0: Attached scsi generic sg152 type 0 [Mon Dec 9 06:17:46 2019][ 41.405958] sd 1:0:152:0: Attached scsi generic sg153 type 0 [Mon Dec 9 06:17:46 2019][ 41.412226] sd 1:0:153:0: Attached scsi generic sg154 type 0 [Mon Dec 9 06:17:46 2019][ 41.420016] sd 1:0:154:0: Attached scsi generic sg155 type 0 [Mon Dec 9 06:17:46 2019][ 41.426076] sd 1:0:155:0: Attached scsi generic sg156 type 0 [Mon Dec 9 06:17:46 2019][ 41.432318] sd 1:0:156:0: Attached scsi generic sg157 type 0 [Mon Dec 9 06:17:46 2019][ 41.439160] sd 1:0:157:0: Attached scsi generic sg158 type 0 [Mon Dec 9 06:17:46 2019][ 41.445400] sd 1:0:158:0: Attached scsi generic sg159 type 0 [Mon Dec 9 06:17:46 2019][ 41.451484] sd 1:0:159:0: Attached scsi generic sg160 type 0 [Mon Dec 9 06:17:46 2019][ 41.458614] sd 1:0:160:0: Attached scsi generic sg161 type 0 [Mon Dec 9 06:17:46 2019][ 41.466218] sd 1:0:161:0: Attached scsi generic sg162 type 0 [Mon Dec 9 06:17:46 2019][ 41.473285] sd 1:0:162:0: Attached scsi generic sg163 type 0 [Mon Dec 9 06:17:46 2019][ 41.479548] sd 1:0:163:0: Attached scsi generic sg164 type 0 [Mon Dec 9 06:17:46 2019][ 41.485777] sd 1:0:164:0: Attached scsi generic sg165 type 0 [Mon Dec 9 06:17:46 2019][ 41.492726] sd 1:0:165:0: Attached scsi generic sg166 type 0 [Mon Dec 9 06:17:46 2019][ 41.499066] sd 1:0:166:0: Attached scsi generic sg167 type 0 [Mon Dec 9 06:17:46 2019][ 41.505354] sd 1:0:167:0: Attached scsi generic sg168 type 0 [Mon Dec 9 06:17:46 2019][ 41.511522] sd 1:0:168:0: Attached scsi generic sg169 type 0 [Mon Dec 9 06:17:46 2019][ 41.517710] sd 1:0:169:0: Attached scsi generic sg170 type 0 [Mon Dec 9 06:17:46 2019][ 41.523860] sd 1:0:170:0: Attached scsi generic sg171 type 0 [Mon Dec 9 06:17:46 2019][ 41.530058] sd 1:0:171:0: Attached scsi generic sg172 type 0 [Mon Dec 9 06:17:46 2019][ 41.536124] sd 1:0:172:0: Attached scsi generic sg173 type 0 [Mon Dec 9 06:17:46 2019][ 41.542479] sd 1:0:173:0: Attached scsi generic sg174 type 0 [Mon Dec 9 06:17:46 2019][ 41.549159] sd 1:0:174:0: Attached scsi generic sg175 type 0 [Mon Dec 9 06:17:46 2019][ 41.555247] sd 1:0:175:0: Attached scsi generic sg176 type 0 [Mon Dec 9 06:17:46 2019][ 41.561114] sd 1:0:176:0: Attached scsi generic sg177 type 0 [Mon Dec 9 06:17:46 2019][ 41.567475] sd 1:0:177:0: Attached scsi generic sg178 type 0 [Mon Dec 9 06:17:46 2019][ 41.573641] sd 1:0:178:0: Attached scsi generic sg179 type 0 [Mon Dec 9 06:17:46 2019][ 41.580125] sd 1:0:179:0: Attached scsi generic sg180 type 0 [Mon Dec 9 06:17:46 2019][ 41.586444] sd 1:0:180:0: Attached scsi generic sg181 type 0 [Mon Dec 9 06:17:46 2019][ 41.592620] sd 1:0:181:0: Attached scsi generic sg182 type 0 [Mon Dec 9 06:17:46 2019][ 41.599135] sd 1:0:182:0: Attached scsi generic sg183 type 0 [Mon Dec 9 06:17:46 2019][ 41.605200] sd 1:0:183:0: Attached scsi generic sg184 type 0 [Mon Dec 9 06:17:46 2019][ 41.611259] scsi 1:0:184:0: Attached scsi generic sg185 type 13 [Mon Dec 9 06:17:46 2019][ 41.617786] sd 1:0:185:0: Attached scsi generic sg186 type 0 [Mon Dec 9 06:17:46 2019][ 41.625144] sd 1:0:186:0: Attached scsi generic sg187 type 0 [Mon Dec 9 06:17:46 2019][ 41.631113] sd 1:0:187:0: Attached scsi generic sg188 type 0 [Mon Dec 9 06:17:46 2019][ 41.637306] sd 1:0:188:0: Attached scsi generic sg189 type 0 [Mon Dec 9 
06:17:46 2019][ 41.643163] sd 1:0:189:0: Attached scsi generic sg190 type 0 [Mon Dec 9 06:17:46 2019][ 41.649153] sd 1:0:190:0: Attached scsi generic sg191 type 0 [Mon Dec 9 06:17:46 2019][ 41.655794] sd 1:0:191:0: Attached scsi generic sg192 type 0 [Mon Dec 9 06:17:46 2019][ 41.661669] sd 1:0:192:0: Attached scsi generic sg193 type 0 [Mon Dec 9 06:17:46 2019][ 41.667527] sd 1:0:193:0: Attached scsi generic sg194 type 0 [Mon Dec 9 06:17:46 2019][ 41.673356] sd 1:0:194:0: Attached scsi generic sg195 type 0 [Mon Dec 9 06:17:46 2019][ 41.679092] sd 1:0:195:0: Attached scsi generic sg196 type 0 [Mon Dec 9 06:17:46 2019][ 41.684847] sd 1:0:196:0: Attached scsi generic sg197 type 0 [Mon Dec 9 06:17:46 2019][ 41.690598] sd 1:0:197:0: Attached scsi generic sg198 type 0 [Mon Dec 9 06:17:46 2019][ 41.696312] sd 1:0:198:0: Attached scsi generic sg199 type 0 [Mon Dec 9 06:17:46 2019][ 41.702041] sd 1:0:199:0: Attached scsi generic sg200 type 0 [Mon Dec 9 06:17:46 2019][ 41.707748] sd 1:0:200:0: Attached scsi generic sg201 type 0 [Mon Dec 9 06:17:46 2019][ 41.713454] sd 1:0:201:0: Attached scsi generic sg202 type 0 [Mon Dec 9 06:17:46 2019][ 41.719176] sd 1:0:202:0: Attached scsi generic sg203 type 0 [Mon Dec 9 06:17:46 2019][ 41.724884] sd 1:0:203:0: Attached scsi generic sg204 type 0 [Mon Dec 9 06:17:46 2019][ 41.730599] sd 1:0:204:0: Attached scsi generic sg205 type 0 [Mon Dec 9 06:17:46 2019][ 41.736299] sd 1:0:205:0: Attached scsi generic sg206 type 0 [Mon Dec 9 06:17:46 2019][ 41.742022] sd 1:0:206:0: Attached scsi generic sg207 type 0 [Mon Dec 9 06:17:46 2019][ 41.747725] sd 1:0:207:0: Attached scsi generic sg208 type 0 [Mon Dec 9 06:17:46 2019][ 41.753444] sd 1:0:208:0: Attached scsi generic sg209 type 0 [Mon Dec 9 06:17:46 2019][ 41.759144] sd 1:0:209:0: Attached scsi generic sg210 type 0 [Mon Dec 9 06:17:46 2019][ 41.764853] sd 1:0:210:0: Attached scsi generic sg211 type 0 [Mon Dec 9 06:17:46 2019][ 41.770549] sd 1:0:211:0: Attached scsi generic sg212 type 0 [Mon Dec 9 06:17:46 2019][ 41.776269] sd 1:0:212:0: Attached scsi generic sg213 type 0 [Mon Dec 9 06:17:47 2019][ 41.781975] sd 1:0:213:0: Attached scsi generic sg214 type 0 [Mon Dec 9 06:17:47 2019][ 41.787692] sd 1:0:214:0: Attached scsi generic sg215 type 0 [Mon Dec 9 06:17:47 2019][ 41.793400] sd 1:0:215:0: Attached scsi generic sg216 type 0 [Mon Dec 9 06:17:47 2019][ 41.799104] sd 1:0:216:0: Attached scsi generic sg217 type 0 [Mon Dec 9 06:17:47 2019][ 41.804818] sd 1:0:217:0: Attached scsi generic sg218 type 0 [Mon Dec 9 06:17:47 2019][ 41.810527] sd 1:0:218:0: Attached scsi generic sg219 type 0 [Mon Dec 9 06:17:47 2019][ 41.816225] sd 1:0:219:0: Attached scsi generic sg220 type 0 [Mon Dec 9 06:17:47 2019][ 41.821930] sd 1:0:220:0: Attached scsi generic sg221 type 0 [Mon Dec 9 06:17:47 2019][ 41.827631] sd 1:0:221:0: Attached scsi generic sg222 type 0 [Mon Dec 9 06:17:47 2019][ 41.833351] sd 1:0:222:0: Attached scsi generic sg223 type 0 [Mon Dec 9 06:17:47 2019][ 41.839059] sd 1:0:223:0: Attached scsi generic sg224 type 0 [Mon Dec 9 06:17:47 2019][ 41.844775] sd 1:0:224:0: Attached scsi generic sg225 type 0 [Mon Dec 9 06:17:47 2019][ 41.850480] sd 1:0:225:0: Attached scsi generic sg226 type 0 [Mon Dec 9 06:17:47 2019][ 41.856185] sd 1:0:226:0: Attached scsi generic sg227 type 0 [Mon Dec 9 06:17:47 2019][ 41.861897] sd 1:0:227:0: Attached scsi generic sg228 type 0 [Mon Dec 9 06:17:47 2019][ 41.867613] sd 1:0:228:0: Attached scsi generic sg229 type 0 [Mon Dec 9 06:17:47 2019][ 41.873314] sd 1:0:229:0: Attached scsi generic sg230 type 
0 [Mon Dec 9 06:17:47 2019][ 41.879034] sd 1:0:230:0: Attached scsi generic sg231 type 0 [Mon Dec 9 06:17:47 2019][ 41.884735] sd 1:0:231:0: Attached scsi generic sg232 type 0 [Mon Dec 9 06:17:47 2019][ 41.890443] sd 1:0:232:0: Attached scsi generic sg233 type 0 [Mon Dec 9 06:17:47 2019][ 41.896143] sd 1:0:233:0: Attached scsi generic sg234 type 0 [Mon Dec 9 06:17:47 2019][ 41.901849] sd 1:0:234:0: Attached scsi generic sg235 type 0 [Mon Dec 9 06:17:47 2019][ 41.907551] sd 1:0:235:0: Attached scsi generic sg236 type 0 [Mon Dec 9 06:17:47 2019][ 41.913257] sd 1:0:236:0: Attached scsi generic sg237 type 0 [Mon Dec 9 06:17:47 2019][ 41.918971] sd 1:0:237:0: Attached scsi generic sg238 type 0 [Mon Dec 9 06:17:47 2019][ 41.924675] sd 1:0:238:0: Attached scsi generic sg239 type 0 [Mon Dec 9 06:17:47 2019][ 41.930382] sd 1:0:239:0: Attached scsi generic sg240 type 0 [Mon Dec 9 06:17:47 2019][ 41.936084] sd 1:0:240:0: Attached scsi generic sg241 type 0 [Mon Dec 9 06:17:47 2019][ 41.941798] sd 1:0:241:0: Attached scsi generic sg242 type 0 [Mon Dec 9 06:17:47 2019][ 41.947507] sd 1:0:242:0: Attached scsi generic sg243 type 0 [Mon Dec 9 06:17:47 2019][ 41.953204] sd 1:0:243:0: Attached scsi generic sg244 type 0 [Mon Dec 9 06:17:47 2019][ 41.958911] sd 1:0:244:0: Attached scsi generic sg245 type 0 [Mon Dec 9 06:17:49 2019][ 44.574939] device-mapper: multipath service-time: version 0.3.0 loaded [Mon Dec 9 06:17:54 2019][ 49.035616] ses 1:0:0:0: Attached Enclosure device [Mon Dec 9 06:17:54 2019][ 49.047420] ses 1:0:1:0: Attached Enclosure device [Mon Dec 9 06:17:54 2019][ 49.052325] ses 1:0:62:0: Attached Enclosure device [Mon Dec 9 06:17:54 2019][ 49.057324] ses 1:0:123:0: Attached Enclosure device [Mon Dec 9 06:17:54 2019][ 49.062417] ses 1:0:184:0: Attached Enclosure device [Mon Dec 9 06:17:54 2019][ 49.555777] input: PC Speaker as /devices/platform/pcspkr/input/input2 [Mon Dec 9 06:17:54 2019][ OK ] Found device /dev/ttyS0. [Mon Dec 9 06:17:54 2019][ 49.668905] cryptd: max_cpu_qlen set to 1000 [Mon Dec 9 06:17:54 2019][ 49.707892] AVX2 version of gcm_enc/dec engaged. [Mon Dec 9 06:17:54 2019][ 49.712536] AES CTR mode by8 optimization enabled [Mon Dec 9 06:17:54 2019][ 49.720922] alg: No test for __gcm-aes-aesni (__driver-gcm-aes-aesni) [Mon Dec 9 06:17:54 2019][ 49.727517] alg: No test for __generic-gcm-aes-aesni (__driver-generic-gcm-aes-aesni) [Mon Dec 9 06:17:55 2019][ 49.819024] kvm: Nested Paging enabled [Mon Dec 9 06:17:55 2019][ 49.827753] MCE: In-kernel MCE decoding enabled. [Mon Dec 9 06:17:55 2019][ 49.836456] AMD64 EDAC driver v3.4.0 [Mon Dec 9 06:17:55 2019][ 49.840068] EDAC amd64: DRAM ECC enabled. [Mon Dec 9 06:17:55 2019][ 49.844102] EDAC amd64: F17h detected (node 0). [Mon Dec 9 06:17:55 2019][ 49.848707] EDAC amd64: MC: 0: 0MB 1: 0MB [Mon Dec 9 06:17:55 2019][ 49.853424] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Mon Dec 9 06:17:55 2019][ 49.858146] EDAC amd64: MC: 4: 0MB 5: 0MB [Mon Dec 9 06:17:55 2019][ 49.862864] EDAC amd64: MC: 6: 0MB 7: 0MB [Mon Dec 9 06:17:55 2019][ 49.867586] EDAC amd64: MC: 0: 0MB 1: 0MB [Mon Dec 9 06:17:55 2019][ 49.872292] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Mon Dec 9 06:17:55 2019][ 49.877011] EDAC amd64: MC: 4: 0MB 5: 0MB [Mon Dec 9 06:17:55 2019][ 49.881722] EDAC amd64: MC: 6: 0MB 7: 0MB [Mon Dec 9 06:17:55 2019][ 49.886438] EDAC amd64: using x8 syndromes. 
[Mon Dec 9 06:17:55 2019][ 49.890635] EDAC amd64: MCT channel count: 2 [Mon Dec 9 06:17:55 2019][ 49.901276] EDAC MC0: Giving out device to 'amd64_edac' 'F17h': DEV 0000:00:18.3 [Mon Dec 9 06:17:55 2019][ 49.908683] EDAC amd64: DRAM ECC enabled. [Mon Dec 9 06:17:55 2019][ 49.912700] EDAC amd64: F17h detected (node 1). [Mon Dec 9 06:17:55 2019][ 49.917287] EDAC amd64: MC: 0: 0MB 1: 0MB [Mon Dec 9 06:17:55 2019][ 49.921999] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Mon Dec 9 06:17:55 2019][ 49.926714] EDAC amd64: MC: 4: 0MB 5: 0MB [Mon Dec 9 06:17:55 2019][ 49.931431] EDAC amd64: MC: 6: 0MB 7: 0MB [Mon Dec 9 06:17:55 2019][ 49.936151] EDAC amd64: MC: 0: 0MB 1: 0MB [Mon Dec 9 06:17:55 2019][ 49.940867] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Mon Dec 9 06:17:55 2019][ 49.945581] EDAC amd64: MC: 4: 0MB 5: 0MB [Mon Dec 9 06:17:55 2019][ 49.950294] EDAC amd64: MC: 6: 0MB 7: 0MB [Mon Dec 9 06:17:55 2019][ 49.955007] EDAC amd64: using x8 syndromes. [Mon Dec 9 06:17:55 2019][ 49.959199] EDAC amd64: MCT channel count: 2 [Mon Dec 9 06:17:55 2019][ 49.970360] EDAC MC1: Giving out device to 'amd64_edac' 'F17h': DEV 0000:00:19.3 [Mon Dec 9 06:17:55 2019][ 49.978033] EDAC amd64: DRAM ECC enabled. [Mon Dec 9 06:17:55 2019][ 49.982056] EDAC amd64: F17h detected (node 2). [Mon Dec 9 06:17:55 2019][ 49.986652] EDAC amd64: MC: 0: 0MB 1: 0MB [Mon Dec 9 06:17:55 2019][ 49.991446] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Mon Dec 9 06:17:55 2019][ 49.996186] EDAC amd64: MC: 4: 0MB 5: 0MB [Mon Dec 9 06:17:55 2019][ 50.000908] EDAC amd64: MC: 6: 0MB 7: 0MB [Mon Dec 9 06:17:55 2019][ 50.005640] EDAC amd64: MC: 0: 0MB 1: 0MB [Mon Dec 9 06:17:55 2019][ 50.010370] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Mon Dec 9 06:17:55 2019][ 50.015080] EDAC amd64: MC: 4: 0MB 5: 0MB [Mon Dec 9 06:17:55 2019][ 50.019795] EDAC amd64: MC: 6: 0MB 7: 0MB [Mon Dec 9 06:17:55 2019][ 50.024508] EDAC amd64: using x8 syndromes. [Mon Dec 9 06:17:55 2019][ 50.028706] EDAC amd64: MCT channel count: 2 [Mon Dec 9 06:17:55 2019][ 50.036602] EDAC MC2: Giving out device to 'amd64_edac' 'F17h': DEV 0000:00:1a.3 [Mon Dec 9 06:17:55 2019][ 50.044009] EDAC amd64: DRAM ECC enabled. [Mon Dec 9 06:17:55 2019][ 50.048034] EDAC amd64: F17h detected (node 3). [Mon Dec 9 06:17:55 2019][ 50.052633] EDAC amd64: MC: 0: 0MB 1: 0MB [Mon Dec 9 06:17:55 2019][ 50.057352] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Mon Dec 9 06:17:55 2019][ 50.062065] EDAC amd64: MC: 4: 0MB 5: 0MB [Mon Dec 9 06:17:55 2019][ 50.066778] EDAC amd64: MC: 6: 0MB 7: 0MB [Mon Dec 9 06:17:55 2019][ 50.071498] EDAC amd64: MC: 0: 0MB 1: 0MB [Mon Dec 9 06:17:55 2019][ 50.076207] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Mon Dec 9 06:17:55 2019][ 50.080926] EDAC amd64: MC: 4: 0MB 5: 0MB [Mon Dec 9 06:17:55 2019][ 50.085638] EDAC amd64: MC: 6: 0MB 7: 0MB [Mon Dec 9 06:17:55 2019][ 50.090351] EDAC amd64: using x8 syndromes. [Mon Dec 9 06:17:55 2019][ 50.094550] EDAC amd64: MCT channel count: 2 [Mon Dec 9 06:17:55 2019][ 50.105189] EDAC MC3: Giving out device to 'amd64_edac' 'F17h': DEV 0000:00:1b.3 [Mon Dec 9 06:17:55 2019][ 50.112934] EDAC PCI0: Giving out device to module 'amd64_edac' controller 'EDAC PCI controller': DEV '0000:00:18.0' (POLLED) [Mon Dec 9 06:17:56 2019][ 50.843845] dcdbas dcdbas: Dell Systems Management Base Driver (version 5.6.0-3.3) [Mon Dec 9 06:17:56 2019]%G%G[ OK ] Found device PERC_H330_Mini EFI\x20System\x20Partition. [Mon Dec 9 06:18:20 2019] [ 75.482153] Adding 4194300k swap on /dev/sda3. Priority:-2 extents:1 across:4194300k FS [Mon Dec 9 06:18:20 2019] Mounting /boot/efi... 
[Mon Dec 9 06:18:20 2019][ OK ] Found device PERC_H330_Mini 3.
[Mon Dec 9 06:18:20 2019] Activating swap /dev/disk/by-uuid/4...7-253b-4b35-98bd-0ebd94f347e5...
[Mon Dec 9 06:18:20 2019][ OK ] Activated swap /dev/disk/by-uuid/401ce0e7-253b-4b35-98bd-0ebd94f347e5.
[Mon Dec 9 06:18:20 2019][ OK ] Reached target Swap.
[Mon Dec 9 06:18:20 2019][ OK ] Mounted /boot/efi.
[Mon Dec 9 06:18:20 2019][ OK ] Reached target Local File Systems.
[Mon Dec 9 06:18:20 2019][ 75.523442] type=1305 audit(1575901100.013:3): audit_pid=49307 old=0 auid=4294967295 ses=4294967295 res=1
[Mon Dec 9 06:18:20 2019] Starting Preprocess NFS configuration...
[Mon Dec 9 06:18:20 2019] Starting Tell Plymouth To Write Out Runtime Data...
[Mon Dec 9 06:18:20 2019] Starting Import network configuration from initramfs...
[Mon Dec 9 06:18:20 2019][ 75.545883] RPC: Registered named UNIX socket transport module.
[Mon Dec 9 06:18:20 2019][ 75.552220] RPC: Registered udp transport module.
[Mon Dec 9 06:18:20 2019][ 75.558315] RPC: Registered tcp transport module.
[Mon Dec 9 06:18:20 2019][ 75.564406] RPC: Registered tcp NFSv4.1 backchannel transport module.
[Mon Dec 9 06:18:20 2019][ OK ] Started Preprocess NFS configuration.
[Mon Dec 9 06:18:20 2019][ OK ] Started Tell Plymouth To Write Out Runtime Data.
[Mon Dec 9 06:18:20 2019][ OK ] Started Import network configuration from initramfs.
[Mon Dec 9 06:18:20 2019] Starting Create Volatile Files and Directories...
[Mon Dec 9 06:18:20 2019][ OK ] Started Create Volatile Files and Directories.
[Mon Dec 9 06:18:20 2019] Starting Security Auditing Service...
[Mon Dec 9 06:18:20 2019] Mounting RPC Pipe File System...
[Mon Dec 9 06:18:20 2019][ OK ] Mounted RPC Pipe File System.
[Mon Dec 9 06:18:20 2019][ OK ] Reached target rpc_pipefs.target.
[Mon Dec 9 06:18:20 2019][ OK ] Started Security Auditing Service.
[Mon Dec 9 06:18:20 2019] Starting Update UTMP about System Boot/Shutdown...
[Mon Dec 9 06:18:20 2019][ OK ] Started Update UTMP about System Boot/Shutdown.
[Mon Dec 9 06:18:20 2019][ OK ] Reached target System Initialization.
[Mon Dec 9 06:18:20 2019] Starting openibd - configure Mellanox devices...
[Mon Dec 9 06:18:20 2019][ OK ] Listening on D-Bus System Message Bus Socket.
[Mon Dec 9 06:18:20 2019][ OK ] Started Daily Cleanup of Temporary Directories.
[Mon Dec 9 06:18:20 2019][ OK ] Reached target Timers.
[Mon Dec 9 06:18:20 2019][ OK ] Listening on RPCbind Server Activation Socket.
[Mon Dec 9 06:18:20 2019][ OK ] Reached target Sockets.
[Mon Dec 9 06:18:20 2019][ OK ] Reached target Basic System.
[Mon Dec 9 06:18:20 2019] Starting Load CPU microcode update...
[Mon Dec 9 06:18:20 2019] Starting Systems Management Event Management...
[Mon Dec 9 06:18:20 2019] Starting Resets System Activity Logs...
[Mon Dec 9 06:18:20 2019][ OK ] Started D-Bus System Message Bus.
[Mon Dec 9 06:18:20 2019] Starting Authorization Manager...
[Mon Dec 9 06:18:20 2019] Starting Software RAID monitoring and management...
[Mon Dec 9 06:18:20 2019] Starting System Security Services Daemon...
[Mon Dec 9 06:18:20 2019] Starting NTP client/server...
[Mon Dec 9 06:18:20 2019][ OK ] Started Self Monitoring and Reporting Technology (SMART) Daemon.
[Mon Dec 9 06:18:20 2019] Starting Systems Management Device Drivers...
[Mon Dec 9 06:18:20 2019] Starting GSSAPI Proxy Daemon...
[Mon Dec 9 06:18:20 2019][ OK ] Started irqbalance daemon.
[Mon Dec 9 06:18:20 2019] Starting Dump dmesg to /var/log/dmesg...
[Mon Dec 9 06:18:20 2019] Starting RPC bind service...
[Mon Dec 9 06:18:20 2019][ OK ] Started Systems Management Event Management.
[Mon Dec 9 06:18:20 2019][ OK ] Started Software RAID monitoring and management.
[Mon Dec 9 06:18:20 2019][ OK ] Started Resets System Activity Logs.
[Mon Dec 9 06:18:20 2019][ OK ] Started GSSAPI Proxy Daemon.
[Mon Dec 9 06:18:21 2019][ OK ] Reached target NFS client services.
[Mon Dec 9 06:18:21 2019][ OK ] Started Authorization Manager.
[Mon Dec 9 06:18:21 2019][ OK ] Started RPC bind service.
[Mon Dec 9 06:18:21 2019][ OK ] Started Load CPU microcode update.
[Mon Dec 9 06:18:21 2019][ OK ] Started Dump dmesg to /var/log/dmesg.
[Mon Dec 9 06:18:21 2019][ OK ] Started NTP client/server.
[Mon Dec 9 06:18:21 2019][ OK ] Started System Security Services Daemon.
[Mon Dec 9 06:18:21 2019][ OK ] Reached target User and Group Name Lookups.
[Mon Dec 9 06:18:21 2019] Starting Login Service...
[Mon Dec 9 06:18:21 2019][ OK ] Started Login Service.
[Mon Dec 9 06:18:21 2019][ 76.205183] mlx5_core 0000:01:00.0: slow_pci_heuristic:5575:(pid 49586): Max link speed = 100000, PCI BW = 126016
[Mon Dec 9 06:18:21 2019][ 76.215508] mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0)
[Mon Dec 9 06:18:21 2019][ 76.223801] mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0)
[Mon Dec 9 06:18:21 2019][ OK ] Created slice system-mlnx_interface_mgr.slice.
[Mon Dec 9 06:18:21 2019][ OK ] Started mlnx_interface_mgr - configure ib0.
[Mon Dec 9 06:18:21 2019][ OK ] Started openibd - configure Mellanox devices.
[Mon Dec 9 06:18:21 2019][ OK ] Reached target Remote File Systems (Pre).
[Mon Dec 9 06:18:21 2019] Starting LSB: Bring up/down networking...
[Mon Dec 9 06:18:29 2019][ OK ] Started Systems Management Device Drivers.
[Mon Dec 9 06:18:29 2019] Starting Systems Management Data Engine...
[Mon Dec 9 06:18:29 2019][ 76.810728] IPv6: ADDRCONF(NETDEV_UP): em1: link is not ready
[Mon Dec 9 06:18:29 2019][ 80.343797] tg3 0000:81:00.0 em1: Link is up at 1000 Mbps, full duplex
[Mon Dec 9 06:18:29 2019][ 80.350338] tg3 0000:81:00.0 em1: Flow control is on for TX and on for RX
[Mon Dec 9 06:18:29 2019][ 80.357183] tg3 0000:81:00.0 em1: EEE is enabled
[Mon Dec 9 06:18:29 2019][ 80.361816] IPv6: ADDRCONF(NETDEV_CHANGE): em1: link becomes ready
[Mon Dec 9 06:18:29 2019][ 81.170910] IPv6: ADDRCONF(NETDEV_UP): ib0: link is not ready
[Mon Dec 9 06:18:29 2019][ 81.444140] IPv6: ADDRCONF(NETDEV_CHANGE): ib0: link becomes ready
[Mon Dec 9 06:18:30 2019][ OK ] Started LSB: Bring up/down networking.
[Mon Dec 9 06:18:30 2019][ OK ] Reached target Network.
[Mon Dec 9 06:18:30 2019][ OK ] Reached target Network is Online.
[Mon Dec 9 06:18:30 2019] Starting Notify NFS peers of a restart...
[Mon Dec 9 06:18:30 2019][ 85.577696] FS-Cache: Loaded
[Mon Dec 9 06:18:30 2019] Starting Collectd statistics daemon...
[Mon Dec 9 06:18:30 2019] Starting Postfix Mail Transport Agent...
[Mon Dec 9 06:18:30 2019] Starting Dynamic System Tuning Daemon...
[Mon Dec 9 06:18:30 2019] Starting System Logging Service...
[Mon Dec 9 06:18:30 2019] Mounting /share...
[Mon Dec 9 06:18:30 2019] Starting OpenSSH server daemon...
[Mon Dec 9 06:18:30 2019][ 85.608522] FS-Cache: Netfs 'nfs' registered for caching
[Mon Dec 9 06:18:30 2019][ OK ] Started Notify NFS peers of a restart.
[Mon Dec 9 06:18:30 2019][ 85.618255] Key type dns_resolver registered
[Mon Dec 9 06:18:30 2019][ OK ] Started Collectd statistics daemon.
[Mon Dec 9 06:18:30 2019][ OK ] Started System Logging Service.
[Mon Dec 9 06:18:30 2019] Starting xcat service on compute no...script and update node status...
[Mon Dec 9 06:18:30 2019][ OK ] Started OpenSSH server daemon.
[Mon Dec 9 06:18:30 2019][ OK ] Started xcat service on compute nod...otscript and update node status.
[Mon Dec 9 06:18:30 2019][ 85.646616] NFS: Registering the id_resolver key type
[Mon Dec 9 06:18:30 2019][ 85.651939] Key type id_resolver registered
[Mon Dec 9 06:18:30 2019][ 85.657486] Key type id_legacy registered
[Mon Dec 9 06:18:30 2019][ OK ] Mounted /share.
[Mon Dec 9 06:18:30 2019][ OK ] Reached target Remote File Systems.
[Mon Dec 9 06:18:30 2019] Starting Crash recovery kernel arming...
[Mon Dec 9 06:18:30 2019] Starting Permit User Sessions...
[Mon Dec 9 06:18:30 2019][ OK ] Started Permit User Sessions.
[Mon Dec 9 06:18:30 2019][ OK ] Started Lookout metrics collector.
[Mon Dec 9 06:18:30 2019] Starting Terminate Plymouth Boot Screen...
[Mon Dec 9 06:18:30 2019][ OK ] Started Command Scheduler.
[Mon Dec 9 06:18:30 2019] Starting Wait for Plymouth Boot Screen to Quit...
[Mon Dec 9 06:18:36 2019]
[Mon Dec 9 06:18:36 2019]CentOS Linux 7 (Core)
[Mon Dec 9 06:18:36 2019]Kernel 3.10.0-957.27.2.el7_lustre.pl2.x86_64 on an x86_64
[Mon Dec 9 06:18:36 2019]
[Mon Dec 9 06:18:36 2019]fir-io8-s1 login: [ 191.261132] mpt3sas_cm0: log_info(0x31200205): originator(PL), code(0x20), sub_code(0x0205)
[Mon Dec 9 06:22:08 2019][ 303.048704] LNet: HW NUMA nodes: 4, HW CPU cores: 48, npartitions: 4
[Mon Dec 9 06:22:08 2019][ 303.056201] alg: No test for adler32 (adler32-zlib)
[Mon Dec 9 06:22:09 2019][ 303.856304] Lustre: Lustre: Build Version: 2.12.3_4_g142b4d4
[Mon Dec 9 06:22:09 2019][ 303.961403] LNet: 63766:0:(config.c:1627:lnet_inet_enumerate()) lnet: Ignoring interface em2: it's down
[Mon Dec 9 06:22:09 2019][ 303.971183] LNet: Using FastReg for registration
[Mon Dec 9 06:22:09 2019][ 303.988081] LNet: Added LNI 10.0.10.115@o2ib7 [8/256/0/180]
[Mon Dec 9 06:52:17 2019][ 2112.595584] md: md6 stopped.
[Mon Dec 9 06:52:17 2019][ 2112.606902] async_tx: api initialized (async) [Mon Dec 9 06:52:17 2019][ 2112.613260] xor: automatically using best checksumming function: [Mon Dec 9 06:52:17 2019][ 2112.628302] avx : 9596.000 MB/sec [Mon Dec 9 06:52:17 2019][ 2112.661307] raid6: sse2x1 gen() 6082 MB/s [Mon Dec 9 06:52:17 2019][ 2112.682304] raid6: sse2x2 gen() 11304 MB/s [Mon Dec 9 06:52:17 2019][ 2112.703304] raid6: sse2x4 gen() 12933 MB/s [Mon Dec 9 06:52:17 2019][ 2112.724303] raid6: avx2x1 gen() 14250 MB/s [Mon Dec 9 06:52:17 2019][ 2112.745309] raid6: avx2x2 gen() 18863 MB/s [Mon Dec 9 06:52:17 2019][ 2112.766304] raid6: avx2x4 gen() 18812 MB/s [Mon Dec 9 06:52:17 2019][ 2112.770579] raid6: using algorithm avx2x2 gen() (18863 MB/s) [Mon Dec 9 06:52:17 2019][ 2112.776241] raid6: using avx2x2 recovery algorithm [Mon Dec 9 06:52:17 2019][ 2112.797874] md/raid:md6: device dm-1 operational as raid disk 0 [Mon Dec 9 06:52:17 2019][ 2112.803814] md/raid:md6: device dm-34 operational as raid disk 9 [Mon Dec 9 06:52:17 2019][ 2112.809838] md/raid:md6: device dm-54 operational as raid disk 8 [Mon Dec 9 06:52:17 2019][ 2112.815854] md/raid:md6: device dm-16 operational as raid disk 7 [Mon Dec 9 06:52:17 2019][ 2112.821866] md/raid:md6: device dm-6 operational as raid disk 6 [Mon Dec 9 06:52:18 2019][ 2112.827795] md/raid:md6: device dm-42 operational as raid disk 5 [Mon Dec 9 06:52:18 2019][ 2112.833812] md/raid:md6: device dm-26 operational as raid disk 4 [Mon Dec 9 06:52:18 2019][ 2112.839823] md/raid:md6: device dm-17 operational as raid disk 3 [Mon Dec 9 06:52:18 2019][ 2112.845844] md/raid:md6: device dm-5 operational as raid disk 2 [Mon Dec 9 06:52:18 2019][ 2112.851766] md/raid:md6: device dm-43 operational as raid disk 1 [Mon Dec 9 06:52:18 2019][ 2112.858918] md/raid:md6: raid level 6 active with 10 out of 10 devices, algorithm 2 [Mon Dec 9 06:52:18 2019][ 2112.898947] md6: detected capacity change from 0 to 64011422924800 [Mon Dec 9 06:52:18 2019][ 2112.916672] md: md8 stopped. [Mon Dec 9 06:52:18 2019][ 2112.928560] md/raid:md8: device dm-44 operational as raid disk 0 [Mon Dec 9 06:52:18 2019][ 2112.934581] md/raid:md8: device dm-46 operational as raid disk 9 [Mon Dec 9 06:52:18 2019][ 2112.940592] md/raid:md8: device dm-23 operational as raid disk 8 [Mon Dec 9 06:52:18 2019][ 2112.946613] md/raid:md8: device dm-33 operational as raid disk 7 [Mon Dec 9 06:52:18 2019][ 2112.952647] md/raid:md8: device dm-45 operational as raid disk 6 [Mon Dec 9 06:52:18 2019][ 2112.958684] md/raid:md8: device dm-4 operational as raid disk 5 [Mon Dec 9 06:52:18 2019][ 2112.964624] md/raid:md8: device dm-8 operational as raid disk 4 [Mon Dec 9 06:52:18 2019][ 2112.970570] md/raid:md8: device dm-25 operational as raid disk 3 [Mon Dec 9 06:52:18 2019][ 2112.976615] md/raid:md8: device dm-21 operational as raid disk 2 [Mon Dec 9 06:52:18 2019][ 2112.982625] md/raid:md8: device dm-53 operational as raid disk 1 [Mon Dec 9 06:52:18 2019][ 2112.989407] md/raid:md8: raid level 6 active with 10 out of 10 devices, algorithm 2 [Mon Dec 9 06:52:18 2019][ 2113.025969] md8: detected capacity change from 0 to 64011422924800 [Mon Dec 9 06:52:18 2019][ 2113.055680] md: md4 stopped. 
[Mon Dec 9 06:52:18 2019][ 2113.067145] md/raid:md4: device dm-116 operational as raid disk 0 [Mon Dec 9 06:52:18 2019][ 2113.073272] md/raid:md4: device dm-100 operational as raid disk 9 [Mon Dec 9 06:52:18 2019][ 2113.079372] md/raid:md4: device dm-107 operational as raid disk 8 [Mon Dec 9 06:52:18 2019][ 2113.085471] md/raid:md4: device dm-94 operational as raid disk 7 [Mon Dec 9 06:52:18 2019][ 2113.091486] md/raid:md4: device dm-84 operational as raid disk 6 [Mon Dec 9 06:52:18 2019][ 2113.097499] md/raid:md4: device dm-76 operational as raid disk 5 [Mon Dec 9 06:52:18 2019][ 2113.103513] md/raid:md4: device dm-83 operational as raid disk 4 [Mon Dec 9 06:52:18 2019][ 2113.109531] md/raid:md4: device dm-66 operational as raid disk 3 [Mon Dec 9 06:52:18 2019][ 2113.115547] md/raid:md4: device dm-69 operational as raid disk 2 [Mon Dec 9 06:52:18 2019][ 2113.121564] md/raid:md4: device dm-117 operational as raid disk 1 [Mon Dec 9 06:52:18 2019][ 2113.128560] md/raid:md4: raid level 6 active with 10 out of 10 devices, algorithm 2 [Mon Dec 9 06:52:18 2019][ 2113.169475] md4: detected capacity change from 0 to 64011422924800 [Mon Dec 9 06:52:18 2019][ 2113.196823] md: md0 stopped. [Mon Dec 9 06:52:18 2019][ 2113.210826] md/raid:md0: device dm-60 operational as raid disk 0 [Mon Dec 9 06:52:18 2019][ 2113.216844] md/raid:md0: device dm-95 operational as raid disk 9 [Mon Dec 9 06:52:18 2019][ 2113.222857] md/raid:md0: device dm-91 operational as raid disk 8 [Mon Dec 9 06:52:18 2019][ 2113.228876] md/raid:md0: device dm-80 operational as raid disk 7 [Mon Dec 9 06:52:18 2019][ 2113.234891] md/raid:md0: device dm-88 operational as raid disk 6 [Mon Dec 9 06:52:18 2019][ 2113.240903] md/raid:md0: device dm-65 operational as raid disk 5 [Mon Dec 9 06:52:18 2019][ 2113.246915] md/raid:md0: device dm-64 operational as raid disk 4 [Mon Dec 9 06:52:18 2019][ 2113.252934] md/raid:md0: device dm-89 operational as raid disk 3 [Mon Dec 9 06:52:18 2019][ 2113.258949] md/raid:md0: device dm-74 operational as raid disk 2 [Mon Dec 9 06:52:18 2019][ 2113.264965] md/raid:md0: device dm-104 operational as raid disk 1 [Mon Dec 9 06:52:18 2019][ 2113.271903] md/raid:md0: raid level 6 active with 10 out of 10 devices, algorithm 2 [Mon Dec 9 06:52:18 2019][ 2113.308972] md0: detected capacity change from 0 to 64011422924800 [Mon Dec 9 06:52:18 2019][ 2113.341960] md: md10 stopped. 
[Mon Dec 9 06:52:18 2019][ 2113.354658] md/raid:md10: device dm-58 operational as raid disk 0 [Mon Dec 9 06:52:18 2019][ 2113.360759] md/raid:md10: device dm-18 operational as raid disk 9 [Mon Dec 9 06:52:18 2019][ 2113.366857] md/raid:md10: device dm-57 operational as raid disk 8 [Mon Dec 9 06:52:18 2019][ 2113.372959] md/raid:md10: device dm-15 operational as raid disk 7 [Mon Dec 9 06:52:18 2019][ 2113.379058] md/raid:md10: device dm-7 operational as raid disk 6 [Mon Dec 9 06:52:18 2019][ 2113.385073] md/raid:md10: device dm-27 operational as raid disk 5 [Mon Dec 9 06:52:18 2019][ 2113.391173] md/raid:md10: device dm-40 operational as raid disk 4 [Mon Dec 9 06:52:18 2019][ 2113.397273] md/raid:md10: device dm-28 operational as raid disk 3 [Mon Dec 9 06:52:18 2019][ 2113.403369] md/raid:md10: device dm-3 operational as raid disk 2 [Mon Dec 9 06:52:18 2019][ 2113.409384] md/raid:md10: device dm-56 operational as raid disk 1 [Mon Dec 9 06:52:18 2019][ 2113.416142] md/raid:md10: raid level 6 active with 10 out of 10 devices, algorithm 2 [Mon Dec 9 06:52:18 2019][ 2113.466385] md10: detected capacity change from 0 to 64011422924800 [Mon Dec 9 06:52:18 2019][ 2113.495745] md: md2 stopped. [Mon Dec 9 06:52:18 2019][ 2113.508936] md/raid:md2: device dm-119 operational as raid disk 0 [Mon Dec 9 06:52:18 2019][ 2113.515039] md/raid:md2: device dm-99 operational as raid disk 9 [Mon Dec 9 06:52:18 2019][ 2113.521056] md/raid:md2: device dm-114 operational as raid disk 8 [Mon Dec 9 06:52:18 2019][ 2113.527163] md/raid:md2: device dm-79 operational as raid disk 7 [Mon Dec 9 06:52:18 2019][ 2113.533173] md/raid:md2: device dm-86 operational as raid disk 6 [Mon Dec 9 06:52:18 2019][ 2113.539190] md/raid:md2: device dm-77 operational as raid disk 5 [Mon Dec 9 06:52:18 2019][ 2113.545206] md/raid:md2: device dm-73 operational as raid disk 4 [Mon Dec 9 06:52:18 2019][ 2113.551224] md/raid:md2: device dm-101 operational as raid disk 3 [Mon Dec 9 06:52:18 2019][ 2113.557329] md/raid:md2: device dm-105 operational as raid disk 2 [Mon Dec 9 06:52:18 2019][ 2113.563436] md/raid:md2: device dm-106 operational as raid disk 1 [Mon Dec 9 06:52:18 2019][ 2113.570345] md/raid:md2: raid level 6 active with 10 out of 10 devices, algorithm 2 [Mon Dec 9 06:52:18 2019][ 2113.592487] md2: detected capacity change from 0 to 64011422924800 [Mon Dec 9 06:52:18 2019][ 2113.817580] LDISKFS-fs (md6): file extents enabled, maximum tree depth=5 [Mon Dec 9 06:52:19 2019][ 2113.946595] LDISKFS-fs (md8): file extents enabled, maximum tree depth=5 [Mon Dec 9 06:52:19 2019][ 2114.147645] LDISKFS-fs (md6): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Mon Dec 9 06:52:19 2019][ 2114.202582] LDISKFS-fs (md4): file extents enabled, maximum tree depth=5 [Mon Dec 9 06:52:19 2019][ 2114.328693] LDISKFS-fs (md0): file extents enabled, maximum tree depth=5 [Mon Dec 9 06:52:19 2019][ 2114.342963] LDISKFS-fs (md8): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Mon Dec 9 06:52:19 2019][ 2114.534044] LDISKFS-fs (md4): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Mon Dec 9 06:52:19 2019][ 2114.659742] LDISKFS-fs (md0): mounted filesystem with ordered data mode. 
Opts: errors=remount-ro,no_mbcache,nodelalloc [Mon Dec 9 06:52:19 2019][ 2114.683575] LDISKFS-fs (md10): file extents enabled, maximum tree depth=5 [Mon Dec 9 06:52:19 2019][ 2114.694623] LDISKFS-fs (md2): file extents enabled, maximum tree depth=5 [Mon Dec 9 06:52:19 2019][ 2114.774385] LustreError: 137-5: fir-OST0054_UUID: not available for connect from 10.8.27.17@o2ib6 (no target). If you are running an HA pair check that the target is mounted on the other server. [Mon Dec 9 06:52:19 2019][ 2114.791669] LustreError: Skipped 17 previous similar messages [Mon Dec 9 06:52:20 2019][ 2114.921315] Lustre: fir-OST005a: Not available for connect from 10.9.117.8@o2ib4 (not set up) [Mon Dec 9 06:52:20 2019][ 2114.929849] Lustre: Skipped 1 previous similar message [Mon Dec 9 06:52:20 2019][ 2115.042213] LDISKFS-fs (md10): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Mon Dec 9 06:52:20 2019][ 2115.282969] LustreError: 137-5: fir-OST005c_UUID: not available for connect from 10.9.117.27@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Mon Dec 9 06:52:20 2019][ 2115.300335] LustreError: Skipped 220 previous similar messages [Mon Dec 9 06:52:20 2019][ 2115.453675] Lustre: fir-OST005a: Not available for connect from 10.9.102.25@o2ib4 (not set up) [Mon Dec 9 06:52:20 2019][ 2115.462300] Lustre: Skipped 39 previous similar messages [Mon Dec 9 06:52:21 2019][ 2116.011575] LDISKFS-fs (md2): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Mon Dec 9 06:52:21 2019][ 2116.086764] Lustre: fir-OST005a: Imperative Recovery enabled, recovery window shrunk from 300-900 down to 150-900 [Mon Dec 9 06:52:21 2019][ 2116.097595] Lustre: fir-OST005a: in recovery but waiting for the first client to connect [Mon Dec 9 06:52:21 2019][ 2116.097864] Lustre: fir-OST005a: Will be in recovery for at least 2:30, or until 1291 clients reconnect [Mon Dec 9 06:52:21 2019][ 2116.098139] Lustre: fir-OST005a: Connection restored to (at 10.8.24.17@o2ib6) [Mon Dec 9 06:52:21 2019][ 2116.301499] LustreError: 137-5: fir-OST0054_UUID: not available for connect from 10.9.105.45@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. 
[Mon Dec 9 06:52:21 2019][ 2116.318868] LustreError: Skipped 415 previous similar messages [Mon Dec 9 06:52:21 2019][ 2116.607374] Lustre: fir-OST005c: Connection restored to 7f6916f2-c589-3558-df52-0f5294f8fa05 (at 10.9.102.19@o2ib4) [Mon Dec 9 06:52:21 2019][ 2116.617640] Lustre: fir-OST0058: Not available for connect from 10.9.102.60@o2ib4 (not set up) [Mon Dec 9 06:52:21 2019][ 2116.617642] Lustre: Skipped 52 previous similar messages [Mon Dec 9 06:52:21 2019][ 2116.631749] Lustre: Skipped 46 previous similar messages [Mon Dec 9 06:52:21 2019][ 2116.753977] Lustre: fir-OST0058: Imperative Recovery enabled, recovery window shrunk from 300-900 down to 150-900 [Mon Dec 9 06:52:21 2019][ 2116.764243] Lustre: Skipped 1 previous similar message [Mon Dec 9 06:52:21 2019][ 2116.769973] Lustre: fir-OST0058: in recovery but waiting for the first client to connect [Mon Dec 9 06:52:21 2019][ 2116.772765] Lustre: fir-OST0058: Will be in recovery for at least 2:30, or until 1291 clients reconnect [Mon Dec 9 06:52:21 2019][ 2116.772766] Lustre: Skipped 1 previous similar message [Mon Dec 9 06:52:21 2019][ 2116.792617] Lustre: Skipped 1 previous similar message [Mon Dec 9 06:52:22 2019][ 2117.610633] Lustre: fir-OST005e: Connection restored to (at 10.8.24.1@o2ib6) [Mon Dec 9 06:52:22 2019][ 2117.617780] Lustre: Skipped 209 previous similar messages [Mon Dec 9 06:52:22 2019][ 2117.757906] Lustre: fir-OST0056: Imperative Recovery enabled, recovery window shrunk from 300-900 down to 150-900 [Mon Dec 9 06:52:22 2019][ 2117.768170] Lustre: Skipped 2 previous similar messages [Mon Dec 9 06:52:22 2019][ 2117.773910] Lustre: fir-OST0056: in recovery but waiting for the first client to connect [Mon Dec 9 06:52:22 2019][ 2117.782003] Lustre: Skipped 2 previous similar messages [Mon Dec 9 06:52:22 2019][ 2117.788887] Lustre: fir-OST0056: Will be in recovery for at least 2:30, or until 1291 clients reconnect [Mon Dec 9 06:52:22 2019][ 2117.798284] Lustre: Skipped 2 previous similar messages [Mon Dec 9 06:52:24 2019][ 2119.614365] Lustre: fir-OST005e: Connection restored to f5313c6f-3647-048f-259c-ceddb6cbc1d1 (at 10.9.103.43@o2ib4) [Mon Dec 9 06:52:24 2019][ 2119.614367] Lustre: fir-OST005c: Connection restored to f5313c6f-3647-048f-259c-ceddb6cbc1d1 (at 10.9.103.43@o2ib4) [Mon Dec 9 06:52:24 2019][ 2119.614369] Lustre: fir-OST0056: Connection restored to f5313c6f-3647-048f-259c-ceddb6cbc1d1 (at 10.9.103.43@o2ib4) [Mon Dec 9 06:52:24 2019][ 2119.614370] Lustre: fir-OST005a: Connection restored to f5313c6f-3647-048f-259c-ceddb6cbc1d1 (at 10.9.103.43@o2ib4) [Mon Dec 9 06:52:24 2019][ 2119.614371] Lustre: fir-OST0058: Connection restored to f5313c6f-3647-048f-259c-ceddb6cbc1d1 (at 10.9.103.43@o2ib4) [Mon Dec 9 06:52:24 2019][ 2119.614373] Lustre: Skipped 784 previous similar messages [Mon Dec 9 06:52:24 2019][ 2119.614374] Lustre: Skipped 784 previous similar messages [Mon Dec 9 06:52:24 2019][ 2119.614375] Lustre: Skipped 784 previous similar messages [Mon Dec 9 06:52:24 2019][ 2119.614379] Lustre: Skipped 784 previous similar messages [Mon Dec 9 06:52:24 2019][ 2119.688119] Lustre: Skipped 17 previous similar messages [Mon Dec 9 06:52:27 2019][ 2122.534723] Lustre: fir-OST005e: Denying connection for new client 5f11dd29-1211-44a2-2612-f8309cf085b3 (at 10.8.21.18@o2ib6), waiting for 1290 known clients (528 recovered, 17 in progress, and 0 evicted) to recover in 2:24 [Mon Dec 9 06:52:27 2019][ 2122.554537] Lustre: Skipped 2 previous similar messages [Mon Dec 9 06:52:32 2019][ 2127.578715] Lustre: fir-OST0058: 
Recovery over after 0:11, of 1291 clients 1291 recovered and 0 were evicted. [Mon Dec 9 06:52:32 2019][ 2127.604937] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11587727 to 0x1800000401:11587777 [Mon Dec 9 06:52:32 2019][ 2127.636081] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3010908 to 0x1a80000402:3010945 [Mon Dec 9 06:52:32 2019][ 2127.646210] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11643012 to 0x1980000401:11643041 [Mon Dec 9 06:52:32 2019][ 2127.651249] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3015076 to 0x1900000402:3015105 [Mon Dec 9 06:52:32 2019][ 2127.666810] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11601326 to 0x1a80000400:11601409 [Mon Dec 9 06:52:32 2019][ 2127.676743] Lustre: fir-OST005e: deleting orphan objects from 0x0:27453436 to 0x0:27453473 [Mon Dec 9 06:52:32 2019][ 2127.677004] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000400:11457723 to 0x1a00000400:11457761 [Mon Dec 9 06:52:32 2019][ 2127.700276] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:2978880 to 0x1a00000401:2978945 [Mon Dec 9 06:52:32 2019][ 2127.702764] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:792278 to 0x1980000400:792321 [Mon Dec 9 06:52:32 2019][ 2127.703020] Lustre: fir-OST005a: deleting orphan objects from 0x0:27548344 to 0x0:27548385 [Mon Dec 9 06:52:32 2019][ 2127.704825] Lustre: fir-OST0056: deleting orphan objects from 0x1880000402:3009048 to 0x1880000402:3009121 [Mon Dec 9 06:52:32 2019][ 2127.707244] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:788482 to 0x1a80000401:788513 [Mon Dec 9 06:52:32 2019][ 2127.712646] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3018544 to 0x1980000402:3018625 [Mon Dec 9 06:52:32 2019][ 2127.737264] Lustre: fir-OST0056: deleting orphan objects from 0x1880000400:11579442 to 0x1880000400:11579521 [Mon Dec 9 06:52:32 2019][ 2127.757860] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:789629 to 0x1900000400:789665 [Mon Dec 9 06:52:32 2019][ 2127.758850] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11597256 to 0x1900000401:11597281 [Mon Dec 9 06:52:32 2019][ 2127.782821] Lustre: fir-OST0058: deleting orphan objects from 0x0:27492955 to 0x0:27492993 [Mon Dec 9 06:52:32 2019][ 2127.784564] Lustre: fir-OST0056: deleting orphan objects from 0x1880000401:789050 to 0x1880000401:789089 [Mon Dec 9 06:52:32 2019][ 2127.801547] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3000247 to 0x1800000400:3000289 [Mon Dec 9 06:52:32 2019][ 2127.814050] Lustre: fir-OST005c: deleting orphan objects from 0x0:27178781 to 0x0:27178817 [Mon Dec 9 06:52:32 2019][ 2127.831910] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:777103 to 0x1a00000402:777121 [Mon Dec 9 06:52:33 2019][ 2127.849043] Lustre: fir-OST0054: deleting orphan objects from 0x0:27444185 to 0x0:27444225 [Mon Dec 9 06:52:33 2019][ 2127.852518] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:786910 to 0x1800000402:786945 [Mon Dec 9 06:52:33 2019][ 2127.914689] Lustre: fir-OST0056: deleting orphan objects from 0x0:27466201 to 0x0:27466241 [Mon Dec 9 06:52:52 2019][ 2147.537429] Lustre: fir-OST005e: Connection restored to 5f11dd29-1211-44a2-2612-f8309cf085b3 (at 10.8.21.18@o2ib6) [Mon Dec 9 06:52:52 2019][ 2147.547785] Lustre: Skipped 6485 previous similar messages [Mon Dec 9 08:53:21 2019][ 9376.001426] Lustre: fir-OST0054: haven't heard from client 
798dc93c-11ba-328e-acec-b07846966ea5 (at 10.8.0.67@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892252ab7800, cur 1575910401 expire 1575910251 last 1575910174 [Mon Dec 9 09:49:21 2019][12736.930259] Lustre: fir-OST0054: Connection restored to 798dc93c-11ba-328e-acec-b07846966ea5 (at 10.8.0.67@o2ib6) [Mon Dec 9 09:49:21 2019][12736.940519] Lustre: Skipped 5 previous similar messages [Mon Dec 9 09:51:55 2019][12890.468581] Lustre: fir-OST0054: Connection restored to 9a70df35-6de0-4 (at 10.8.19.7@o2ib6) [Mon Dec 9 09:51:55 2019][12890.477027] Lustre: Skipped 5 previous similar messages [Mon Dec 9 09:52:03 2019][12898.060656] Lustre: fir-OST0058: haven't heard from client 1d08460b-716a-03a1-30aa-d26bf61d87fe (at 10.8.0.65@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8922804ed000, cur 1575913923 expire 1575913773 last 1575913696 [Mon Dec 9 09:52:03 2019][12898.082275] Lustre: Skipped 5 previous similar messages [Mon Dec 9 10:01:44 2019][13479.118042] Lustre: fir-OST0054: Connection restored to 8171b0fd-9423-4 (at 10.9.109.27@o2ib4) [Mon Dec 9 10:01:44 2019][13479.126662] Lustre: Skipped 4 previous similar messages [Mon Dec 9 10:28:31 2019][15086.261148] Lustre: fir-OST0054: Connection restored to 1d08460b-716a-03a1-30aa-d26bf61d87fe (at 10.8.0.65@o2ib6) [Mon Dec 9 10:28:31 2019][15086.271416] Lustre: Skipped 5 previous similar messages [Mon Dec 9 11:39:58 2019][19374.041469] LustreError: 67930:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005e: cli f9f503f0-6ff6-698f-9a8d-14bd128a6d42 claims 16801792 GRANT, real grant 16752640 [Mon Dec 9 12:35:20 2019][22695.260464] Lustre: fir-OST0054: haven't heard from client 83281a6e-8cdd-af0c-d930-afb3d26c7eba (at 10.8.23.14@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892252b8a000, cur 1575923720 expire 1575923570 last 1575923493 [Mon Dec 9 12:35:20 2019][22695.282202] Lustre: Skipped 5 previous similar messages [Mon Dec 9 12:35:35 2019][22710.880614] Lustre: fir-OST0054: Connection restored to 83281a6e-8cdd-af0c-d930-afb3d26c7eba (at 10.8.23.14@o2ib6) [Mon Dec 9 12:35:35 2019][22710.890963] Lustre: Skipped 5 previous similar messages [Mon Dec 9 12:43:09 2019][23164.274873] Lustre: fir-OST0056: haven't heard from client 50bb3322-2186-2682-e22f-d2e40908bd0d (at 10.8.23.14@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff890d9fd03800, cur 1575924189 expire 1575924039 last 1575923962 [Mon Dec 9 12:43:09 2019][23164.296576] Lustre: Skipped 5 previous similar messages [Mon Dec 9 12:43:32 2019][23187.511794] Lustre: fir-OST0054: Connection restored to 83281a6e-8cdd-af0c-d930-afb3d26c7eba (at 10.8.23.14@o2ib6) [Mon Dec 9 12:43:32 2019][23187.522142] Lustre: Skipped 5 previous similar messages [Mon Dec 9 12:53:36 2019][23791.966216] Lustre: fir-OST0054: Connection restored to 83281a6e-8cdd-af0c-d930-afb3d26c7eba (at 10.8.23.14@o2ib6) [Mon Dec 9 12:53:36 2019][23791.976564] Lustre: Skipped 5 previous similar messages [Mon Dec 9 12:54:01 2019][23816.286759] Lustre: fir-OST0054: haven't heard from client 75aebac8-89c1-69e4-9dfa-1727b2d47fae (at 10.8.23.14@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff890bba3a7c00, cur 1575924841 expire 1575924691 last 1575924614 [Mon Dec 9 12:54:01 2019][23816.308467] Lustre: Skipped 5 previous similar messages [Mon Dec 9 13:09:05 2019][24720.298181] Lustre: fir-OST0058: haven't heard from client 2867fefc-6124-ed47-3fcc-acf48d637860 (at 10.8.18.35@o2ib6) in 227 seconds. 
I think it's dead, and I am evicting it. exp ffff892252ce6000, cur 1575925745 expire 1575925595 last 1575925518 [Mon Dec 9 13:09:05 2019][24720.319902] Lustre: Skipped 5 previous similar messages [Mon Dec 9 16:48:39 2019][37894.763171] perf: interrupt took too long (2503 > 2500), lowering kernel.perf_event_max_sample_rate to 79000 [Mon Dec 9 21:34:55 2019][55071.320836] Lustre: 67791:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956084/real 1575956084] req@ffff88eded431680 x1652452367426464/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956095 ref 1 fl Rpc:X/0/ffffffff rc 0/-1 [Mon Dec 9 21:34:55 2019][55071.320838] Lustre: 67848:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956084/real 1575956084] req@ffff88e821f9c800 x1652452367426448/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956095 ref 1 fl Rpc:X/0/ffffffff rc 0/-1 [Mon Dec 9 21:35:06 2019][55082.321056] Lustre: 67848:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956095/real 1575956095] req@ffff88e821f9c800 x1652452367426448/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956106 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Mon Dec 9 21:35:17 2019][55093.348286] Lustre: 67848:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956106/real 1575956106] req@ffff88e821f9c800 x1652452367426448/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956117 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Mon Dec 9 21:35:17 2019][55093.375649] Lustre: 67848:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 2 previous similar messages [Mon Dec 9 21:35:28 2019][55104.375502] Lustre: 67791:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956117/real 1575956117] req@ffff88eded431680 x1652452367426464/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956128 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Mon Dec 9 21:35:28 2019][55104.402874] Lustre: 67791:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 1 previous similar message [Mon Dec 9 21:35:39 2019][55115.385726] Lustre: 67848:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956128/real 1575956128] req@ffff88e821f9c800 x1652452367426448/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956139 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Mon Dec 9 21:35:39 2019][55115.413060] Lustre: 67848:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 1 previous similar message [Mon Dec 9 21:35:50 2019][55126.412953] Lustre: 67791:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956139/real 1575956139] req@ffff88eded431680 x1652452367426464/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956150 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Mon Dec 9 21:35:50 2019][55126.440294] Lustre: 67791:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 1 previous similar message [Mon Dec 9 21:36:12 2019][55148.424402] Lustre: 67848:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956161/real 1575956161] req@ffff88e821f9c800 x1652452367426448/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956172 ref 1 fl 
Rpc:X/2/ffffffff rc 0/-1 [Mon Dec 9 21:36:12 2019][55148.451748] Lustre: 67848:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 2 previous similar messages [Mon Dec 9 21:36:45 2019][55181.452062] Lustre: 67791:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956194/real 1575956194] req@ffff88eded431680 x1652452367426464/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956205 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Mon Dec 9 21:36:45 2019][55181.479405] Lustre: 67791:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 5 previous similar messages [Mon Dec 9 21:37:06 2019][55201.974017] Lustre: fir-OST0056: haven't heard from client 1316ac10-17f9-20d9-6734-8b32fc11fac2 (at 10.9.106.54@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892251566800, cur 1575956226 expire 1575956076 last 1575955999 [Mon Dec 9 21:37:06 2019][55201.995833] Lustre: Skipped 5 previous similar messages [Mon Dec 9 21:37:06 2019][55202.001208] LustreError: 67848:0:(ldlm_lockd.c:681:ldlm_handle_ast_error()) ### client (nid 10.9.106.54@o2ib4) failed to reply to glimpse AST (req@ffff88e821f9c800 x1652452367426448 status 0 rc -5), evict it ns: filter-fir-OST0054_UUID lock: ffff88fef56bc380/0x7066c9c1891f377c lrc: 3/0,0 mode: PW/PW res: [0x1800000401:0xa16b7b:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 67108864->68719476735) flags: 0x40000080000000 nid: 10.9.106.54@o2ib4 remote: 0xa907bc36138dc384 expref: 7 pid: 67663 timeout: 0 lvb_type: 0 [Mon Dec 9 21:37:06 2019][55202.001216] LustreError: 138-a: fir-OST0054: A client on nid 10.9.106.54@o2ib4 was evicted due to a lock glimpse callback time out: rc -5 [Mon Dec 9 21:37:06 2019][55202.001246] LustreError: 66071:0:(ldlm_lockd.c:256:expired_lock_main()) ### lock callback timer expired after 1575956226s: evicting client at 10.9.106.54@o2ib4 ns: filter-fir-OST0054_UUID lock: ffff88f5e8f93600/0x7066c9c1891f78a5 lrc: 3/0,0 mode: PW/PW res: [0x1800000401:0xb38953:0x0].0x0 rrc: 2 type: EXT [0->18446744073709551615] (req 34359738368->18446744073709551615) flags: 0x40000000000000 nid: 10.9.106.54@o2ib4 remote: 0xa907bc36138dc513 expref: 5 pid: 67900 timeout: 0 lvb_type: 0 [Mon Dec 9 21:37:06 2019][55202.102654] LustreError: 67848:0:(ldlm_lockd.c:681:ldlm_handle_ast_error()) Skipped 1 previous similar message [Mon Dec 9 21:37:16 2019][55211.915449] Lustre: fir-OST005e: haven't heard from client 1316ac10-17f9-20d9-6734-8b32fc11fac2 (at 10.9.106.54@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892250a28800, cur 1575956236 expire 1575956086 last 1575956009 [Mon Dec 9 21:37:16 2019][55211.937252] Lustre: Skipped 3 previous similar messages [Mon Dec 9 21:37:21 2019][55216.926307] Lustre: fir-OST005a: haven't heard from client 1316ac10-17f9-20d9-6734-8b32fc11fac2 (at 10.9.106.54@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff88f2aecda800, cur 1575956241 expire 1575956091 last 1575956014 [Mon Dec 9 21:39:41 2019][55357.690391] Lustre: fir-OST0054: Connection restored to 1316ac10-17f9-20d9-6734-8b32fc11fac2 (at 10.9.106.54@o2ib4) [Mon Dec 9 21:39:41 2019][55357.700832] Lustre: Skipped 5 previous similar messages [Tue Dec 10 02:48:10 2019][73866.289223] Lustre: fir-OST005a: haven't heard from client 0af2ee10-72ea-97a8-65e7-44544fdbc0b9 (at 10.9.108.39@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff88f2aec88800, cur 1575974890 expire 1575974740 last 1575974663 [Tue Dec 10 05:18:04 2019][82860.494528] Lustre: fir-OST0058: Client 67942120-f44f-42ea-60c3-96f62fccea78 (at 10.9.109.39@o2ib4) reconnecting [Tue Dec 10 05:18:04 2019][82860.504721] Lustre: fir-OST0058: Connection restored to 67942120-f44f-42ea-60c3-96f62fccea78 (at 10.9.109.39@o2ib4) [Tue Dec 10 05:18:12 2019][82868.720930] Lustre: fir-OST0056: Client c9911b4c-e55e-f4aa-416a-b652019239f7 (at 10.9.117.40@o2ib4) reconnecting [Tue Dec 10 05:18:12 2019][82868.731127] Lustre: fir-OST0056: Connection restored to c9911b4c-e55e-f4aa-416a-b652019239f7 (at 10.9.117.40@o2ib4) [Tue Dec 10 05:18:16 2019][82872.652414] Lustre: fir-OST0058: Client e72387a4-2bab-d686-07ea-8e45160d2e1d (at 10.9.117.23@o2ib4) reconnecting [Tue Dec 10 05:18:16 2019][82872.662616] Lustre: fir-OST0058: Connection restored to e72387a4-2bab-d686-07ea-8e45160d2e1d (at 10.9.117.23@o2ib4) [Tue Dec 10 05:18:18 2019][82874.824840] Lustre: fir-OST005a: Client d873db05-7c48-65ad-d97d-599447705616 (at 10.9.106.5@o2ib4) reconnecting [Tue Dec 10 05:18:18 2019][82874.834931] Lustre: Skipped 5 previous similar messages [Tue Dec 10 05:18:18 2019][82874.840194] Lustre: fir-OST005a: Connection restored to d873db05-7c48-65ad-d97d-599447705616 (at 10.9.106.5@o2ib4) [Tue Dec 10 05:18:18 2019][82874.850558] Lustre: Skipped 5 previous similar messages [Tue Dec 10 05:18:22 2019][82878.869944] Lustre: fir-OST005a: Client acd3ae51-2d23-93df-d1b3-33ff6a3945ef (at 10.9.114.5@o2ib4) reconnecting [Tue Dec 10 05:18:22 2019][82878.880035] Lustre: Skipped 64 previous similar messages [Tue Dec 10 05:18:22 2019][82878.885392] Lustre: fir-OST005a: Connection restored to acd3ae51-2d23-93df-d1b3-33ff6a3945ef (at 10.9.114.5@o2ib4) [Tue Dec 10 05:18:22 2019][82878.895765] Lustre: Skipped 64 previous similar messages [Tue Dec 10 05:18:25 2019][82881.458358] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.209@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:18:25 2019][82881.471341] LustreError: 67718:0:(ldlm_lib.c:3256:target_bulk_io()) @@@ Reconnect on bulk WRITE req@ffff8922ca374050 x1649309827682656/t0(0) o4->c93954af-761b-f1eb-f651-9881322a7a72@10.9.108.51@o2ib4:698/0 lens 488/448 e 1 to 0 dl 1575983923 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:18:25 2019][82881.496115] Lustre: fir-OST0058: Bulk IO write error with c93954af-761b-f1eb-f651-9881322a7a72 (at 10.9.108.51@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:18:29 2019][82885.471409] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.210@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 05:18:29 2019][82885.484361] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) Skipped 1 previous similar message [Tue Dec 10 05:18:29 2019][82885.495092] LustreError: 67822:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 14680064(16777216) req@ffff88f2fbc1e050 x1652122491273216/t0(0) o4->eb7e3af2-d117-4@10.9.101.1@o2ib4:702/0 lens 488/448 e 1 to 0 dl 1575983927 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:18:29 2019][82885.519616] Lustre: fir-OST0054: Bulk IO write error with eb7e3af2-d117-4 (at 10.9.101.1@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:18:30 2019][82886.919156] Lustre: fir-OST0054: Client 32524583-8f43-9f54-827c-15b3a46fedcc (at 10.8.30.10@o2ib6) reconnecting [Tue Dec 10 05:18:30 2019][82886.919272] Lustre: fir-OST005c: Connection restored to d69fcdf7-730b-8cda-70aa-8ec0410da18f (at 10.8.29.1@o2ib6) [Tue Dec 10 05:18:30 2019][82886.919275] Lustre: Skipped 104 previous similar messages [Tue Dec 10 05:18:30 2019][82886.944900] Lustre: Skipped 105 previous similar messages [Tue Dec 10 05:18:39 2019][82895.494630] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.210@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:18:39 2019][82895.507604] LustreError: 67989:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 9437184(12582912) req@ffff89224e0e5050 x1649309827682656/t0(0) o4->c93954af-761b-f1eb-f651-9881322a7a72@10.9.108.51@o2ib4:710/0 lens 488/448 e 0 to 0 dl 1575983935 ref 1 fl Interpret:/2/0 rc 0/0 [Tue Dec 10 05:18:39 2019][82895.533682] Lustre: fir-OST0058: Bulk IO write error with c93954af-761b-f1eb-f651-9881322a7a72 (at 10.9.108.51@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:18:44 2019][82900.507721] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.211@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:18:44 2019][82900.520690] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) Skipped 2 previous similar messages [Tue Dec 10 05:18:44 2019][82900.531503] LustreError: 67722:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 13631488(16777216) req@ffff8902f6709050 x1652122491471360/t0(0) o4->eb7e3af2-d117-4@10.9.101.1@o2ib4:718/0 lens 488/448 e 1 to 0 dl 1575983943 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:18:44 2019][82900.556080] Lustre: fir-OST0054: Bulk IO write error with eb7e3af2-d117-4 (at 10.9.101.1@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:18:44 2019][82901.259219] LustreError: 137-5: fir-OST0059_UUID: not available for connect from 10.9.116.11@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. 
[Tue Dec 10 05:18:44 2019][82901.276627] LustreError: Skipped 204 previous similar messages [Tue Dec 10 05:18:44 2019][82901.347731] Lustre: 67873:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575983917/real 1575983917] req@ffff8912f703f080 x1652452421351632/t0(0) o105->fir-OST0054@10.9.101.29@o2ib4:15/16 lens 360/224 e 0 to 1 dl 1575983924 ref 1 fl Rpc:X/0/ffffffff rc 0/-1 [Tue Dec 10 05:18:44 2019][82901.375070] Lustre: 67873:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 2 previous similar messages [Tue Dec 10 05:18:46 2019][82902.982452] Lustre: fir-OST005e: Client 91d6d3f9-54bc-fd90-b16e-873e3af76326 (at 10.9.106.44@o2ib4) reconnecting [Tue Dec 10 05:18:46 2019][82902.983396] Lustre: fir-OST0054: Connection restored to 91d6d3f9-54bc-fd90-b16e-873e3af76326 (at 10.9.106.44@o2ib4) [Tue Dec 10 05:18:46 2019][82902.983398] Lustre: Skipped 187 previous similar messages [Tue Dec 10 05:18:46 2019][82903.008483] Lustre: Skipped 187 previous similar messages [Tue Dec 10 05:18:50 2019][82906.576837] LustreError: 67929:0:(ldlm_lib.c:3256:target_bulk_io()) @@@ Reconnect on bulk WRITE req@ffff8907c0ee1850 x1649049588956192/t0(0) o4->20463417-fb32-2f92-5aae-59bfa8e287e3@10.9.101.29@o2ib4:725/0 lens 488/448 e 1 to 0 dl 1575983950 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:18:50 2019][82906.601645] Lustre: fir-OST0054: Bulk IO write error with 20463417-fb32-2f92-5aae-59bfa8e287e3 (at 10.9.101.29@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:18:52 2019][82908.822798] LustreError: 137-5: fir-OST0059_UUID: not available for connect from 10.8.31.2@o2ib6 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 05:18:52 2019][82908.839992] LustreError: Skipped 1 previous similar message [Tue Dec 10 05:18:54 2019][82910.607090] LustreError: 137-5: fir-OST0059_UUID: not available for connect from 10.9.101.28@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 05:18:54 2019][82910.624484] LustreError: Skipped 2 previous similar messages [Tue Dec 10 05:18:59 2019][82915.531020] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.211@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:18:59 2019][82915.531026] LustreError: 67718:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 0(205619) req@ffff89224e0e3850 x1652123335105408/t0(0) o4->4cec062a-e1ff-4@10.9.101.3@o2ib4:732/0 lens 488/448 e 1 to 0 dl 1575983957 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:18:59 2019][82915.531049] Lustre: fir-OST0054: Bulk IO write error with 4cec062a-e1ff-4 (at 10.9.101.3@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:18:59 2019][82915.578309] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) Skipped 1 previous similar message [Tue Dec 10 05:19:00 2019][82916.566038] LustreError: 67989:0:(ldlm_lib.c:3256:target_bulk_io()) @@@ Reconnect on bulk WRITE req@ffff892251cea050 x1650930158014448/t0(0) o4->fe46e801-2d86-9439-0b24-b78514ed5486@10.9.109.8@o2ib4:739/0 lens 488/448 e 1 to 0 dl 1575983964 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:19:04 2019][82921.314277] LustreError: 137-5: fir-OST005f_UUID: not available for connect from 10.8.17.19@o2ib6 (no target). If you are running an HA pair check that the target is mounted on the other server. 
[Tue Dec 10 05:19:04 2019][82921.331565] LustreError: Skipped 2 previous similar messages [Tue Dec 10 05:19:09 2019][82926.121728] LustreError: 137-5: fir-OST0059_UUID: not available for connect from 10.9.105.11@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 05:19:09 2019][82926.139115] LustreError: Skipped 11 previous similar messages [Tue Dec 10 05:19:17 2019][82933.838387] LustreError: 67722:0:(ldlm_lib.c:3256:target_bulk_io()) @@@ Reconnect on bulk WRITE req@ffff88f3a9515050 x1649292592204048/t0(0) o4->d269b7b3-c7ee-1895-0bbf-8293c505cff2@10.9.110.44@o2ib4:1/0 lens 488/448 e 1 to 0 dl 1575983981 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:19:17 2019][82933.862916] Lustre: fir-OST0058: Bulk IO write error with d269b7b3-c7ee-1895-0bbf-8293c505cff2 (at 10.9.110.44@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:19:17 2019][82933.876140] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:19:18 2019][82934.748407] LustreError: 137-5: fir-OST0055_UUID: not available for connect from 10.9.107.48@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 05:19:18 2019][82934.765772] LustreError: Skipped 14 previous similar messages [Tue Dec 10 05:19:18 2019][82934.985866] Lustre: fir-OST005c: Client f5acbb80-2671-675b-21f5-81352b190567 (at 10.9.110.49@o2ib4) reconnecting [Tue Dec 10 05:19:18 2019][82934.996050] Lustre: Skipped 1086 previous similar messages [Tue Dec 10 05:19:18 2019][82935.001544] Lustre: fir-OST005c: Connection restored to (at 10.9.110.49@o2ib4) [Tue Dec 10 05:19:18 2019][82935.008881] Lustre: Skipped 1086 previous similar messages [Tue Dec 10 05:19:34 2019][82950.477728] Lustre: 89234:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575983967/real 1575983967] req@ffff88fddaa5d580 x1652452421353168/t0(0) o105->fir-OST005a@10.9.110.16@o2ib4:15/16 lens 360/224 e 0 to 1 dl 1575983974 ref 1 fl Rpc:X/0/ffffffff rc 0/-1 [Tue Dec 10 05:19:35 2019][82952.145857] LustreError: 137-5: fir-OST0057_UUID: not available for connect from 10.8.23.19@o2ib6 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 05:19:35 2019][82952.163151] LustreError: Skipped 40 previous similar messages [Tue Dec 10 05:19:39 2019][82955.589834] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.211@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:19:39 2019][82955.602809] LustreError: 67980:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 7864320(16252928) req@ffff88f2b024d850 x1649292592204048/t0(0) o4->d269b7b3-c7ee-1895-0bbf-8293c505cff2@10.9.110.44@o2ib4:15/0 lens 488/448 e 0 to 0 dl 1575983995 ref 1 fl Interpret:/2/0 rc 0/0 [Tue Dec 10 05:20:08 2019][82984.912818] LustreError: 137-5: fir-OST0055_UUID: not available for connect from 10.8.27.15@o2ib6 (no target). If you are running an HA pair check that the target is mounted on the other server. 
[Tue Dec 10 05:20:08 2019][82984.930100] LustreError: Skipped 77 previous similar messages [Tue Dec 10 05:20:22 2019][82999.030286] Lustre: fir-OST005c: Client 9622ebd9-08dd-84f5-187b-b07758b1dd55 (at 10.9.103.48@o2ib4) reconnecting [Tue Dec 10 05:20:22 2019][82999.030363] Lustre: fir-OST005a: Connection restored to 7fc3ef05-0495-25a3-7cdb-c6f981dcc2b9 (at 10.9.102.68@o2ib4) [Tue Dec 10 05:20:22 2019][82999.030365] Lustre: Skipped 1252 previous similar messages [Tue Dec 10 05:20:22 2019][82999.056380] Lustre: Skipped 1255 previous similar messages [Tue Dec 10 05:20:38 2019][83014.761030] Lustre: 67693:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575984031/real 1575984031] req@ffff8922f461d100 x1652452421355920/t0(0) o104->fir-OST0054@10.9.108.37@o2ib4:15/16 lens 296/224 e 0 to 1 dl 1575984038 ref 1 fl Rpc:X/0/ffffffff rc 0/-1 [Tue Dec 10 05:20:41 2019][83017.628089] LustreError: 68013:0:(ldlm_lib.c:3256:target_bulk_io()) @@@ Reconnect on bulk WRITE req@ffff88f2ab53a850 x1649049588956192/t0(0) o4->20463417-fb32-2f92-5aae-59bfa8e287e3@10.9.101.29@o2ib4:91/0 lens 488/448 e 0 to 0 dl 1575984071 ref 1 fl Interpret:/2/0 rc 0/0 [Tue Dec 10 05:20:41 2019][83017.652133] LustreError: 68013:0:(ldlm_lib.c:3256:target_bulk_io()) Skipped 1 previous similar message [Tue Dec 10 05:20:41 2019][83017.661969] Lustre: fir-OST0054: Bulk IO write error with 20463417-fb32-2f92-5aae-59bfa8e287e3 (at 10.9.101.29@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:20:41 2019][83017.675188] Lustre: Skipped 2 previous similar messages [Tue Dec 10 05:20:52 2019][83028.770318] LustreError: 67718:0:(ldlm_lib.c:3256:target_bulk_io()) @@@ Reconnect on bulk WRITE req@ffff89224f960050 x1649292592204048/t0(0) o4->d269b7b3-c7ee-1895-0bbf-8293c505cff2@10.9.110.44@o2ib4:102/0 lens 488/448 e 0 to 0 dl 1575984082 ref 1 fl Interpret:/2/0 rc 0/0 [Tue Dec 10 05:21:12 2019][83048.937549] LustreError: 137-5: fir-OST005d_UUID: not available for connect from 10.9.104.8@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 05:21:12 2019][83048.954829] LustreError: Skipped 6990 previous similar messages [Tue Dec 10 05:21:31 2019][83067.484865] Lustre: fir-OST005a: haven't heard from client 2bacacc9-821b-1013-eb06-dd3bdbe6bf12 (at 10.9.104.8@o2ib4) in 162 seconds. I think it's dead, and I am evicting it. exp ffff8902cd64e400, cur 1575984091 expire 1575983941 last 1575983929 [Tue Dec 10 05:21:31 2019][83067.506600] Lustre: Skipped 5 previous similar messages [Tue Dec 10 05:21:32 2019][83068.476206] Lustre: fir-OST005e: haven't heard from client 2bacacc9-821b-1013-eb06-dd3bdbe6bf12 (at 10.9.104.8@o2ib4) in 163 seconds. I think it's dead, and I am evicting it. exp ffff8922509fb000, cur 1575984092 expire 1575983942 last 1575983929 [Tue Dec 10 05:25:20 2019][83296.692149] LustreError: 137-5: fir-OST0057_UUID: not available for connect from 10.9.109.11@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. 
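[Editorial note] The "haven't heard from client ... I think it's dead, and I am evicting it" notices in this log carry three epoch timestamps (cur, expire, last). In the records above the reported "in N seconds" figure equals cur - last, and cur - expire is consistently 150 seconds. A small sketch that extracts those values and recomputes the intervals, assuming the console log is saved with one record per line (filename illustrative):

#!/usr/bin/env python3
"""Recompute the idle intervals reported in Lustre eviction notices.

Illustrative sketch: for each "haven't heard from client ... cur C
expire E last L" record in a saved console log, print cur-last (the
"in N seconds" figure) and cur-expire (how far past the deadline the
eviction fired).
"""
import re
import sys

EVICT_RE = re.compile(
    r"haven't heard from client (\S+) \(at (\S+)\) in (\d+) seconds\..*?"
    r"cur (\d+) expire (\d+) last (\d+)"
)

def main(path):
    with open(path) as fh:
        text = fh.read()
    for uuid, nid, secs, cur, expire, last in EVICT_RE.findall(text):
        cur, expire, last = int(cur), int(expire), int(last)
        print(f"{nid}: reported {secs}s idle, cur-last={cur - last}s, "
              f"cur-expire={cur - expire}s, expire-last={expire - last}s")

if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "console.log")

For example, the 10.9.104.8@o2ib4 notice above gives cur 1575984091, expire 1575983941, last 1575983929, so cur-last is the reported 162 seconds and cur-expire is 150.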
[Tue Dec 10 05:25:20 2019][83296.709516] LustreError: Skipped 111 previous similar messages [Tue Dec 10 05:25:34 2019][83311.284314] Lustre: fir-OST0058: Client d758ce23-488a-e6d5-8c6f-41cbf6d78ec4 (at 10.9.105.21@o2ib4) reconnecting [Tue Dec 10 05:25:34 2019][83311.293300] Lustre: fir-OST005e: Connection restored to d758ce23-488a-e6d5-8c6f-41cbf6d78ec4 (at 10.9.105.21@o2ib4) [Tue Dec 10 05:25:34 2019][83311.293302] Lustre: Skipped 11935 previous similar messages [Tue Dec 10 05:25:34 2019][83311.310497] Lustre: Skipped 11936 previous similar messages [Tue Dec 10 05:25:49 2019][83325.611334] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.211@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:25:49 2019][83325.624301] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) Skipped 2 previous similar messages [Tue Dec 10 05:25:49 2019][83325.635126] LustreError: 68003:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 12582912(16777216) req@ffff89025a28b850 x1652122504801216/t0(0) o4->eb7e3af2-d117-4@10.9.101.1@o2ib4:398/0 lens 488/448 e 0 to 0 dl 1575984378 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:25:49 2019][83325.659776] Lustre: fir-OST0054: Bulk IO write error with eb7e3af2-d117-4 (at 10.9.101.1@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:25:49 2019][83325.671083] Lustre: Skipped 3 previous similar messages [Tue Dec 10 05:26:22 2019][83358.938007] LustreError: 67808:0:(ldlm_lib.c:3256:target_bulk_io()) @@@ Reconnect on bulk WRITE req@ffff8922ce6f2850 x1648661001299536/t0(0) o4->226a4739-0dcd-665f-3f64-f361283e71b8@10.9.105.16@o2ib4:433/0 lens 488/448 e 0 to 0 dl 1575984413 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:26:22 2019][83358.962179] LustreError: 67808:0:(ldlm_lib.c:3256:target_bulk_io()) Skipped 2 previous similar messages [Tue Dec 10 05:26:22 2019][83359.223675] md: md3 stopped. [Tue Dec 10 05:26:22 2019][83359.236553] md/raid:md3: device dm-108 operational as raid disk 0 [Tue Dec 10 05:26:22 2019][83359.242672] md/raid:md3: device dm-85 operational as raid disk 9 [Tue Dec 10 05:26:22 2019][83359.248686] md/raid:md3: device dm-97 operational as raid disk 8 [Tue Dec 10 05:26:22 2019][83359.254715] md/raid:md3: device dm-82 operational as raid disk 7 [Tue Dec 10 05:26:22 2019][83359.260729] md/raid:md3: device dm-98 operational as raid disk 6 [Tue Dec 10 05:26:22 2019][83359.266737] md/raid:md3: device dm-72 operational as raid disk 5 [Tue Dec 10 05:26:22 2019][83359.272751] md/raid:md3: device dm-81 operational as raid disk 4 [Tue Dec 10 05:26:22 2019][83359.278766] md/raid:md3: device dm-61 operational as raid disk 3 [Tue Dec 10 05:26:22 2019][83359.284780] md/raid:md3: device dm-103 operational as raid disk 2 [Tue Dec 10 05:26:22 2019][83359.290873] md/raid:md3: device dm-109 operational as raid disk 1 [Tue Dec 10 05:26:22 2019][83359.298897] md/raid:md3: raid level 6 active with 10 out of 10 devices, algorithm 2 [Tue Dec 10 05:26:22 2019][83359.337097] md3: detected capacity change from 0 to 64011422924800 [Tue Dec 10 05:26:22 2019][83359.367126] md: md1 stopped. 
[Tue Dec 10 05:26:22 2019][83359.377897] md/raid:md1: device dm-63 operational as raid disk 0 [Tue Dec 10 05:26:22 2019][83359.383913] md/raid:md1: device dm-102 operational as raid disk 9 [Tue Dec 10 05:26:22 2019][83359.390016] md/raid:md1: device dm-113 operational as raid disk 8 [Tue Dec 10 05:26:22 2019][83359.396120] md/raid:md1: device dm-96 operational as raid disk 7 [Tue Dec 10 05:26:22 2019][83359.402134] md/raid:md1: device dm-92 operational as raid disk 6 [Tue Dec 10 05:26:22 2019][83359.408144] md/raid:md1: device dm-67 operational as raid disk 5 [Tue Dec 10 05:26:22 2019][83359.414153] md/raid:md1: device dm-71 operational as raid disk 4 [Tue Dec 10 05:26:22 2019][83359.420166] md/raid:md1: device dm-112 operational as raid disk 3 [Tue Dec 10 05:26:22 2019][83359.426268] md/raid:md1: device dm-115 operational as raid disk 2 [Tue Dec 10 05:26:22 2019][83359.432370] md/raid:md1: device dm-118 operational as raid disk 1 [Tue Dec 10 05:26:22 2019][83359.439214] md/raid:md1: raid level 6 active with 10 out of 10 devices, algorithm 2 [Tue Dec 10 05:26:22 2019][83359.460496] md1: detected capacity change from 0 to 64011422924800 [Tue Dec 10 05:26:23 2019][83359.491278] md: md11 stopped. [Tue Dec 10 05:26:23 2019][83359.504426] md/raid:md11: device dm-51 operational as raid disk 0 [Tue Dec 10 05:26:23 2019][83359.510538] md/raid:md11: device dm-47 operational as raid disk 9 [Tue Dec 10 05:26:23 2019][83359.516646] md/raid:md11: device dm-50 operational as raid disk 8 [Tue Dec 10 05:26:23 2019][83359.522757] md/raid:md11: device dm-49 operational as raid disk 7 [Tue Dec 10 05:26:23 2019][83359.528866] md/raid:md11: device dm-20 operational as raid disk 6 [Tue Dec 10 05:26:23 2019][83359.534979] md/raid:md11: device dm-31 operational as raid disk 5 [Tue Dec 10 05:26:23 2019][83359.541088] md/raid:md11: device dm-32 operational as raid disk 4 [Tue Dec 10 05:26:23 2019][83359.547195] md/raid:md11: device dm-22 operational as raid disk 3 [Tue Dec 10 05:26:23 2019][83359.553301] md/raid:md11: device dm-11 operational as raid disk 2 [Tue Dec 10 05:26:23 2019][83359.559404] md/raid:md11: device dm-9 operational as raid disk 1 [Tue Dec 10 05:26:23 2019][83359.569083] md/raid:md11: raid level 6 active with 10 out of 10 devices, algorithm 2 [Tue Dec 10 05:26:23 2019][83359.591851] md11: detected capacity change from 0 to 64011422924800 [Tue Dec 10 05:26:23 2019][83359.619668] md: md7 stopped. 
[Tue Dec 10 05:26:23 2019][83359.638795] md/raid:md7: device dm-0 operational as raid disk 0 [Tue Dec 10 05:26:23 2019][83359.644738] md/raid:md7: device dm-55 operational as raid disk 9 [Tue Dec 10 05:26:23 2019][83359.650792] md/raid:md7: device dm-14 operational as raid disk 8 [Tue Dec 10 05:26:23 2019][83359.656817] md/raid:md7: device dm-13 operational as raid disk 7 [Tue Dec 10 05:26:23 2019][83359.662855] md/raid:md7: device dm-41 operational as raid disk 6 [Tue Dec 10 05:26:23 2019][83359.668899] md/raid:md7: device dm-29 operational as raid disk 5 [Tue Dec 10 05:26:23 2019][83359.674964] md/raid:md7: device dm-24 operational as raid disk 4 [Tue Dec 10 05:26:23 2019][83359.681029] md/raid:md7: device dm-35 operational as raid disk 3 [Tue Dec 10 05:26:23 2019][83359.687052] md/raid:md7: device dm-52 operational as raid disk 2 [Tue Dec 10 05:26:23 2019][83359.693078] md/raid:md7: device dm-30 operational as raid disk 1 [Tue Dec 10 05:26:23 2019][83359.700039] md/raid:md7: raid level 6 active with 10 out of 10 devices, algorithm 2 [Tue Dec 10 05:26:23 2019][83359.721696] md7: detected capacity change from 0 to 64011422924800 [Tue Dec 10 05:26:23 2019][83359.747849] md: md5 stopped. [Tue Dec 10 05:26:23 2019][83359.767611] md/raid:md5: device dm-110 operational as raid disk 0 [Tue Dec 10 05:26:23 2019][83359.773734] md/raid:md5: device dm-93 operational as raid disk 9 [Tue Dec 10 05:26:23 2019][83359.779766] md/raid:md5: device dm-111 operational as raid disk 8 [Tue Dec 10 05:26:23 2019][83359.785881] md/raid:md5: device dm-87 operational as raid disk 7 [Tue Dec 10 05:26:23 2019][83359.791911] md/raid:md5: device dm-90 operational as raid disk 6 [Tue Dec 10 05:26:23 2019][83359.797933] md/raid:md5: device dm-75 operational as raid disk 5 [Tue Dec 10 05:26:23 2019][83359.803959] md/raid:md5: device dm-78 operational as raid disk 4 [Tue Dec 10 05:26:23 2019][83359.809981] md/raid:md5: device dm-70 operational as raid disk 3 [Tue Dec 10 05:26:23 2019][83359.816010] md/raid:md5: device dm-62 operational as raid disk 2 [Tue Dec 10 05:26:23 2019][83359.822035] md/raid:md5: device dm-68 operational as raid disk 1 [Tue Dec 10 05:26:23 2019][83359.829120] md/raid:md5: raid level 6 active with 10 out of 10 devices, algorithm 2 [Tue Dec 10 05:26:23 2019][83359.863208] md5: detected capacity change from 0 to 64011422924800 [Tue Dec 10 05:26:23 2019][83359.900980] md: md9 stopped. 
[Tue Dec 10 05:26:23 2019][83359.924250] md/raid:md9: device dm-59 operational as raid disk 0 [Tue Dec 10 05:26:23 2019][83359.930270] md/raid:md9: device dm-19 operational as raid disk 9 [Tue Dec 10 05:26:23 2019][83359.936303] md/raid:md9: device dm-48 operational as raid disk 8 [Tue Dec 10 05:26:23 2019][83359.942419] md/raid:md9: device dm-39 operational as raid disk 7 [Tue Dec 10 05:26:23 2019][83359.948471] md/raid:md9: device dm-37 operational as raid disk 6 [Tue Dec 10 05:26:23 2019][83359.954508] md/raid:md9: device dm-10 operational as raid disk 5 [Tue Dec 10 05:26:23 2019][83359.960562] md/raid:md9: device dm-38 operational as raid disk 4 [Tue Dec 10 05:26:23 2019][83359.966609] md/raid:md9: device dm-2 operational as raid disk 3 [Tue Dec 10 05:26:23 2019][83359.972651] md/raid:md9: device dm-12 operational as raid disk 2 [Tue Dec 10 05:26:23 2019][83359.978740] md/raid:md9: device dm-36 operational as raid disk 1 [Tue Dec 10 05:26:23 2019][83359.988671] md/raid:md9: raid level 6 active with 10 out of 10 devices, algorithm 2 [Tue Dec 10 05:26:23 2019][83360.017458] md9: detected capacity change from 0 to 64011422924800 [Tue Dec 10 05:26:23 2019][83360.466294] LDISKFS-fs (md3): file extents enabled, maximum tree depth=5 [Tue Dec 10 05:26:24 2019][83360.515278] LDISKFS-fs (md1): file extents enabled, maximum tree depth=5 [Tue Dec 10 05:26:24 2019][83360.800303] LDISKFS-fs (md11): file extents enabled, maximum tree depth=5 [Tue Dec 10 05:26:24 2019][83360.805067] LDISKFS-fs (md3): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Tue Dec 10 05:26:24 2019][83360.861175] LDISKFS-fs (md1): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Tue Dec 10 05:26:24 2019][83361.154663] LDISKFS-fs (md11): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Tue Dec 10 05:26:24 2019][83361.224318] LDISKFS-fs (md7): file extents enabled, maximum tree depth=5 [Tue Dec 10 05:26:25 2019][83361.456315] LDISKFS-fs (md5): file extents enabled, maximum tree depth=5 [Tue Dec 10 05:26:25 2019][83361.481350] Lustre: fir-OST0057: Not available for connect from 10.8.30.26@o2ib6 (not set up) [Tue Dec 10 05:26:25 2019][83361.489945] Lustre: Skipped 29 previous similar messages [Tue Dec 10 05:26:25 2019][83361.525126] LDISKFS-fs (md9): file extents enabled, maximum tree depth=5 [Tue Dec 10 05:26:25 2019][83361.602428] Lustre: fir-OST0057: Imperative Recovery enabled, recovery window shrunk from 300-900 down to 150-900 [Tue Dec 10 05:26:25 2019][83361.613606] Lustre: fir-OST0057: in recovery but waiting for the first client to connect [Tue Dec 10 05:26:25 2019][83361.637346] LDISKFS-fs (md7): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Tue Dec 10 05:26:25 2019][83361.653835] Lustre: fir-OST0057: Will be in recovery for at least 2:30, or until 1291 clients reconnect [Tue Dec 10 05:26:25 2019][83361.815987] LDISKFS-fs (md5): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Tue Dec 10 05:26:25 2019][83362.152906] LDISKFS-fs (md9): mounted filesystem with ordered data mode. 
Opts: errors=remount-ro,no_mbcache,nodelalloc [Tue Dec 10 05:26:25 2019][83362.162098] Lustre: fir-OST005f: Not available for connect from 10.9.108.14@o2ib4 (not set up) [Tue Dec 10 05:26:25 2019][83362.162101] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:26:25 2019][83362.282047] Lustre: fir-OST005f: Imperative Recovery enabled, recovery window shrunk from 300-900 down to 150-900 [Tue Dec 10 05:26:25 2019][83362.292310] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:26:25 2019][83362.298236] Lustre: fir-OST005f: in recovery but waiting for the first client to connect [Tue Dec 10 05:26:25 2019][83362.306355] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:26:25 2019][83362.321596] Lustre: fir-OST005f: Will be in recovery for at least 2:30, or until 1291 clients reconnect [Tue Dec 10 05:26:25 2019][83362.330995] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:26:26 2019][83363.239466] Lustre: fir-OST005d: Not available for connect from 10.8.20.17@o2ib6 (not set up) [Tue Dec 10 05:26:26 2019][83363.247993] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:26:26 2019][83363.306303] Lustre: fir-OST005d: Imperative Recovery enabled, recovery window shrunk from 300-900 down to 150-900 [Tue Dec 10 05:26:26 2019][83363.316565] Lustre: Skipped 2 previous similar messages [Tue Dec 10 05:26:26 2019][83363.322667] Lustre: fir-OST005d: in recovery but waiting for the first client to connect [Tue Dec 10 05:26:26 2019][83363.330787] Lustre: Skipped 2 previous similar messages [Tue Dec 10 05:26:26 2019][83363.395627] Lustre: fir-OST005d: Will be in recovery for at least 2:30, or until 1290 clients reconnect [Tue Dec 10 05:26:26 2019][83363.405029] Lustre: Skipped 2 previous similar messages [Tue Dec 10 05:26:33 2019][83369.610785] Lustre: fir-OST0055: Client fc841094-f1fd-2756-1968-f74105b220e6 (at 10.8.8.30@o2ib6) reconnected, waiting for 1291 clients in recovery for 2:22 [Tue Dec 10 05:26:34 2019][83370.635246] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.211@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 05:26:34 2019][83370.635254] LustreError: 68000:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 2097152(2445312) req@ffff88f2fdf04050 x1648661001299536/t0(0) o4->226a4739-0dcd-665f-3f64-f361283e71b8@10.9.105.16@o2ib4:444/0 lens 488/448 e 0 to 0 dl 1575984424 ref 1 fl Interpret:/2/0 rc 0/0 [Tue Dec 10 05:26:34 2019][83370.673701] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) Skipped 2 previous similar messages [Tue Dec 10 05:26:34 2019][83370.813831] Lustre: fir-OST005d: Client df232092-858e-a632-396d-0cfff0b9daea (at 10.9.110.47@o2ib4) reconnected, waiting for 1290 clients in recovery for 2:22 [Tue Dec 10 05:26:34 2019][83370.828004] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:26:35 2019][83372.045287] Lustre: fir-OST005d: Client 3c222422-1505-df45-a734-88e013dbd97d (at 10.9.102.41@o2ib4) reconnected, waiting for 1290 clients in recovery for 2:21 [Tue Dec 10 05:26:35 2019][83372.059447] Lustre: Skipped 3 previous similar messages [Tue Dec 10 05:26:39 2019][83375.472761] Lustre: fir-OST005d: Client 0cf25d46-002a-85b1-4e67-848f0710e2b1 (at 10.9.109.64@o2ib4) reconnected, waiting for 1290 clients in recovery for 2:17 [Tue Dec 10 05:26:39 2019][83375.486940] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:26:43 2019][83379.572113] Lustre: fir-OST005d: Client 1c30f97b-7d47-8c9d-c1e8-e8bf522ea702 (at 10.8.24.11@o2ib6) reconnected, waiting for 1290 clients in recovery for 2:13 [Tue Dec 10 05:26:43 2019][83379.586191] Lustre: Skipped 13 previous similar messages [Tue Dec 10 05:26:44 2019][83380.684456] LustreError: 67723:0:(ldlm_lib.c:3271:target_bulk_io()) @@@ truncated bulk READ 3145728(4194304) req@ffff88f2fbc53050 x1650576591366400/t0(0) o3->5499f23a-1ea6-ba5d-b45d-cc3f43f05d7e@10.9.109.20@o2ib4:443/0 lens 488/440 e 1 to 0 dl 1575984423 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:26:44 2019][83380.684688] Lustre: fir-OST0056: Bulk IO read error with 5499f23a-1ea6-ba5d-b45d-cc3f43f05d7e (at 10.9.109.20@o2ib4), client will retry: rc -110 [Tue Dec 10 05:26:44 2019][83380.722686] LustreError: 67723:0:(ldlm_lib.c:3271:target_bulk_io()) Skipped 1 previous similar message [Tue Dec 10 05:26:51 2019][83387.587269] Lustre: fir-OST005b: Client 7077f577-10fa-a102-c9d8-a4ca3b92f52f (at 10.9.110.25@o2ib4) reconnected, waiting for 1291 clients in recovery for 2:05 [Tue Dec 10 05:26:51 2019][83387.601436] Lustre: Skipped 56 previous similar messages [Tue Dec 10 05:27:07 2019][83403.620422] Lustre: fir-OST0057: Client 9c41e276-bb54-ccfd-4d34-4092e6989764 (at 10.9.103.70@o2ib4) reconnected, waiting for 1291 clients in recovery for 1:48 [Tue Dec 10 05:27:07 2019][83403.634588] Lustre: Skipped 143 previous similar messages [Tue Dec 10 05:27:38 2019][83434.930767] Lustre: fir-OST0055: Client 5f11dd29-1211-44a2-2612-f8309cf085b3 (at 10.8.21.18@o2ib6) refused connection, still busy with 6 references [Tue Dec 10 05:27:39 2019][83436.163818] Lustre: fir-OST005d: Client c104d961-ddd0-a5eb-3382-4ecbd88b591c (at 10.8.18.16@o2ib6) reconnected, waiting for 1290 clients in recovery for 1:17 [Tue Dec 10 05:27:39 2019][83436.177946] Lustre: Skipped 143 previous similar messages [Tue Dec 10 05:27:40 2019][83436.930456] Lustre: fir-OST005b: Client a507eb44-8ff1-13e2-fab8-30d1823663f8 (at 10.8.22.24@o2ib6) refused connection, still busy with 6 references [Tue Dec 10 05:27:44 2019][83440.685672] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.211@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 05:27:44 2019][83440.685678] LustreError: 67978:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 0(871544) req@ffff89224fd74050 x1652166659120640/t0(0) o4->da9f6e55-12b4-4@10.9.112.5@o2ib4:517/0 lens 488/448 e 0 to 0 dl 1575984497 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:27:44 2019][83440.685681] LustreError: 67978:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) Skipped 5 previous similar messages [Tue Dec 10 05:27:44 2019][83440.731356] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) Skipped 13 previous similar messages [Tue Dec 10 05:27:47 2019][83444.232488] Lustre: fir-OST0055: Client 3db7ac8a-faba-9fd6-d84d-1b8e92435cfb (at 10.8.26.18@o2ib6) refused connection, still busy with 6 references [Tue Dec 10 05:27:47 2019][83444.245695] Lustre: Skipped 2 previous similar messages [Tue Dec 10 05:27:50 2019][83446.930688] Lustre: fir-OST0057: Client d30d2da1-5a39-6a53-6def-eb7c150e8cb6 (at 10.8.31.1@o2ib6) refused connection, still busy with 6 references [Tue Dec 10 05:27:50 2019][83446.943815] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:27:57 2019][83454.163673] Lustre: fir-OST005d: Client 72b66a84-eb6d-8862-b24a-97d6ffec93b7 (at 10.8.24.22@o2ib6) refused connection, still busy with 6 references [Tue Dec 10 05:27:57 2019][83454.176886] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:28:07 2019][83464.155643] Lustre: fir-OST005d: Client ca09bd61-a4b3-111c-b997-9c7823236764 (at 10.8.22.17@o2ib6) refused connection, still busy with 6 references [Tue Dec 10 05:28:07 2019][83464.168857] Lustre: Skipped 158 previous similar messages [Tue Dec 10 05:28:23 2019][83480.159355] Lustre: fir-OST0059: Client 5028c448-3432-783a-f116-6a44a16b46a7 (at 10.8.29.8@o2ib6) refused connection, still busy with 6 references [Tue Dec 10 05:28:23 2019][83480.172483] Lustre: Skipped 35 previous similar messages [Tue Dec 10 05:28:43 2019][83500.442869] Lustre: fir-OST005d: Client b34be8aa-32d9-4 (at 10.9.113.13@o2ib4) reconnected, waiting for 1290 clients in recovery for 0:12 [Tue Dec 10 05:28:43 2019][83500.455221] Lustre: Skipped 1934 previous similar messages [Tue Dec 10 05:28:54 2019][83511.176440] Lustre: fir-OST0057: Recovery already passed deadline 0:00. If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST0057 abort_recovery. [Tue Dec 10 05:28:55 2019][83511.656879] Lustre: fir-OST0057: recovery is timed out, evict stale exports [Tue Dec 10 05:28:55 2019][83511.664127] Lustre: fir-OST0057: disconnecting 10 stale clients [Tue Dec 10 05:28:55 2019][83511.806156] Lustre: fir-OST0057: Recovery over after 2:30, of 1291 clients 1281 recovered and 10 were evicted. [Tue Dec 10 05:28:55 2019][83511.816161] Lustre: Skipped 5 previous similar messages [Tue Dec 10 05:28:55 2019][83511.860836] Lustre: fir-OST0057: deleting orphan objects from 0x18c0000401:11791589 to 0x18c0000401:11792961 [Tue Dec 10 05:28:55 2019][83511.896502] Lustre: fir-OST0057: deleting orphan objects from 0x18c0000400:3036937 to 0x18c0000400:3037633 [Tue Dec 10 05:28:55 2019][83511.932613] Lustre: fir-OST0057: deleting orphan objects from 0x0:27483952 to 0x0:27483969 [Tue Dec 10 05:28:55 2019][83512.043223] Lustre: fir-OST0057: deleting orphan objects from 0x18c0000402:922876 to 0x18c0000402:922913 [Tue Dec 10 05:28:55 2019][83512.155024] Lustre: fir-OST005b: Recovery already passed deadline 0:00. If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST005b abort_recovery. 
[Tue Dec 10 05:28:55 2019][83512.171098] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:28:55 2019][83512.324646] Lustre: fir-OST005f: recovery is timed out, evict stale exports [Tue Dec 10 05:28:55 2019][83512.331617] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:28:55 2019][83512.336947] Lustre: fir-OST005f: disconnecting 2 stale clients [Tue Dec 10 05:28:55 2019][83512.342815] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:28:56 2019][83513.398676] Lustre: fir-OST005d: recovery is timed out, evict stale exports [Tue Dec 10 05:28:56 2019][83513.405644] Lustre: Skipped 2 previous similar messages [Tue Dec 10 05:28:56 2019][83513.411046] Lustre: fir-OST005d: disconnecting 1 stale clients [Tue Dec 10 05:28:56 2019][83513.416915] Lustre: Skipped 2 previous similar messages [Tue Dec 10 05:28:59 2019][83516.311131] Lustre: fir-OST005b: Denying connection for new client 3a18a690-f6fb-7d4d-c179-697da5c59619 (at 10.9.116.10@o2ib4), waiting for 1291 known clients (1188 recovered, 100 in progress, and 3 evicted) to recover in 0:56 [Tue Dec 10 05:29:01 2019][83518.150125] Lustre: fir-OST005f: Denying connection for new client 3c020cd0-089d-acb1-e879-86429192cebf (at 10.8.27.2@o2ib6), waiting for 1291 known clients (1193 recovered, 96 in progress, and 2 evicted) to recover in 0:53 [Tue Dec 10 05:29:07 2019][83524.142848] Lustre: fir-OST0059: Denying connection for new client 5a3d40f3-7440-8bab-3ed3-c953b35f5db5 (at 10.9.104.11@o2ib4), waiting for 1291 known clients (1199 recovered, 91 in progress, and 1 evicted) to recover in 0:48 [Tue Dec 10 05:29:09 2019][83526.157270] Lustre: fir-OST005f: Recovery over after 2:44, of 1291 clients 1289 recovered and 2 were evicted. [Tue Dec 10 05:29:09 2019][83526.227237] Lustre: fir-OST005f: deleting orphan objects from 0x1ac0000400:3045562 to 0x1ac0000400:3045857 [Tue Dec 10 05:29:09 2019][83526.463633] Lustre: fir-OST005f: deleting orphan objects from 0x0:27483501 to 0x0:27483521 [Tue Dec 10 05:29:10 2019][83526.484622] Lustre: fir-OST005f: deleting orphan objects from 0x1ac0000402:920820 to 0x1ac0000402:920897 [Tue Dec 10 05:29:10 2019][83526.494815] Lustre: fir-OST005f: deleting orphan objects from 0x1ac0000401:11753307 to 0x1ac0000401:11753953 [Tue Dec 10 05:29:13 2019][83529.508304] Lustre: fir-OST0055: Denying connection for new client 3dc3e4b3-1daf-f260-3956-f8f68e141bca (at 10.9.117.42@o2ib4), waiting for 1291 known clients (1183 recovered, 107 in progress, and 1 evicted) to recover in 0:41 [Tue Dec 10 05:29:13 2019][83529.528358] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:29:14 2019][83530.457223] Lustre: fir-OST0059: Recovery over after 2:47, of 1291 clients 1290 recovered and 1 was evicted. 
[Tue Dec 10 05:29:14 2019][83530.548753] Lustre: fir-OST0059: deleting orphan objects from 0x1940000401:2999880 to 0x1940000401:3000289 [Tue Dec 10 05:29:14 2019][83530.672559] Lustre: fir-OST0059: deleting orphan objects from 0x1940000400:906085 to 0x1940000400:906145 [Tue Dec 10 05:29:14 2019][83530.685548] Lustre: fir-OST0059: deleting orphan objects from 0x0:27234514 to 0x0:27234529 [Tue Dec 10 05:29:14 2019][83530.709523] Lustre: fir-OST0059: deleting orphan objects from 0x1940000402:11643021 to 0x1940000402:11644417 [Tue Dec 10 05:29:21 2019][83537.618300] Lustre: 111966:0:(ldlm_lib.c:1765:extend_recovery_timer()) fir-OST005b: extended recovery timer reaching hard limit: 900, extend: 1 [Tue Dec 10 05:29:21 2019][83537.760812] Lustre: fir-OST005b: Recovery over after 2:55, of 1291 clients 1288 recovered and 3 were evicted. [Tue Dec 10 05:29:21 2019][83537.842315] Lustre: fir-OST005b: deleting orphan objects from 0x19c0000401:3019027 to 0x19c0000401:3020481 [Tue Dec 10 05:29:21 2019][83538.024106] Lustre: fir-OST005b: deleting orphan objects from 0x19c0000400:916141 to 0x19c0000400:916193 [Tue Dec 10 05:29:21 2019][83538.084457] Lustre: fir-OST005b: deleting orphan objects from 0x19c0000402:11725588 to 0x19c0000402:11726529 [Tue Dec 10 05:29:21 2019][83538.084461] Lustre: fir-OST005b: deleting orphan objects from 0x0:27420356 to 0x0:27420385 [Tue Dec 10 05:29:38 2019][83554.597085] Lustre: fir-OST0055: Denying connection for new client 3dc3e4b3-1daf-f260-3956-f8f68e141bca (at 10.9.117.42@o2ib4), waiting for 1291 known clients (1184 recovered, 106 in progress, and 1 evicted) to recover in 0:16 [Tue Dec 10 05:29:38 2019][83554.617160] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:30:03 2019][83579.685583] Lustre: fir-OST0055: Denying connection for new client 3dc3e4b3-1daf-f260-3956-f8f68e141bca (at 10.9.117.42@o2ib4), waiting for 1291 known clients (1184 recovered, 106 in progress, and 1 evicted) already passed deadline 0:08 [Tue Dec 10 05:30:04 2019][83580.462432] Lustre: fir-OST0058: Client 964f90b2-201f-0e40-0c9b-d52b03dcf753 (at 10.9.105.61@o2ib4) reconnecting [Tue Dec 10 05:30:04 2019][83580.472632] Lustre: Skipped 7273 previous similar messages [Tue Dec 10 05:30:04 2019][83580.478156] Lustre: fir-OST0058: Connection restored to 964f90b2-201f-0e40-0c9b-d52b03dcf753 (at 10.9.105.61@o2ib4) [Tue Dec 10 05:30:04 2019][83580.488588] Lustre: Skipped 17265 previous similar messages [Tue Dec 10 05:30:17 2019][83593.534006] Lustre: fir-OST0055: Recovery already passed deadline 0:22. If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST0055 abort_recovery. [Tue Dec 10 05:30:17 2019][83593.535058] Lustre: 112020:0:(ldlm_lib.c:1765:extend_recovery_timer()) fir-OST005d: extended recovery timer reaching hard limit: 900, extend: 1 [Tue Dec 10 05:30:17 2019][83593.535061] Lustre: 112020:0:(ldlm_lib.c:1765:extend_recovery_timer()) Skipped 154 previous similar messages [Tue Dec 10 05:30:17 2019][83593.572770] Lustre: Skipped 3 previous similar messages [Tue Dec 10 05:30:17 2019][83593.735590] Lustre: fir-OST005d: Recovery over after 3:51, of 1290 clients 1289 recovered and 1 was evicted. 
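The "extended recovery timer reaching hard limit: 900" lines above refer to the per-target recovery window, which is extended up to a hard cap. A minimal sketch for inspecting those limits, assuming they are exposed as the usual recovery_time_soft and recovery_time_hard target parameters (the exact parameter paths are an assumption here):

    # Soft and hard recovery window limits in seconds; 900 would match the
    # "hard limit: 900" reported by extend_recovery_timer() above
    lctl get_param obdfilter.fir-OST00*.recovery_time_soft obdfilter.fir-OST00*.recovery_time_hard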
[Tue Dec 10 05:30:17 2019][83593.787970] Lustre: fir-OST005d: deleting orphan objects from 0x1a40000402:3041004 to 0x1a40000402:3041729 [Tue Dec 10 05:30:17 2019][83593.958249] Lustre: fir-OST005d: deleting orphan objects from 0x0:27502209 to 0x0:27502241 [Tue Dec 10 05:30:17 2019][83593.958251] Lustre: fir-OST005d: deleting orphan objects from 0x1a40000400:11793335 to 0x1a40000400:11794593 [Tue Dec 10 05:30:17 2019][83594.116751] Lustre: fir-OST005d: deleting orphan objects from 0x1a40000401:922080 to 0x1a40000401:922113 [Tue Dec 10 05:30:24 2019][83600.533414] Lustre: fir-OST0055: Recovery already passed deadline 0:29. If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST0055 abort_recovery. [Tue Dec 10 05:30:28 2019][83604.774454] Lustre: fir-OST0055: Denying connection for new client 3dc3e4b3-1daf-f260-3956-f8f68e141bca (at 10.9.117.42@o2ib4), waiting for 1291 known clients (1184 recovered, 106 in progress, and 1 evicted) already passed deadline 0:33 [Tue Dec 10 05:30:35 2019][83611.533963] Lustre: fir-OST0055: Recovery already passed deadline 0:40. If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST0055 abort_recovery. [Tue Dec 10 05:30:35 2019][83611.550895] Lustre: 111654:0:(ldlm_lib.c:1765:extend_recovery_timer()) fir-OST0055: extended recovery timer reaching hard limit: 900, extend: 1 [Tue Dec 10 05:30:35 2019][83611.563857] Lustre: 111654:0:(ldlm_lib.c:1765:extend_recovery_timer()) Skipped 101 previous similar messages [Tue Dec 10 05:30:35 2019][83611.717356] Lustre: fir-OST0055: Recovery over after 4:10, of 1291 clients 1290 recovered and 1 was evicted. [Tue Dec 10 05:30:35 2019][83611.786226] Lustre: fir-OST0055: deleting orphan objects from 0x1840000400:11790357 to 0x1840000400:11790881 [Tue Dec 10 05:30:35 2019][83611.925706] Lustre: fir-OST0055: deleting orphan objects from 0x1840000402:3041849 to 0x1840000402:3042849 [Tue Dec 10 05:30:35 2019][83612.026730] Lustre: fir-OST0055: deleting orphan objects from 0x0:27493454 to 0x0:27493473 [Tue Dec 10 05:30:35 2019][83612.056601] Lustre: fir-OST0055: deleting orphan objects from 0x1840000401:923366 to 0x1840000401:923457 [Tue Dec 10 05:33:40 2019][83796.700915] Lustre: 63864:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575984813/real 1575984813] req@ffff89077f17b600 x1652452421759024/t0(0) o400->MGC10.0.10.51@o2ib7@10.0.10.51@o2ib7:26/25 lens 224/224 e 0 to 1 dl 1575984820 ref 1 fl Rpc:XN/0/ffffffff rc 0/-1 [Tue Dec 10 05:33:40 2019][83796.728970] Lustre: 63864:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 1 previous similar message [Tue Dec 10 05:33:40 2019][83796.738637] LustreError: 166-1: MGC10.0.10.51@o2ib7: Connection to MGS (at 10.0.10.51@o2ib7) was lost; in progress operations using this service will fail [Tue Dec 10 05:34:36 2019][83852.646060] LNetError: 63820:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Timed out tx: active_txs, 0 seconds [Tue Dec 10 05:34:36 2019][83852.656233] LNetError: 63820:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Timed out RDMA with 10.0.10.51@o2ib7 (56): c: 5, oc: 0, rc: 8 [Tue Dec 10 05:34:36 2019][83852.668738] LNetError: 63832:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.51@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 05:34:36 2019][83852.681607] LNetError: 63832:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) Skipped 4 previous similar messages [Tue Dec 10 05:34:36 2019][83852.751315] LNetError: 108379:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:34:37 2019][83853.751219] LNetError: 108379:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:34:38 2019][83854.751398] LNetError: 108379:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:34:49 2019][83865.751890] LNetError: 108379:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:34:59 2019][83875.751981] LNetError: 108379:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:35:14 2019][83890.752457] LNetError: 108379:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:35:44 2019][83920.753034] LNetError: 108379:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:35:44 2019][83920.765121] LNetError: 108379:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 5 previous similar messages [Tue Dec 10 05:35:45 2019][83922.183649] Lustre: Evicted from MGS (at MGC10.0.10.51@o2ib7_1) after server handle changed from 0xdff031726fbff0e1 to 0xbba64b52f329a2a4 [Tue Dec 10 05:36:05 2019][83941.647915] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:36:20 2019][83956.648211] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:36:20 2019][83956.658316] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:36:20 2019][83956.670341] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 1 previous similar message [Tue Dec 10 05:36:35 2019][83971.648522] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:36:50 2019][83986.648829] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:37:05 2019][84001.649136] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:37:20 2019][84016.649449] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:37:35 2019][84031.649779] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 05:37:35 2019][84031.661773] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 4 previous similar messages [Tue Dec 10 05:37:51 2019][84047.650090] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 1 seconds [Tue Dec 10 05:37:51 2019][84047.660177] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 1 previous similar message [Tue Dec 10 05:38:16 2019][84072.714709] Lustre: fir-MDT0002-lwp-OST005a: Connection to fir-MDT0002 (at 10.0.10.53@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Tue Dec 10 05:38:16 2019][84072.730692] Lustre: Skipped 10 previous similar messages [Tue Dec 10 05:38:35 2019][84091.650987] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:38:35 2019][84091.661067] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 2 previous similar messages [Tue Dec 10 05:38:38 2019][84094.908236] Lustre: fir-OST005f: Connection restored to 19d091c7-bad9-3fc5-d8c7-1acb2d646997 (at 10.9.114.9@o2ib4) [Tue Dec 10 05:38:38 2019][84094.918592] Lustre: Skipped 11 previous similar messages [Tue Dec 10 05:38:53 2019][84109.460494] LustreError: 68040:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0059: cli 52008b8a-1aae-c71d-80d5-aeea34862c6c claims 16801792 GRANT, real grant 16752640 [Tue Dec 10 05:38:53 2019][84109.475266] LustreError: 68040:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 3 previous similar messages [Tue Dec 10 05:39:01 2019][84117.681837] LustreError: 67972:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0059: cli 52008b8a-1aae-c71d-80d5-aeea34862c6c claims 16801792 GRANT, real grant 0 [Tue Dec 10 05:39:03 2019][84120.010758] LustreError: 67824:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0059: cli 52008b8a-1aae-c71d-80d5-aeea34862c6c claims 4218880 GRANT, real grant 0 [Tue Dec 10 05:39:16 2019][84133.064019] LustreError: 67828:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0059: cli 52008b8a-1aae-c71d-80d5-aeea34862c6c claims 16801792 GRANT, real grant 0 [Tue Dec 10 05:39:24 2019][84141.024594] LustreError: 68046:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0059: cli 52008b8a-1aae-c71d-80d5-aeea34862c6c claims 16801792 GRANT, real grant 0 [Tue Dec 10 05:39:45 2019][84161.652430] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:39:45 2019][84161.662513] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 52 previous similar messages [Tue Dec 10 05:39:45 2019][84161.671917] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 05:39:45 2019][84161.683960] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 12 previous similar messages [Tue Dec 10 05:41:06 2019][84243.112384] LustreError: 68023:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0059: cli 52008b8a-1aae-c71d-80d5-aeea34862c6c claims 16801792 GRANT, real grant 13017088 [Tue Dec 10 05:41:06 2019][84243.127158] LustreError: 68023:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 1 previous similar message [Tue Dec 10 05:41:58 2019][84295.432583] Lustre: fir-OST0055: deleting orphan objects from 0x1840000402:3042880 to 0x1840000402:3042913 [Tue Dec 10 05:41:58 2019][84295.432586] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3030047 to 0x1800000400:3030113 [Tue Dec 10 05:41:58 2019][84295.432610] Lustre: fir-OST0059: deleting orphan objects from 0x1940000401:3000337 to 0x1940000401:3000353 [Tue Dec 10 05:41:58 2019][84295.432624] Lustre: fir-OST005b: deleting orphan objects from 0x19c0000401:3020523 to 0x19c0000401:3020545 [Tue Dec 10 05:41:58 2019][84295.432627] Lustre: fir-OST0057: deleting orphan objects from 0x18c0000400:3037677 to 0x18c0000400:3037697 [Tue Dec 10 05:41:58 2019][84295.432628] Lustre: fir-OST0056: deleting orphan objects from 0x1880000402:3038989 to 0x1880000402:3039073 [Tue Dec 10 05:41:59 2019][84295.432630] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3045330 to 0x1900000402:3045345 [Tue Dec 10 05:41:59 2019][84295.432631] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3041125 to 0x1a80000402:3041217 [Tue Dec 10 05:41:59 2019][84295.434960] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3048955 to 0x1980000402:3049025 [Tue Dec 10 05:41:59 2019][84295.434965] Lustre: fir-OST005d: deleting orphan objects from 0x1a40000402:3041777 to 0x1a40000402:3041793 [Tue Dec 10 05:41:59 2019][84295.434975] Lustre: fir-OST005f: deleting orphan objects from 0x1ac0000400:3045911 to 0x1ac0000400:3045953 [Tue Dec 10 05:41:59 2019][84295.434977] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3008777 to 0x1a00000401:3008865 [Tue Dec 10 05:42:01 2019][84297.655221] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 1 seconds [Tue Dec 10 05:42:01 2019][84297.665308] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 56 previous similar messages [Tue Dec 10 05:42:02 2019][84298.511476] LustreError: 167-0: fir-MDT0002-lwp-OST0054: This client was evicted by fir-MDT0002; in progress operations using this service will fail. [Tue Dec 10 05:42:08 2019][84304.867327] LustreError: 67723:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0059: cli 52008b8a-1aae-c71d-80d5-aeea34862c6c claims 16801792 GRANT, real grant 7770112 [Tue Dec 10 05:44:04 2019][84420.763267] LNetError: 113073:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:44:04 2019][84420.775355] LNetError: 113073:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 22 previous similar messages [Tue Dec 10 05:46:26 2019][84562.660654] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 1 seconds [Tue Dec 10 05:46:26 2019][84562.670737] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 9 previous similar messages [Tue Dec 10 05:52:51 2019][84947.668501] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 05:52:51 2019][84947.680497] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 34 previous similar messages [Tue Dec 10 05:55:05 2019][85081.671234] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:55:05 2019][85081.681315] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 34 previous similar messages [Tue Dec 10 06:03:01 2019][85557.680771] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 06:03:01 2019][85557.692760] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 40 previous similar messages [Tue Dec 10 06:05:11 2019][85687.683334] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 1 seconds [Tue Dec 10 06:05:11 2019][85687.693418] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 40 previous similar messages [Tue Dec 10 06:13:10 2019][86166.692984] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 06:13:10 2019][86166.704978] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 42 previous similar messages [Tue Dec 10 06:13:23 2019][86180.149321] Lustre: fir-MDT0002-lwp-OST005a: Connection to fir-MDT0002 (at 10.0.10.54@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Tue Dec 10 06:13:23 2019][86180.165349] Lustre: Skipped 11 previous similar messages [Tue Dec 10 06:14:41 2019][86257.533861] Lustre: fir-OST0054: Connection restored to fir-MDT0002-mdtlov_UUID (at 10.0.10.53@o2ib7) [Tue Dec 10 06:14:41 2019][86257.543088] Lustre: Skipped 28 previous similar messages [Tue Dec 10 06:15:04 2019][86280.503488] LustreError: 167-0: fir-MDT0002-lwp-OST005e: This client was evicted by fir-MDT0002; in progress operations using this service will fail. 
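On the tgt_grant_check() messages above, where a client claims more GRANT than the server believes it handed out: a minimal sketch for looking at the server-side grant accounting on the OST named in those messages, assuming the standard obdfilter grant counters (the parameter names are an assumption):

    # Total space the OST thinks it has granted to clients, plus pending and dirty totals
    lctl get_param obdfilter.fir-OST0059.tot_granted obdfilter.fir-OST0059.tot_pending obdfilter.fir-OST0059.tot_dirty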
[Tue Dec 10 06:15:04 2019][86280.516908] LustreError: Skipped 11 previous similar messages [Tue Dec 10 06:15:18 2019][86295.454034] Lustre: fir-OST005d: deleting orphan objects from 0x1a40000402:3041953 to 0x1a40000402:3041985 [Tue Dec 10 06:15:18 2019][86295.454036] Lustre: fir-OST0055: deleting orphan objects from 0x1840000402:3043079 to 0x1840000402:3043137 [Tue Dec 10 06:15:18 2019][86295.454037] Lustre: fir-OST005f: deleting orphan objects from 0x1ac0000400:3046117 to 0x1ac0000400:3046177 [Tue Dec 10 06:15:18 2019][86295.454040] Lustre: fir-OST0059: deleting orphan objects from 0x1940000401:3000495 to 0x1940000401:3000577 [Tue Dec 10 06:15:18 2019][86295.454056] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3045508 to 0x1900000402:3045537 [Tue Dec 10 06:15:18 2019][86295.454063] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3030279 to 0x1800000400:3030305 [Tue Dec 10 06:15:18 2019][86295.454081] Lustre: fir-OST0056: deleting orphan objects from 0x1880000402:3039249 to 0x1880000402:3039265 [Tue Dec 10 06:15:18 2019][86295.454108] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3049184 to 0x1980000402:3049217 [Tue Dec 10 06:15:18 2019][86295.454148] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3041372 to 0x1a80000402:3041409 [Tue Dec 10 06:15:19 2019][86295.454217] Lustre: fir-OST0057: deleting orphan objects from 0x18c0000400:3037845 to 0x18c0000400:3037889 [Tue Dec 10 06:15:19 2019][86295.454219] Lustre: fir-OST005b: deleting orphan objects from 0x19c0000401:3020727 to 0x19c0000401:3020769 [Tue Dec 10 06:15:19 2019][86295.454224] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3009028 to 0x1a00000401:3009057 [Tue Dec 10 06:15:25 2019][86301.695719] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 06:15:25 2019][86301.705801] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 42 previous similar messages [Tue Dec 10 06:16:51 2019][86387.857445] Lustre: 63875:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575987404/real 1575987404] req@ffff8922f905ad00 x1652452422920864/t0(0) o400->fir-MDT0003-lwp-OST005f@10.0.10.54@o2ib7:12/10 lens 224/224 e 0 to 1 dl 1575987411 ref 1 fl Rpc:XN/0/ffffffff rc 0/-1 [Tue Dec 10 06:16:51 2019][86387.857447] Lustre: 63866:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575987404/real 1575987404] req@ffff8922f9059680 x1652452422920912/t0(0) o400->fir-MDT0003-lwp-OST005b@10.0.10.54@o2ib7:12/10 lens 224/224 e 0 to 1 dl 1575987411 ref 1 fl Rpc:XN/0/ffffffff rc 0/-1 [Tue Dec 10 06:16:51 2019][86387.857451] Lustre: 63866:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 2 previous similar messages [Tue Dec 10 06:16:51 2019][86387.857453] Lustre: fir-MDT0003-lwp-OST0059: Connection to fir-MDT0003 (at 10.0.10.54@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Tue Dec 10 06:16:51 2019][86387.857455] Lustre: Skipped 2 previous similar messages [Tue Dec 10 06:16:51 2019][86387.945142] Lustre: 63875:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 6 previous similar messages [Tue Dec 10 06:18:20 2019][86476.977328] Lustre: fir-OST0054: Connection restored to fir-MDT0002-mdtlov_UUID (at 10.0.10.53@o2ib7) [Tue Dec 10 06:18:20 2019][86476.986563] Lustre: Skipped 20 previous similar messages [Tue Dec 10 06:18:56 2019][86513.340197] LustreError: 167-0: 
fir-MDT0003-lwp-OST0054: This client was evicted by fir-MDT0003; in progress operations using this service will fail. [Tue Dec 10 06:18:56 2019][86513.353585] LustreError: Skipped 11 previous similar messages [Tue Dec 10 06:19:04 2019][86520.840041] Lustre: fir-OST0056: deleting orphan objects from 0x1880000400:11762168 to 0x1880000400:11762241 [Tue Dec 10 06:19:04 2019][86520.840046] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11785079 to 0x1a80000400:11785153 [Tue Dec 10 06:19:04 2019][86520.840059] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11781867 to 0x1900000401:11782049 [Tue Dec 10 06:19:04 2019][86520.840069] Lustre: fir-OST0055: deleting orphan objects from 0x1840000400:11791176 to 0x1840000400:11791265 [Tue Dec 10 06:19:04 2019][86520.840073] Lustre: fir-OST0059: deleting orphan objects from 0x1940000402:11644728 to 0x1940000402:11644769 [Tue Dec 10 06:19:04 2019][86520.840127] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11769931 to 0x1800000401:11769985 [Tue Dec 10 06:19:04 2019][86520.840129] Lustre: fir-OST005b: deleting orphan objects from 0x19c0000402:11726859 to 0x19c0000402:11726881 [Tue Dec 10 06:19:04 2019][86520.840131] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11826483 to 0x1980000401:11826657 [Tue Dec 10 06:19:04 2019][86520.840132] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000400:11638479 to 0x1a00000400:11638529 [Tue Dec 10 06:19:04 2019][86520.840179] Lustre: fir-OST0057: deleting orphan objects from 0x18c0000401:11793269 to 0x18c0000401:11793313 [Tue Dec 10 06:19:04 2019][86520.840181] Lustre: fir-OST005d: deleting orphan objects from 0x1a40000400:11794873 to 0x1a40000400:11794913 [Tue Dec 10 06:19:04 2019][86520.840190] Lustre: fir-OST005f: deleting orphan objects from 0x1ac0000401:11754239 to 0x1ac0000401:11754273 [Tue Dec 10 06:20:12 2019][86588.605617] Lustre: fir-MDT0001-lwp-OST005a: Connection to fir-MDT0001 (at 10.0.10.52@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Tue Dec 10 06:20:12 2019][86588.621603] Lustre: Skipped 19 previous similar messages [Tue Dec 10 06:22:12 2019][86708.846358] Lustre: fir-OST0054: Connection restored to fir-MDT0001-mdtlov_UUID (at 10.0.10.51@o2ib7) [Tue Dec 10 06:22:12 2019][86708.855584] Lustre: Skipped 21 previous similar messages [Tue Dec 10 06:22:37 2019][86734.084538] Lustre: fir-OST0055: deleting orphan objects from 0x1840000401:923481 to 0x1840000401:923521 [Tue Dec 10 06:22:37 2019][86734.084540] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:918591 to 0x1800000402:918625 [Tue Dec 10 06:22:37 2019][86734.084543] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:922723 to 0x1900000400:922753 [Tue Dec 10 06:22:37 2019][86734.084545] Lustre: fir-OST0057: deleting orphan objects from 0x18c0000402:922950 to 0x18c0000402:922977 [Tue Dec 10 06:22:37 2019][86734.084549] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:907512 to 0x1a00000402:907553 [Tue Dec 10 06:22:37 2019][86734.084551] Lustre: fir-OST0059: deleting orphan objects from 0x1940000400:906186 to 0x1940000400:906209 [Tue Dec 10 06:22:37 2019][86734.084553] Lustre: fir-OST0056: deleting orphan objects from 0x1880000401:920974 to 0x1880000401:920993 [Tue Dec 10 06:22:37 2019][86734.084571] Lustre: fir-OST005f: deleting orphan objects from 0x1ac0000402:920947 to 0x1ac0000402:920993 [Tue Dec 10 06:22:37 2019][86734.084576] Lustre: fir-OST005e: deleting orphan objects from 
0x1a80000401:920757 to 0x1a80000401:920801 [Tue Dec 10 06:22:37 2019][86734.084583] Lustre: fir-OST005b: deleting orphan objects from 0x19c0000400:916221 to 0x19c0000400:916257 [Tue Dec 10 06:22:37 2019][86734.084584] Lustre: fir-OST005d: deleting orphan objects from 0x1a40000401:922148 to 0x1a40000401:922177 [Tue Dec 10 06:22:37 2019][86734.085560] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:924075 to 0x1980000400:924097 [Tue Dec 10 06:22:42 2019][86739.136744] LustreError: 167-0: fir-MDT0001-lwp-OST005e: This client was evicted by fir-MDT0001; in progress operations using this service will fail. [Tue Dec 10 06:22:42 2019][86739.150146] LustreError: Skipped 11 previous similar messages [Tue Dec 10 06:41:10 2019][87846.814365] Lustre: Failing over fir-OST005d [Tue Dec 10 06:41:10 2019][87846.861997] Lustre: fir-OST0059: Not available for connect from 10.8.30.2@o2ib6 (stopping) [Tue Dec 10 06:41:10 2019][87846.870267] Lustre: Skipped 1 previous similar message [Tue Dec 10 06:41:10 2019][87846.932238] LustreError: 114752:0:(ldlm_resource.c:1147:ldlm_resource_complain()) filter-fir-OST0059_UUID: namespace resource [0x1940000401:0x2dc03d:0x0].0x0 (ffff88ed360c12c0) refcount nonzero (2) after lock cleanup; forcing cleanup. [Tue Dec 10 06:41:10 2019][87847.376418] Lustre: fir-OST0055: Not available for connect from 10.9.113.11@o2ib4 (stopping) [Tue Dec 10 06:41:10 2019][87847.384864] Lustre: Skipped 343 previous similar messages [Tue Dec 10 06:41:11 2019][87848.380594] Lustre: fir-OST0057: Not available for connect from 10.9.105.65@o2ib4 (stopping) [Tue Dec 10 06:41:11 2019][87848.389032] Lustre: Skipped 628 previous similar messages [Tue Dec 10 06:41:12 2019][87849.162769] LustreError: 137-5: fir-OST005d_UUID: not available for connect from 10.8.23.24@o2ib6 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 06:41:12 2019][87849.180054] LustreError: Skipped 8474 previous similar messages [Tue Dec 10 06:41:13 2019][87849.962903] Lustre: server umount fir-OST0059 complete [Tue Dec 10 06:41:13 2019][87849.968053] Lustre: Skipped 2 previous similar messages [Tue Dec 10 06:41:14 2019][87850.574398] LustreError: 87509:0:(ldlm_lockd.c:2324:ldlm_cancel_handler()) ldlm_cancel from 10.9.108.71@o2ib4 arrived at 1575988874 with bad export cookie 8099382812963126271 [Tue Dec 10 06:41:14 2019][87850.589954] LustreError: 87509:0:(ldlm_lockd.c:2324:ldlm_cancel_handler()) Skipped 1 previous similar message [Tue Dec 10 06:41:14 2019][87851.284712] LustreError: 66057:0:(ldlm_lockd.c:2324:ldlm_cancel_handler()) ldlm_cancel from 10.9.116.1@o2ib4 arrived at 1575988874 with bad export cookie 8099382812963119894 [Tue Dec 10 06:41:17 2019][87853.979725] md7: detected capacity change from 64011422924800 to 0 [Tue Dec 10 06:41:17 2019][87853.985921] md: md7 stopped. [Tue Dec 10 06:41:17 2019][87854.043783] md1: detected capacity change from 64011422924800 to 0 [Tue Dec 10 06:41:17 2019][87854.049978] md: md1 stopped. [Tue Dec 10 06:41:17 2019][87854.050954] md3: detected capacity change from 64011422924800 to 0 [Tue Dec 10 06:41:17 2019][87854.050958] md: md3 stopped. [Tue Dec 10 06:41:17 2019][87854.053291] md11: detected capacity change from 64011422924800 to 0 [Tue Dec 10 06:41:17 2019][87854.053302] md: md11 stopped. [Tue Dec 10 06:41:17 2019][87854.140475] md9: detected capacity change from 64011422924800 to 0 [Tue Dec 10 06:41:17 2019][87854.146665] md: md9 stopped. 
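After the "Failing over fir-OST005d" and "server umount ... complete" messages and the md arrays stopping above, a quick sketch for confirming what is still attached on this server; both commands are standard:

    # List the Lustre devices still set up on this OSS (targets not yet unmounted)
    lctl dl

    # Show which software-RAID arrays are still assembled
    cat /proc/mdstat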
[Tue Dec 10 06:41:19 2019][87855.711959] LustreError: 87509:0:(ldlm_lockd.c:2324:ldlm_cancel_handler()) ldlm_cancel from 10.9.101.49@o2ib4 arrived at 1575988879 with bad export cookie 8099382812963135658 [Tue Dec 10 06:41:19 2019][87855.727511] LustreError: 87509:0:(ldlm_lockd.c:2324:ldlm_cancel_handler()) Skipped 3 previous similar messages [Tue Dec 10 06:41:19 2019][87856.355456] md5: detected capacity change from 64011422924800 to 0 [Tue Dec 10 06:41:19 2019][87856.361657] md: md5 stopped. [Tue Dec 10 06:41:20 2019][87857.071054] md: md7 stopped. [Tue Dec 10 06:41:20 2019][87857.072014] md: md3 stopped. [Tue Dec 10 06:41:21 2019][87858.074149] md: md3 stopped. [Tue Dec 10 06:41:25 2019][87861.751375] LustreError: 68056:0:(ldlm_lockd.c:2324:ldlm_cancel_handler()) ldlm_cancel from 10.9.101.13@o2ib4 arrived at 1575988885 with bad export cookie 8099382812963145024 [Tue Dec 10 06:41:32 2019][87869.272333] LustreError: 87509:0:(ldlm_lockd.c:2324:ldlm_cancel_handler()) ldlm_cancel from 10.9.101.47@o2ib4 arrived at 1575988892 with bad export cookie 8099382812963143897 [Tue Dec 10 06:41:32 2019][87869.287880] LustreError: 87509:0:(ldlm_lockd.c:2324:ldlm_cancel_handler()) Skipped 3 previous similar messages [Tue Dec 10 06:41:46 2019][87883.295171] LustreError: 137-5: fir-OST005b_UUID: not available for connect from 10.9.117.22@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 06:41:46 2019][87883.312550] LustreError: Skipped 6249 previous similar messages [Tue Dec 10 06:42:50 2019][87947.296381] LustreError: 137-5: fir-OST0055_UUID: not available for connect from 10.8.27.2@o2ib6 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 06:42:50 2019][87947.313575] LustreError: Skipped 7797 previous similar messages [Tue Dec 10 06:44:58 2019][88075.492651] LustreError: 137-5: fir-OST005f_UUID: not available for connect from 10.9.101.18@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 06:44:58 2019][88075.510050] LustreError: Skipped 13956 previous similar messages [Tue Dec 10 07:01:05 2019][89042.543427] LustreError: 11-0: fir-MDT0000-lwp-OST005e: operation ldlm_enqueue to node 10.0.10.52@o2ib7 failed: rc = -107 [Tue Dec 10 07:01:05 2019][89042.543431] Lustre: fir-MDT0000-lwp-OST0056: Connection to fir-MDT0000 (at 10.0.10.52@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Tue Dec 10 07:01:06 2019][89042.543433] Lustre: Skipped 2 previous similar messages [Tue Dec 10 07:01:06 2019][89042.575605] LustreError: Skipped 30 previous similar messages [Tue Dec 10 07:02:27 2019][89123.752807] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.54@o2ib7: 0 seconds [Tue Dec 10 07:02:27 2019][89123.762893] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 112 previous similar messages [Tue Dec 10 07:02:27 2019][89123.772391] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 07:02:27 2019][89123.784389] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 34 previous similar messages [Tue Dec 10 07:03:35 2019][89191.907564] Lustre: fir-OST0054: Connection restored to fir-MDT0001-mdtlov_UUID (at 10.0.10.51@o2ib7) [Tue Dec 10 07:03:35 2019][89191.916813] Lustre: Skipped 23 previous similar messages [Tue Dec 10 07:04:06 2019][89222.899028] LustreError: 167-0: fir-MDT0000-lwp-OST005c: This client was evicted by fir-MDT0000; in progress operations using this service will fail. [Tue Dec 10 07:04:06 2019][89222.912416] LustreError: Skipped 11 previous similar messages [Tue Dec 10 07:04:06 2019][89222.919007] LustreError: 63873:0:(client.c:1197:ptlrpc_import_delay_req()) @@@ invalidate in flight req@ffff890de48d3f00 x1652452423870288/t0(0) o103->fir-MDT0000-lwp-OST005e@10.0.10.51@o2ib7:17/18 lens 328/224 e 0 to 0 dl 0 ref 1 fl Rpc:W/0/ffffffff rc 0/-1 [Tue Dec 10 07:04:06 2019][89223.159901] LustreError: 11-0: fir-MDT0000-lwp-OST005c: operation quota_acquire to node 10.0.10.51@o2ib7 failed: rc = -11 [Tue Dec 10 07:04:06 2019][89223.170875] LustreError: Skipped 3 previous similar messages [Tue Dec 10 07:04:08 2019][89224.874759] LustreError: 11-0: fir-MDT0000-lwp-OST005a: operation quota_acquire to node 10.0.10.51@o2ib7 failed: rc = -11 [Tue Dec 10 07:04:08 2019][89224.885718] LustreError: Skipped 16 previous similar messages [Tue Dec 10 07:04:14 2019][89230.696477] Lustre: fir-OST0056: deleting orphan objects from 0x0:27467647 to 0x0:27467681 [Tue Dec 10 07:04:14 2019][89230.696513] Lustre: fir-OST005c: deleting orphan objects from 0x0:27180265 to 0x0:27180289 [Tue Dec 10 07:04:14 2019][89230.696575] Lustre: fir-OST005a: deleting orphan objects from 0x0:27549741 to 0x0:27549761 [Tue Dec 10 07:04:14 2019][89230.696608] Lustre: fir-OST0054: deleting orphan objects from 0x0:27445580 to 0x0:27445601 [Tue Dec 10 07:04:14 2019][89230.696647] Lustre: fir-OST005e: deleting orphan objects from 0x0:27454900 to 0x0:27454945 [Tue Dec 10 07:04:14 2019][89230.697022] Lustre: fir-OST0058: deleting orphan objects from 0x0:27494442 to 0x0:27494465 [Tue Dec 10 07:04:57 2019][89273.755955] LNetError: 114061:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 07:04:57 2019][89273.768058] LNetError: 114061:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 6 previous similar messages [Tue Dec 10 07:04:59 2019][89275.873243] Lustre: 63870:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has failed due to network error: [sent 1575990296/real 1575990299] req@ffff89129e81da00 x1652452423996192/t0(0) o400->MGC10.0.10.51@o2ib7@10.0.10.52@o2ib7:26/25 lens 224/224 e 0 to 1 dl 1575990303 ref 1 fl Rpc:eXN/0/ffffffff rc 0/-1 [Tue Dec 10 07:04:59 2019][89275.901644] Lustre: 63870:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 2 previous similar messages [Tue Dec 10 07:04:59 2019][89275.911396] LustreError: 166-1: MGC10.0.10.51@o2ib7: Connection to MGS (at 10.0.10.52@o2ib7) was lost; in progress operations using this service will fail [Tue Dec 10 07:07:35 2019][89431.759049] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.52@o2ib7: 0 seconds [Tue Dec 10 07:07:35 2019][89431.769136] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 23 previous similar messages [Tue Dec 10 07:07:35 2019][89431.778562] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 07:07:35 2019][89431.790592] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 13 previous similar messages [Tue Dec 10 07:10:07 2019][89583.762119] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.52@o2ib7: 1 seconds [Tue Dec 10 07:10:07 2019][89583.772201] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 7 previous similar messages [Tue Dec 10 07:13:01 2019][89757.765638] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 07:13:01 2019][89757.777660] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 15 previous similar messages [Tue Dec 10 07:17:12 2019][90008.770606] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.52@o2ib7: 0 seconds [Tue Dec 10 07:17:12 2019][90008.780691] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 11 previous similar messages [Tue Dec 10 07:23:54 2019][90410.778568] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 07:23:54 2019][90410.790559] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 26 previous similar messages [Tue Dec 10 07:31:00 2019][90836.787083] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.52@o2ib7: 0 seconds [Tue Dec 10 07:31:00 2019][90836.797165] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 23 previous similar messages [Tue Dec 10 07:33:00 2019][90956.821732] Lustre: fir-MDT0003-lwp-OST005c: Connection to fir-MDT0003 (at 10.0.10.53@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Tue Dec 10 07:33:00 2019][90956.837717] Lustre: Skipped 9 previous similar messages [Tue Dec 10 07:34:25 2019][91042.163769] Lustre: fir-OST0054: Connection restored to fir-MDT0003-mdtlov_UUID (at 10.0.10.54@o2ib7) [Tue Dec 10 07:34:25 2019][91042.173003] Lustre: Skipped 8 previous similar messages [Tue Dec 10 07:35:05 2019][91082.264270] LustreError: 167-0: fir-MDT0003-lwp-OST005c: This client was evicted by fir-MDT0003; in progress operations using this service will fail. 
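The recurring "added to recovery queue. Health = 900" and "Timed out tx" lines above come from LNet health tracking of the local ni 10.0.10.115@o2ib7 and its peers. A minimal sketch for checking NI state and peer reachability with lnetctl, assuming it is installed on this server; health counters may only appear at higher verbosity levels depending on the build:

    # Show local network interfaces with statistics (use a higher verbosity, e.g. -v 3,
    # on builds where health counters are not shown by default)
    lnetctl net show -v

    # Probe the peer that keeps timing out in the messages above
    lnetctl ping 10.0.10.51@o2ib7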
[Tue Dec 10 07:35:05 2019][91082.277659] LustreError: Skipped 5 previous similar messages [Tue Dec 10 07:35:05 2019][91082.285228] Lustre: fir-MDT0003-lwp-OST005c: Connection restored to 10.0.10.54@o2ib7 (at 10.0.10.54@o2ib7) [Tue Dec 10 07:35:05 2019][91082.294896] Lustre: Skipped 6 previous similar messages [Tue Dec 10 07:35:33 2019][91110.101659] Lustre: fir-OST0056: deleting orphan objects from 0x1880000400:11762647 to 0x1880000400:11762689 [Tue Dec 10 07:35:33 2019][91110.101679] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000400:11638916 to 0x1a00000400:11638945 [Tue Dec 10 07:35:33 2019][91110.101680] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11782453 to 0x1900000401:11782497 [Tue Dec 10 07:35:33 2019][91110.101803] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11785582 to 0x1a80000400:11785601 [Tue Dec 10 07:35:33 2019][91110.101804] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11770384 to 0x1800000401:11770401 [Tue Dec 10 07:35:33 2019][91110.101805] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11827087 to 0x1980000401:11827105 [Tue Dec 10 07:38:01 2019][91257.883712] Lustre: fir-MDT0001-lwp-OST005a: Connection to fir-MDT0001 (at 10.0.10.51@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Tue Dec 10 07:38:01 2019][91257.899712] Lustre: Skipped 5 previous similar messages [Tue Dec 10 07:39:02 2019][91318.892140] Lustre: fir-OST0054: Connection restored to fir-MDT0001-mdtlov_UUID (at 10.0.10.52@o2ib7) [Tue Dec 10 07:39:02 2019][91318.901362] Lustre: Skipped 3 previous similar messages [Tue Dec 10 07:40:06 2019][91383.326212] LustreError: 167-0: fir-MDT0001-lwp-OST005a: This client was evicted by fir-MDT0001; in progress operations using this service will fail. [Tue Dec 10 07:40:06 2019][91383.339597] LustreError: Skipped 5 previous similar messages [Tue Dec 10 07:40:06 2019][91383.347176] Lustre: fir-MDT0001-lwp-OST005a: Connection restored to 10.0.10.52@o2ib7 (at 10.0.10.52@o2ib7) [Tue Dec 10 07:40:06 2019][91383.356830] Lustre: Skipped 6 previous similar messages [Tue Dec 10 07:40:19 2019][91396.114129] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:907594 to 0x1a00000402:907617 [Tue Dec 10 07:40:19 2019][91396.114130] Lustre: fir-OST0056: deleting orphan objects from 0x1880000401:921038 to 0x1880000401:921057 [Tue Dec 10 07:40:19 2019][91396.114134] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:918672 to 0x1800000402:918689 [Tue Dec 10 07:40:19 2019][91396.114203] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:924139 to 0x1980000400:924161 [Tue Dec 10 07:40:19 2019][91396.114266] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:920847 to 0x1a80000401:920865 [Tue Dec 10 07:40:19 2019][91396.114268] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:922805 to 0x1900000400:922849 [Tue Dec 10 07:41:21 2019][91458.591755] Lustre: Evicted from MGS (at 10.0.10.51@o2ib7) after server handle changed from 0xbba64b52f329a2a4 to 0xc3c20c0652556a2a [Tue Dec 10 07:41:21 2019][91458.603833] Lustre: MGC10.0.10.51@o2ib7: Connection restored to 10.0.10.51@o2ib7 (at 10.0.10.51@o2ib7) [Tue Dec 10 07:53:33 2019][92189.693251] Lustre: fir-OST0058: haven't heard from client 82b9ac9e-bd42-fb9c-cb3e-f327857b510c (at 10.9.0.62@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff892272a71400, cur 1575993213 expire 1575993063 last 1575992986 [Tue Dec 10 07:53:33 2019][92189.714880] Lustre: Skipped 5 previous similar messages [Tue Dec 10 08:34:30 2019][94647.453315] Lustre: fir-OST0054: Connection restored to 82b9ac9e-bd42-fb9c-cb3e-f327857b510c (at 10.9.0.62@o2ib4) [Tue Dec 10 08:34:30 2019][94647.463585] Lustre: Skipped 5 previous similar messages [Tue Dec 10 08:34:31 2019][94647.725707] Lustre: fir-OST0058: haven't heard from client cec884d3-ca4b-8127-2f6b-7762665aa5f8 (at 10.9.0.64@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892266f68800, cur 1575995671 expire 1575995521 last 1575995444 [Tue Dec 10 08:34:31 2019][94647.747347] Lustre: Skipped 5 previous similar messages [Tue Dec 10 09:11:00 2019][96836.831164] Lustre: fir-OST0054: Connection restored to cec884d3-ca4b-8127-2f6b-7762665aa5f8 (at 10.9.0.64@o2ib4) [Tue Dec 10 09:11:00 2019][96836.841425] Lustre: Skipped 5 previous similar messages [Tue Dec 10 09:18:12 2019][97268.763743] Lustre: fir-OST005a: haven't heard from client fb9a2d5e-e9b3-4fb9-b988-9954fcfb0920 (at 10.8.0.66@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8912f8ae5000, cur 1575998292 expire 1575998142 last 1575998065 [Tue Dec 10 09:18:12 2019][97268.785369] Lustre: Skipped 5 previous similar messages [Tue Dec 10 09:52:25 2019][99322.533302] Lustre: fir-OST0054: Connection restored to fb9a2d5e-e9b3-4fb9-b988-9954fcfb0920 (at 10.8.0.66@o2ib6) [Tue Dec 10 09:52:25 2019][99322.543567] Lustre: Skipped 5 previous similar messages [Tue Dec 10 10:17:14 2019][100810.839701] Lustre: fir-OST0058: haven't heard from client 40a204f8-61bd-7bf5-8e8b-66a640362528 (at 10.8.21.28@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8922569b5400, cur 1576001834 expire 1576001684 last 1576001607 [Tue Dec 10 10:17:14 2019][100810.861510] Lustre: Skipped 5 previous similar messages [Tue Dec 10 10:46:11 2019][102548.503191] Lustre: fir-OST0054: Connection restored to 0af2ee10-72ea-97a8-65e7-44544fdbc0b9 (at 10.9.108.39@o2ib4) [Tue Dec 10 10:46:11 2019][102548.513721] Lustre: Skipped 5 previous similar messages [Tue Dec 10 10:51:50 2019][102887.676821] Lustre: fir-OST0054: Connection restored to 6943a6ac-ba36-d287-3012-a3d9ab556566 (at 10.8.21.14@o2ib6) [Tue Dec 10 10:51:50 2019][102887.687255] Lustre: Skipped 5 previous similar messages [Tue Dec 10 10:52:05 2019][102902.279630] Lustre: fir-OST0054: Connection restored to 98c710cf-a183-35fe-d60d-8494e153f1c3 (at 10.8.21.13@o2ib6) [Tue Dec 10 10:52:05 2019][102902.290069] Lustre: Skipped 5 previous similar messages [Tue Dec 10 10:52:07 2019][102904.746400] Lustre: fir-OST0054: Connection restored to (at 10.8.21.8@o2ib6) [Tue Dec 10 10:52:07 2019][102904.746401] Lustre: fir-OST0056: Connection restored to (at 10.8.21.8@o2ib6) [Tue Dec 10 10:52:07 2019][102904.746404] Lustre: Skipped 6 previous similar messages [Tue Dec 10 10:52:07 2019][102904.766189] Lustre: Skipped 4 previous similar messages [Tue Dec 10 10:52:17 2019][102914.701723] Lustre: fir-OST0054: Connection restored to 40a204f8-61bd-7bf5-8e8b-66a640362528 (at 10.8.21.28@o2ib6) [Tue Dec 10 10:52:17 2019][102914.712164] Lustre: Skipped 11 previous similar messages [Tue Dec 10 10:52:36 2019][102933.490929] Lustre: fir-OST0054: Connection restored to 07312e22-36ea-cbe1-f5a7-b2f2d00651b0 (at 10.8.20.22@o2ib6) [Tue Dec 10 10:52:36 2019][102933.501368] Lustre: Skipped 23 previous similar messages [Tue Dec 10 10:53:10 2019][102967.732545] Lustre: fir-OST0054: Connection restored to 
b5be2f5f-0f09-196f-7061-da3a3aa7cecb (at 10.8.20.31@o2ib6) [Tue Dec 10 10:53:10 2019][102967.742984] Lustre: Skipped 83 previous similar messages [Tue Dec 10 10:54:19 2019][103036.782165] Lustre: fir-OST0054: Connection restored to a77d579b-bc84-7eca-11a1-85e2fd56cb4e (at 10.8.20.23@o2ib6) [Tue Dec 10 10:54:19 2019][103036.792604] Lustre: Skipped 106 previous similar messages [Tue Dec 10 10:55:36 2019][103113.335081] LustreError: 66125:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005e: cli 20463417-fb32-2f92-5aae-59bfa8e287e3 claims 14893056 GRANT, real grant 0 [Tue Dec 10 10:56:02 2019][103139.403106] LustreError: 67963:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli aa2a15c8-736e-708a-65eb-dfabc669a063 claims 28672 GRANT, real grant 0 [Tue Dec 10 11:08:59 2019][103916.737798] Lustre: fir-OST0054: Connection restored to 8003aaab-bcab-ef28-dd2b-704b0d862745 (at 10.8.22.12@o2ib6) [Tue Dec 10 11:08:59 2019][103916.748231] Lustre: Skipped 11 previous similar messages [Tue Dec 10 11:09:22 2019][103939.599954] Lustre: fir-OST0058: Connection restored to 92c08489-d99f-9692-0d8e-5d862ef77698 (at 10.8.22.5@o2ib6) [Tue Dec 10 11:09:22 2019][103939.610312] Lustre: Skipped 4 previous similar messages [Tue Dec 10 11:10:36 2019][104013.821679] Lustre: fir-OST0054: Connection restored to 37454ba9-0898-97b6-5a68-4a0682e739f8 (at 10.8.20.8@o2ib6) [Tue Dec 10 11:10:36 2019][104013.832031] Lustre: Skipped 18 previous similar messages [Tue Dec 10 11:43:24 2019][105981.538288] Lustre: fir-OST0054: Connection restored to b11f5302-9207-4a63-91bc-6141fa0b09e3 (at 10.8.22.4@o2ib6) [Tue Dec 10 11:43:24 2019][105981.548641] Lustre: Skipped 5 previous similar messages [Tue Dec 10 11:44:48 2019][106064.964622] LustreError: 68035:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0058: cli 5d5036fa-60c3-4 claims 16752640 GRANT, real grant 0 [Tue Dec 10 12:01:59 2019][107096.674363] LustreError: 67973:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 295209bb-0224-d868-bd7c-cd75c3b19a1c claims 200704 GRANT, real grant 0 [Tue Dec 10 12:05:18 2019][107295.378991] LustreError: 68000:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005c: cli 31722a42-53ae-b678-363c-dc0a8c0b6d11 claims 28672 GRANT, real grant 0 [Tue Dec 10 12:14:47 2019][107864.528049] LustreError: 67965:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005c: cli 5d5036fa-60c3-4 claims 2076672 GRANT, real grant 0 [Tue Dec 10 12:33:44 2019][109001.246063] perf: interrupt took too long (3130 > 3128), lowering kernel.perf_event_max_sample_rate to 63000 [Tue Dec 10 13:08:00 2019][111057.203872] Lustre: fir-OST0054: Connection restored to a5709ca0-bfe0-cc30-835f-99ba0583ca05 (at 10.8.20.27@o2ib6) [Tue Dec 10 13:08:00 2019][111057.214309] Lustre: Skipped 5 previous similar messages [Tue Dec 10 13:27:28 2019][112225.415351] LustreError: 67989:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005a: cli a83208a9-361d-4 claims 1597440 GRANT, real grant 0 [Tue Dec 10 13:44:48 2019][113265.085246] Lustre: fir-OST005e: haven't heard from client 8fbd1a16-d09d-1ef7-e10d-4e68dc0a9f97 (at 10.8.23.32@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892250f16c00, cur 1576014288 expire 1576014138 last 1576014061 [Tue Dec 10 13:44:48 2019][113265.107060] Lustre: Skipped 61 previous similar messages [Tue Dec 10 13:44:50 2019][113267.080987] Lustre: fir-OST005a: haven't heard from client 8fbd1a16-d09d-1ef7-e10d-4e68dc0a9f97 (at 10.8.23.32@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff88e322a4a800, cur 1576014290 expire 1576014140 last 1576014063 [Tue Dec 10 13:44:50 2019][113267.102809] Lustre: Skipped 1 previous similar message [Tue Dec 10 14:12:28 2019][114926.010694] LustreError: 67711:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 57b26761-b79f-628f-0ec2-0a10fd7ac3bd claims 28672 GRANT, real grant 0 [Tue Dec 10 14:21:04 2019][115441.676815] Lustre: fir-OST0054: Connection restored to (at 10.8.23.32@o2ib6) [Tue Dec 10 14:21:04 2019][115441.684128] Lustre: Skipped 5 previous similar messages [Tue Dec 10 14:47:57 2019][117054.836037] Lustre: fir-OST0054: Connection restored to 0bbd53e2-6989-83e6-f126-86a473496205 (at 10.8.21.36@o2ib6) [Tue Dec 10 14:47:57 2019][117054.846482] Lustre: Skipped 5 previous similar messages [Tue Dec 10 15:28:53 2019][119510.220122] Lustre: fir-OST0058: haven't heard from client ee4590b6-1057-e690-5db0-89b0af3963cd (at 10.8.22.30@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8922bfba1000, cur 1576020533 expire 1576020383 last 1576020306 [Tue Dec 10 15:28:53 2019][119510.241935] Lustre: Skipped 3 previous similar messages [Tue Dec 10 15:29:02 2019][119519.215315] Lustre: fir-OST0056: haven't heard from client ee4590b6-1057-e690-5db0-89b0af3963cd (at 10.8.22.30@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892251931400, cur 1576020542 expire 1576020392 last 1576020315 [Tue Dec 10 15:29:02 2019][119519.237123] Lustre: Skipped 3 previous similar messages [Tue Dec 10 15:35:10 2019][119887.411759] Lustre: fir-OST0054: Connection restored to c20915b7-72a8-8f0f-a961-7c81095a2283 (at 10.8.23.29@o2ib6) [Tue Dec 10 15:35:10 2019][119887.422195] Lustre: Skipped 5 previous similar messages [Tue Dec 10 16:03:38 2019][121596.087729] Lustre: fir-OST0054: Connection restored to ee4590b6-1057-e690-5db0-89b0af3963cd (at 10.8.22.30@o2ib6) [Tue Dec 10 16:03:38 2019][121596.098177] Lustre: Skipped 5 previous similar messages [Tue Dec 10 17:01:22 2019][125060.118327] Lustre: fir-OST0054: Connection restored to 7898fb8f-92c0-6a6b-8c01-20f1dcd2c072 (at 10.8.23.20@o2ib6) [Tue Dec 10 17:01:22 2019][125060.128767] Lustre: Skipped 5 previous similar messages [Tue Dec 10 17:09:05 2019][125522.869119] LustreError: 67911:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli da9f6e55-12b4-4 claims 1597440 GRANT, real grant 0 [Tue Dec 10 17:57:50 2019][128448.096826] LustreError: 67954:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005a: cli da9f6e55-12b4-4 claims 36864 GRANT, real grant 0 [Tue Dec 10 18:18:56 2019][129713.385351] Lustre: fir-OST0054: Connection restored to 43d748a2-b8c5-e7f9-8b00-d16d4390ff4d (at 10.8.22.6@o2ib6) [Tue Dec 10 18:18:56 2019][129713.395703] Lustre: Skipped 3 previous similar messages [Tue Dec 10 18:30:39 2019][130416.428824] Lustre: fir-OST005c: haven't heard from client b6bab463-5f5c-8f5c-f09a-8f0ce0f6e1cd (at 10.8.21.31@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8922f2259800, cur 1576031439 expire 1576031289 last 1576031212 [Tue Dec 10 18:30:39 2019][130416.450651] Lustre: Skipped 1 previous similar message [Tue Dec 10 18:31:55 2019][130492.437526] Lustre: fir-OST005e: haven't heard from client 7515dbe4-f1c8-844a-9186-76f9c6288c34 (at 10.9.104.2@o2ib4) in 222 seconds. I think it's dead, and I am evicting it. 
exp ffff892250bf5400, cur 1576031515 expire 1576031365 last 1576031293 [Tue Dec 10 18:31:55 2019][130492.459325] Lustre: Skipped 29 previous similar messages [Tue Dec 10 18:51:08 2019][131645.782803] Lustre: fir-OST0054: Connection restored to (at 10.9.114.14@o2ib4) [Tue Dec 10 18:51:08 2019][131645.790234] Lustre: Skipped 5 previous similar messages [Tue Dec 10 18:51:55 2019][131693.053219] Lustre: fir-OST0056: Connection restored to d66c3860-7975-f3f1-3866-4386eb6742ed (at 10.8.19.6@o2ib6) [Tue Dec 10 18:51:55 2019][131693.063577] Lustre: Skipped 5 previous similar messages [Tue Dec 10 18:55:04 2019][131881.923365] Lustre: fir-OST0054: Connection restored to 2295c161-47a8-c199-f8c4-4e53eff1b957 (at 10.9.110.71@o2ib4) [Tue Dec 10 18:55:04 2019][131881.923366] Lustre: fir-OST0058: Connection restored to 2295c161-47a8-c199-f8c4-4e53eff1b957 (at 10.9.110.71@o2ib4) [Tue Dec 10 18:55:04 2019][131881.944410] Lustre: Skipped 4 previous similar messages [Tue Dec 10 18:55:43 2019][131921.259280] Lustre: fir-OST0054: Connection restored to 67fb2ec4-7a5a-f103-4386-bc08c967f193 (at 10.9.107.9@o2ib4) [Tue Dec 10 18:55:43 2019][131921.269718] Lustre: Skipped 5 previous similar messages [Tue Dec 10 18:57:08 2019][132005.721784] Lustre: fir-OST0054: Connection restored to e8872901-9e69-2d9a-e57a-55077a64186b (at 10.9.109.25@o2ib4) [Tue Dec 10 18:57:08 2019][132005.721785] Lustre: fir-OST0058: Connection restored to e8872901-9e69-2d9a-e57a-55077a64186b (at 10.9.109.25@o2ib4) [Tue Dec 10 18:57:08 2019][132005.742828] Lustre: Skipped 4 previous similar messages [Tue Dec 10 18:59:18 2019][132136.020350] Lustre: fir-OST0054: Connection restored to 2ad8ff13-d978-9373-7245-882c6479cc4c (at 10.9.110.63@o2ib4) [Tue Dec 10 18:59:18 2019][132136.030875] Lustre: Skipped 5 previous similar messages [Tue Dec 10 19:03:19 2019][132377.338818] Lustre: fir-OST0054: Connection restored to b5acf087-1850-f5e1-236a-4cc1bab1a9f0 (at 10.9.104.34@o2ib4) [Tue Dec 10 19:03:19 2019][132377.349350] Lustre: Skipped 34 previous similar messages [Tue Dec 10 19:05:07 2019][132484.938830] Lustre: fir-OST0054: Connection restored to b6bab463-5f5c-8f5c-f09a-8f0ce0f6e1cd (at 10.8.21.31@o2ib6) [Tue Dec 10 19:05:07 2019][132484.938830] Lustre: fir-OST0056: Connection restored to b6bab463-5f5c-8f5c-f09a-8f0ce0f6e1cd (at 10.8.21.31@o2ib6) [Tue Dec 10 19:05:07 2019][132484.938833] Lustre: Skipped 18 previous similar messages [Tue Dec 10 19:05:07 2019][132484.965098] Lustre: Skipped 4 previous similar messages [Tue Dec 10 19:08:44 2019][132702.203103] Lustre: fir-OST0054: Connection restored to (at 10.8.28.9@o2ib6) [Tue Dec 10 19:08:44 2019][132702.210335] Lustre: Skipped 29 previous similar messages [Tue Dec 10 19:42:31 2019][134728.568717] Lustre: fir-OST0056: haven't heard from client aadbd140-afe6-3cc5-5efa-1bf64465f6e7 (at 10.8.20.34@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8922515e2800, cur 1576035751 expire 1576035601 last 1576035524 [Tue Dec 10 19:42:31 2019][134728.590521] Lustre: Skipped 77 previous similar messages [Tue Dec 10 19:42:37 2019][134734.512090] Lustre: fir-OST0054: haven't heard from client aadbd140-afe6-3cc5-5efa-1bf64465f6e7 (at 10.8.20.34@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892252a14c00, cur 1576035757 expire 1576035607 last 1576035530 [Tue Dec 10 19:42:41 2019][134738.545470] Lustre: fir-OST005c: haven't heard from client aadbd140-afe6-3cc5-5efa-1bf64465f6e7 (at 10.8.20.34@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff8922f23b4000, cur 1576035761 expire 1576035611 last 1576035534 [Tue Dec 10 19:42:41 2019][134738.567278] Lustre: Skipped 3 previous similar messages [Tue Dec 10 20:17:49 2019][136846.637044] Lustre: fir-OST0054: Connection restored to (at 10.8.20.34@o2ib6) [Tue Dec 10 20:17:49 2019][136846.644360] Lustre: Skipped 53 previous similar messages [Tue Dec 10 21:36:51 2019][141589.169921] Lustre: fir-OST0054: Connection restored to 55ff50e7-08a4-be07-5499-ccc18f03f2c9 (at 10.8.23.17@o2ib6) [Tue Dec 10 21:36:51 2019][141589.180357] Lustre: Skipped 5 previous similar messages [Tue Dec 10 22:43:25 2019][145583.045993] LustreError: 68031:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0058: cli c9911b4c-e55e-f4aa-416a-b652019239f7 claims 28672 GRANT, real grant 0 [Tue Dec 10 23:32:58 2019][148556.094351] Lustre: fir-OST0054: Connection restored to 77f07ca8-e3bd-72f6-4ac1-3da8889522b3 (at 10.8.22.19@o2ib6) [Tue Dec 10 23:32:58 2019][148556.094352] Lustre: fir-OST0056: Connection restored to 77f07ca8-e3bd-72f6-4ac1-3da8889522b3 (at 10.8.22.19@o2ib6) [Tue Dec 10 23:32:58 2019][148556.115220] Lustre: Skipped 4 previous similar messages [Tue Dec 10 23:41:20 2019][149058.222678] Lustre: fir-OST0054: Connection restored to 10918197-1d43-5fa6-1aea-d8f3cfbab80a (at 10.8.20.5@o2ib6) [Tue Dec 10 23:41:20 2019][149058.233028] Lustre: Skipped 5 previous similar messages [Wed Dec 11 01:11:55 2019][154493.133417] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055508/real 1576055508] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055515 ref 1 fl Rpc:X/0/ffffffff rc 0/-1 [Wed Dec 11 01:12:02 2019][154500.160563] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055515/real 1576055515] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055522 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Wed Dec 11 01:12:09 2019][154507.187710] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055522/real 1576055522] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055529 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Wed Dec 11 01:12:16 2019][154514.214855] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055529/real 1576055529] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055536 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Wed Dec 11 01:12:23 2019][154521.241994] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055536/real 1576055536] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055543 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Wed Dec 11 01:12:37 2019][154535.269278] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055550/real 1576055550] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055557 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Wed Dec 11 01:12:37 2019][154535.296553] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 1 previous similar message [Wed Dec 11 
01:12:58 2019][154556.307704] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055571/real 1576055571] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055578 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Wed Dec 11 01:12:58 2019][154556.334976] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 2 previous similar messages [Wed Dec 11 01:13:33 2019][154591.345414] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055606/real 1576055606] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055613 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Wed Dec 11 01:13:33 2019][154591.372668] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 4 previous similar messages [Wed Dec 11 01:14:43 2019][154661.384828] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055676/real 1576055676] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055683 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Wed Dec 11 01:14:43 2019][154661.412114] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 9 previous similar messages [Wed Dec 11 01:15:08 2019][154686.606341] LNet: Service thread pid 67842 was inactive for 200.46s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Wed Dec 11 01:15:08 2019][154686.623374] Pid: 67842, comm: ll_ost00_056 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Wed Dec 11 01:15:08 2019][154686.633949] Call Trace: [Wed Dec 11 01:15:08 2019][154686.636506] [] ptlrpc_set_wait+0x480/0x790 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.643193] [] ldlm_run_ast_work+0xd5/0x3a0 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.649970] [] ldlm_glimpse_locks+0x3b/0x100 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.656875] [] ofd_intent_policy+0x69b/0x920 [ofd] [Wed Dec 11 01:15:08 2019][154686.663456] [] ldlm_lock_enqueue+0x356/0xa20 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.670309] [] ldlm_handle_enqueue0+0xa56/0x15f0 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.677505] [] tgt_enqueue+0x62/0x210 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.683824] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.690877] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.698716] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.705172] [] kthread+0xd1/0xe0 [Wed Dec 11 01:15:08 2019][154686.710175] [] ret_from_fork_nospec_begin+0xe/0x21 [Wed Dec 11 01:15:08 2019][154686.716742] [] 0xffffffffffffffff [Wed Dec 11 01:15:08 2019][154686.721919] LustreError: dumping log to /tmp/lustre-log.1576055708.67842 [Wed Dec 11 01:15:15 2019][154692.916159] Lustre: fir-OST005c: haven't heard from client 09a03217-f2a1-2632-097f-38339f6cbc7c (at 10.8.22.1@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff892260120000, cur 1576055715 expire 1576055565 last 1576055488 [Wed Dec 11 01:15:15 2019][154692.937983] LustreError: 67842:0:(ldlm_lockd.c:681:ldlm_handle_ast_error()) ### client (nid 10.8.22.1@o2ib6) failed to reply to glimpse AST (req@ffff88ec68d40480 x1652452551974832 status 0 rc -5), evict it ns: filter-fir-OST0056_UUID lock: ffff88fdf7a76c00/0x7066c9c18d907795 lrc: 3/0,0 mode: PW/PW res: [0x1880000402:0x2e6270:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 67108864->68719476735) flags: 0x40000000000000 nid: 10.8.22.1@o2ib6 remote: 0x891f0e0311d6012b expref: 6 pid: 66438 timeout: 0 lvb_type: 0 [Wed Dec 11 01:15:15 2019][154692.983786] LustreError: 138-a: fir-OST0056: A client on nid 10.8.22.1@o2ib6 was evicted due to a lock glimpse callback time out: rc -5 [Wed Dec 11 01:15:15 2019][154692.996079] LustreError: Skipped 1 previous similar message [Wed Dec 11 01:15:15 2019][154693.001792] LNet: Service thread pid 67842 completed after 206.86s. This indicates the system was overloaded (too many service threads, or there were not enough hardware resources). [Wed Dec 11 01:15:17 2019][154695.053939] Lustre: fir-OST005e: haven't heard from client 09a03217-f2a1-2632-097f-38339f6cbc7c (at 10.8.22.1@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892250855400, cur 1576055717 expire 1576055567 last 1576055490 [Wed Dec 11 01:15:17 2019][154695.075659] Lustre: Skipped 4 previous similar messages [Wed Dec 11 01:16:06 2019][154744.471386] Lustre: fir-OST0054: Connection restored to 37c7e464-6686-fdc0-1c81-eae75026a910 (at 10.8.22.2@o2ib6) [Wed Dec 11 01:16:06 2019][154744.481736] Lustre: Skipped 5 previous similar messages [Wed Dec 11 01:16:39 2019][154776.988984] LustreError: 67983:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0058: cli 75ca7fbe-4dbb-5345-e1bf-3a337b10784c claims 28672 GRANT, real grant 0 [Wed Dec 11 01:17:07 2019][154805.181795] LustreError: 67824:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0058: cli 1ca33a17-2a16-9d12-d021-e37db0ce1d5c claims 28672 GRANT, real grant 0 [Wed Dec 11 01:47:41 2019][156639.530128] Lustre: fir-OST0054: Connection restored to 1b1ace85-4b01-f903-bb83-ddb9142a20b0 (at 10.8.23.25@o2ib6) [Wed Dec 11 01:47:41 2019][156639.540596] Lustre: Skipped 5 previous similar messages [Wed Dec 11 01:50:36 2019][156814.118108] Lustre: fir-OST0054: Connection restored to (at 10.8.22.1@o2ib6) [Wed Dec 11 01:50:36 2019][156814.125342] Lustre: Skipped 5 previous similar messages [Wed Dec 11 02:10:21 2019][157998.975312] Lustre: fir-OST0056: haven't heard from client d48dfcab-ce8f-b93c-3409-a3e76df7c945 (at 10.8.23.22@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892251bfa000, cur 1576059021 expire 1576058871 last 1576058794 [Wed Dec 11 02:10:25 2019][158003.001566] Lustre: fir-OST005c: haven't heard from client d48dfcab-ce8f-b93c-3409-a3e76df7c945 (at 10.8.23.22@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff892276b4d000, cur 1576059025 expire 1576058875 last 1576058798 [Wed Dec 11 02:10:25 2019][158003.023385] Lustre: Skipped 1 previous similar message [Wed Dec 11 02:46:47 2019][160185.543119] Lustre: fir-OST0054: Connection restored to d48dfcab-ce8f-b93c-3409-a3e76df7c945 (at 10.8.23.22@o2ib6) [Wed Dec 11 02:46:47 2019][160185.553555] Lustre: Skipped 5 previous similar messages [Wed Dec 11 07:09:33 2019][175952.204662] Lustre: fir-OST0054: Connection restored to 54375174-855e-4eb5-233f-bff7110a15a5 (at 10.8.22.7@o2ib6) [Wed Dec 11 07:09:33 2019][175952.215018] Lustre: Skipped 5 previous similar messages [Wed Dec 11 07:23:36 2019]SOL session closed by BMC [Wed Dec 11 07:23:36 2019]Error in SOL session [-- Console down -- Wed Dec 11 07:23:36 2019] [-- Console up -- Wed Dec 11 07:23:37 2019] [Wed Dec 11 07:23:37 2019]Acquiring startup lock...done [Wed Dec 11 07:23:37 2019]Info: SOL payload already de-activated [Wed Dec 11 07:23:37 2019][SOL Session operational. Use ~? for help] [Wed Dec 11 08:29:29 2019][180747.446024] Lustre: fir-OST0054: haven't heard from client 5a6b489d-8a0c-1dc7-c222-8c5330c92213 (at 10.8.8.20@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892252f4d800, cur 1576081769 expire 1576081619 last 1576081542 [Wed Dec 11 08:29:29 2019][180747.467764] Lustre: Skipped 3 previous similar messages [Wed Dec 11 08:32:30 2019][180928.431481] Lustre: fir-OST0054: haven't heard from client dcb788f4-67f3-4 (at 10.9.109.25@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff88f00a8e3c00, cur 1576081950 expire 1576081800 last 1576081723 [Wed Dec 11 08:32:30 2019][180928.451542] Lustre: Skipped 53 previous similar messages [Wed Dec 11 08:32:38 2019][180936.847782] Lustre: fir-OST0054: Connection restored to (at 10.9.107.20@o2ib4) [Wed Dec 11 08:32:38 2019][180936.855189] Lustre: Skipped 5 previous similar messages [Wed Dec 11 08:36:45 2019][181184.073424] Lustre: fir-OST0054: Connection restored to 2295c161-47a8-c199-f8c4-4e53eff1b957 (at 10.9.110.71@o2ib4) [Wed Dec 11 08:36:45 2019][181184.083951] Lustre: Skipped 5 previous similar messages [Wed Dec 11 08:37:00 2019][181199.055734] Lustre: fir-OST0056: Connection restored to e8872901-9e69-2d9a-e57a-55077a64186b (at 10.9.109.25@o2ib4) [Wed Dec 11 08:37:00 2019][181199.066249] Lustre: Skipped 5 previous similar messages [Wed Dec 11 08:45:05 2019][181684.308691] LustreError: 67945:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005a: cli da9f6e55-12b4-4 claims 1605632 GRANT, real grant 36864 [Wed Dec 11 08:53:23 2019][182181.803276] Lustre: fir-OST0054: Connection restored to 907ff646-c0ba-4 (at 10.9.117.46@o2ib4) [Wed Dec 11 08:53:23 2019][182181.811976] Lustre: Skipped 5 previous similar messages [Wed Dec 11 08:53:56 2019][182214.739773] Lustre: fir-OST0054: Connection restored to (at 10.8.9.1@o2ib6) [Wed Dec 11 08:53:56 2019][182214.746916] Lustre: Skipped 5 previous similar messages [Wed Dec 11 08:57:30 2019][182428.920092] Lustre: fir-OST0054: Connection restored to 8a77a7b3-28b8-5200-390a-7fe51bf1be0a (at 10.8.7.5@o2ib6) [Wed Dec 11 08:57:30 2019][182428.930351] Lustre: Skipped 5 previous similar messages [Wed Dec 11 08:59:03 2019][182522.314512] Lustre: fir-OST0054: Connection restored to 0df17536-86d5-4 (at 10.9.101.60@o2ib4) [Wed Dec 11 08:59:03 2019][182522.323216] Lustre: Skipped 5 previous similar messages [Wed Dec 11 08:59:20 2019][182538.970365] Lustre: fir-OST0054: Connection restored to 54fd6f2e-cb6c-4 (at 10.9.101.57@o2ib4) [Wed Dec 11 08:59:20 2019][182538.979070] Lustre: 
Skipped 5 previous similar messages [Wed Dec 11 08:59:30 2019][182549.155961] Lustre: fir-OST0054: Connection restored to (at 10.9.101.59@o2ib4) [Wed Dec 11 08:59:30 2019][182549.163362] Lustre: Skipped 5 previous similar messages [Wed Dec 11 09:00:59 2019][182638.397737] Lustre: fir-OST0054: Connection restored to 5a6b489d-8a0c-1dc7-c222-8c5330c92213 (at 10.8.8.20@o2ib6) [Wed Dec 11 09:00:59 2019][182638.408090] Lustre: Skipped 5 previous similar messages [Wed Dec 11 09:04:24 2019][182842.722377] Lustre: fir-OST0054: Connection restored to fc841094-f1fd-2756-1968-f74105b220e6 (at 10.8.8.30@o2ib6) [Wed Dec 11 09:04:24 2019][182842.732727] Lustre: Skipped 5 previous similar messages [Wed Dec 11 09:08:35 2019][183093.965890] Lustre: fir-OST0054: Connection restored to 8393b8d6-d8ea-1574-4a69-552de6648def (at 10.9.102.48@o2ib4) [Wed Dec 11 09:08:35 2019][183093.976416] Lustre: Skipped 16 previous similar messages [Wed Dec 11 09:16:13 2019][183552.110674] Lustre: fir-OST0054: Connection restored to 6676e5f3-c59e-c628-05b4-c9153b23c3f7 (at 10.8.21.16@o2ib6) [Wed Dec 11 09:16:13 2019][183552.121113] Lustre: Skipped 11 previous similar messages [Wed Dec 11 10:23:45 2019][187603.803944] LustreError: 67812:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 57b26761-b79f-628f-0ec2-0a10fd7ac3bd claims 212992 GRANT, real grant 28672 [Wed Dec 11 10:30:05 2019][187984.504138] LustreError: 67963:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli da9f6e55-12b4-4 claims 1605632 GRANT, real grant 1597440 [Wed Dec 11 10:38:39 2019][188497.658275] LustreError: 67893:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005e: cli 837641b0-d89a-c20b-3139-4eb8fe8d733b claims 110592 GRANT, real grant 0 [Wed Dec 11 11:29:00 2019][191519.399673] Lustre: fir-OST0054: Connection restored to 5ce2e68e-76b2-bbc3-75c5-66a5c2b02651 (at 10.8.23.15@o2ib6) [Wed Dec 11 11:29:00 2019][191519.410118] Lustre: Skipped 10 previous similar messages [Wed Dec 11 11:52:30 2019][192928.681711] Lustre: fir-OST005e: haven't heard from client 45ffa07c-203c-dad9-8f0d-e714fc6465b8 (at 10.8.22.11@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892250928400, cur 1576093950 expire 1576093800 last 1576093723 [Wed Dec 11 11:52:30 2019][192928.703538] Lustre: Skipped 11 previous similar messages [Wed Dec 11 12:20:27 2019][194605.727054] Lustre: fir-OST005e: haven't heard from client 704e8622-7442-8eb3-b4e3-c86a69ef45af (at 10.8.20.21@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892250aacc00, cur 1576095627 expire 1576095477 last 1576095400 [Wed Dec 11 12:20:27 2019][194605.748848] Lustre: Skipped 5 previous similar messages [Wed Dec 11 12:20:46 2019][194624.706806] Lustre: fir-OST0054: haven't heard from client 704e8622-7442-8eb3-b4e3-c86a69ef45af (at 10.8.20.21@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff892253eacc00, cur 1576095646 expire 1576095496 last 1576095419 [Wed Dec 11 12:20:46 2019][194624.728603] Lustre: Skipped 4 previous similar messages [Wed Dec 11 12:26:57 2019][194995.860637] Lustre: fir-OST0054: Connection restored to (at 10.8.22.11@o2ib6) [Wed Dec 11 12:26:57 2019][194995.867949] Lustre: Skipped 5 previous similar messages [Wed Dec 11 12:27:08 2019][195007.252896] Lustre: fir-OST0054: Connection restored to 4f86dcb5-8d8c-1599-bd44-005eb718eb65 (at 10.8.22.10@o2ib6) [Wed Dec 11 12:27:08 2019][195007.263331] Lustre: Skipped 5 previous similar messages [Wed Dec 11 12:28:04 2019][195063.004671] Lustre: fir-OST0054: Connection restored to a8841932-bc4a-ab11-1ace-8e1fdda46930 (at 10.8.23.23@o2ib6) [Wed Dec 11 12:28:04 2019][195063.015129] Lustre: Skipped 5 previous similar messages [Wed Dec 11 12:37:46 2019][195644.721852] Lustre: fir-OST0058: haven't heard from client c3415e6e-dda3-8602-28df-a932f656881d (at 10.9.112.17@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff89225390b000, cur 1576096666 expire 1576096516 last 1576096439 [Wed Dec 11 12:55:45 2019][196724.523607] Lustre: fir-OST0054: Connection restored to 704e8622-7442-8eb3-b4e3-c86a69ef45af (at 10.8.20.21@o2ib6) [Wed Dec 11 12:55:45 2019][196724.534047] Lustre: Skipped 5 previous similar messages [Wed Dec 11 12:56:17 2019][196756.062990] Lustre: fir-OST0054: Connection restored to (at 10.9.112.17@o2ib4) [Wed Dec 11 12:56:17 2019][196756.070397] Lustre: Skipped 5 previous similar messages [Wed Dec 11 13:02:17 2019][197116.468514] Lustre: fir-OST0054: Connection restored to (at 10.8.9.1@o2ib6) [Wed Dec 11 13:02:17 2019][197116.468515] Lustre: fir-OST0056: Connection restored to (at 10.8.9.1@o2ib6) [Wed Dec 11 13:02:17 2019][197116.482806] Lustre: Skipped 3 previous similar messages [Wed Dec 11 13:02:32 2019][197131.014869] Lustre: fir-OST0054: Connection restored to bdb2a993-354c-ddce-bf9d-5960b01c7975 (at 10.8.23.13@o2ib6) [Wed Dec 11 13:02:32 2019][197131.025313] Lustre: Skipped 5 previous similar messages [Wed Dec 11 13:04:06 2019][197225.346656] Lustre: fir-OST0054: Connection restored to 37c7e464-6686-fdc0-1c81-eae75026a910 (at 10.8.22.2@o2ib6) [Wed Dec 11 13:04:06 2019][197225.357004] Lustre: Skipped 5 previous similar messages [Wed Dec 11 13:05:44 2019][197323.297435] Lustre: fir-OST0054: Connection restored to (at 10.9.113.13@o2ib4) [Wed Dec 11 13:05:44 2019][197323.304838] Lustre: Skipped 5 previous similar messages [Wed Dec 11 13:07:05 2019][197404.416039] Lustre: fir-OST0054: Connection restored to 0df17536-86d5-4 (at 10.9.101.60@o2ib4) [Wed Dec 11 13:07:05 2019][197404.424752] Lustre: Skipped 5 previous similar messages [Wed Dec 11 13:12:37 2019][197736.574442] Lustre: fir-OST0054: Connection restored to 48f67746-6174-d4eb-bf6b-7295eeca30af (at 10.8.24.7@o2ib6) [Wed Dec 11 13:12:37 2019][197736.574443] Lustre: fir-OST0056: Connection restored to 48f67746-6174-d4eb-bf6b-7295eeca30af (at 10.8.24.7@o2ib6) [Wed Dec 11 13:12:37 2019][197736.574446] Lustre: Skipped 6 previous similar messages [Wed Dec 11 13:12:37 2019][197736.600487] Lustre: Skipped 4 previous similar messages [Wed Dec 11 13:37:40 2019][199238.819613] Lustre: fir-OST0058: haven't heard from client 000d6715-906a-fe00-99d9-1ba39760e7f7 (at 10.8.22.16@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff892253cbbc00, cur 1576100260 expire 1576100110 last 1576100033 [Wed Dec 11 13:37:40 2019][199238.841428] Lustre: Skipped 5 previous similar messages [Wed Dec 11 13:45:53 2019][199731.802323] Lustre: fir-OST0058: haven't heard from client 85fbdf3d-35db-072c-03b7-e9977baaa2bf (at 10.8.23.12@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892261a48c00, cur 1576100753 expire 1576100603 last 1576100526 [Wed Dec 11 13:45:53 2019][199731.824114] Lustre: Skipped 11 previous similar messages [Wed Dec 11 13:49:24 2019][199942.960635] Lustre: fir-OST0054: Connection restored to (at 10.8.23.12@o2ib6) [Wed Dec 11 13:49:24 2019][199942.967956] Lustre: Skipped 5 previous similar messages [Wed Dec 11 14:12:19 2019][201318.492614] Lustre: fir-OST0054: Connection restored to cea3a46a-6e64-ecd2-2636-1b7611592cd3 (at 10.8.23.8@o2ib6) [Wed Dec 11 14:12:19 2019][201318.502968] Lustre: Skipped 5 previous similar messages [Wed Dec 11 14:12:27 2019][201326.401557] Lustre: fir-OST0054: Connection restored to 60e7dd38-7049-6086-949c-b7f68f3f00ca (at 10.8.23.18@o2ib6) [Wed Dec 11 14:12:27 2019][201326.411991] Lustre: Skipped 5 previous similar messages [Wed Dec 11 14:12:38 2019][201337.051897] Lustre: fir-OST0054: Connection restored to 5bbbecd7-709a-9f29-693e-a19d73c8cefb (at 10.8.22.18@o2ib6) [Wed Dec 11 14:12:38 2019][201337.062358] Lustre: Skipped 5 previous similar messages [Wed Dec 11 14:12:55 2019][201354.235845] Lustre: fir-OST0054: Connection restored to 94396c8b-eccd-7da2-de85-f79420b2e641 (at 10.8.23.33@o2ib6) [Wed Dec 11 14:12:55 2019][201354.246291] Lustre: Skipped 11 previous similar messages [Wed Dec 11 14:30:44 2019][202422.854023] Lustre: fir-OST0054: haven't heard from client 8c2fd243-a078-4 (at 10.9.117.46@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff8902f96e1000, cur 1576103444 expire 1576103294 last 1576103217 [Wed Dec 11 14:30:44 2019][202422.874103] Lustre: Skipped 5 previous similar messages [Wed Dec 11 14:33:19 2019][202578.475456] Lustre: fir-OST0054: Connection restored to 907ff646-c0ba-4 (at 10.9.117.46@o2ib4) [Wed Dec 11 14:33:19 2019][202578.484188] Lustre: Skipped 17 previous similar messages [Wed Dec 11 14:37:18 2019][202817.834063] Lustre: fir-OST0054: Connection restored to fb63a42c-93f0-576d-f57c-a83fc4375277 (at 10.8.21.2@o2ib6) [Wed Dec 11 14:37:18 2019][202817.844415] Lustre: Skipped 2 previous similar messages [Wed Dec 11 14:37:30 2019][202829.180467] Lustre: fir-OST0056: Connection restored to (at 10.8.22.32@o2ib6) [Wed Dec 11 14:37:30 2019][202829.187793] Lustre: Skipped 5 previous similar messages [Wed Dec 11 15:05:15 2019][204494.660619] Lustre: fir-OST0054: Connection restored to 98b70d1a-7357-ff1b-1e1d-8bd68b6592c2 (at 10.8.23.27@o2ib6) [Wed Dec 11 15:05:15 2019][204494.671065] Lustre: Skipped 5 previous similar messages [Wed Dec 11 15:09:25 2019][204744.214693] Lustre: fir-OST0054: Connection restored to 84c69ebc-7dc0-678f-942c-60a0d29de5a5 (at 10.8.22.27@o2ib6) [Wed Dec 11 15:09:25 2019][204744.225127] Lustre: Skipped 5 previous similar messages [Wed Dec 11 15:18:50 2019][205309.175062] LustreError: 67988:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005c: cli 8442b5f1-7da8-4 claims 28672 GRANT, real grant 0 [Wed Dec 11 18:58:17 2019][218476.265970] Lustre: fir-OST0054: Connection restored to 0aa269ad-def9-3be3-d596-fd7c0af955fb (at 10.8.20.26@o2ib6) [Wed Dec 11 18:58:17 2019][218476.276413] Lustre: Skipped 5 previous similar messages [Wed Dec 11 19:34:35 2019][220654.286010] Lustre: fir-OST0054: Connection restored to 207217ac-1163-df36-3120-8bf6c3ecbb93 (at 10.8.23.21@o2ib6) [Wed Dec 11 19:34:35 2019][220654.296456] Lustre: Skipped 5 previous similar messages [Wed Dec 11 19:47:16 2019][221415.472120] LustreError: 67718:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005c: cli 35ba350a-bccc-3fd9-39f0-a94eca80785d claims 16752640 GRANT, real grant 0 [Wed Dec 11 21:40:15 2019][228194.663383] Lustre: fir-OST0054: Connection restored to e15078c5-8209-4 (at 10.8.25.17@o2ib6) [Wed Dec 11 21:40:15 2019][228194.671998] Lustre: Skipped 5 previous similar messages [Wed Dec 11 21:41:08 2019][228247.375046] Lustre: fir-OST0054: haven't heard from client e15078c5-8209-4 (at 10.8.25.17@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892253ace800, cur 1576129268 expire 1576129118 last 1576129041 [Wed Dec 11 21:41:08 2019][228247.395022] Lustre: Skipped 5 previous similar messages [Wed Dec 11 21:47:23 2019][228622.376907] Lustre: fir-OST0054: haven't heard from client 208ccf09-d6ca-4 (at 10.8.25.17@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff88f4f5725c00, cur 1576129643 expire 1576129493 last 1576129416 [Wed Dec 11 21:47:23 2019][228622.396920] Lustre: Skipped 5 previous similar messages [Wed Dec 11 22:06:24 2019][229763.991868] Lustre: fir-OST0054: Connection restored to e15078c5-8209-4 (at 10.8.25.17@o2ib6) [Wed Dec 11 22:06:24 2019][229764.000503] Lustre: Skipped 5 previous similar messages [Wed Dec 11 22:14:23 2019][230242.409048] Lustre: fir-OST0056: haven't heard from client 0cfc0c49-f407-4 (at 10.8.25.17@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff890858777400, cur 1576131263 expire 1576131113 last 1576131036 [Wed Dec 11 22:14:23 2019][230242.429025] Lustre: Skipped 5 previous similar messages [Wed Dec 11 22:54:40 2019][232660.307500] Lustre: fir-OST0054: Connection restored to f8d5264b-1de7-5abd-fef8-60297df9f169 (at 10.8.22.20@o2ib6) [Wed Dec 11 22:54:40 2019][232660.317943] Lustre: Skipped 5 previous similar messages [Wed Dec 11 22:54:46 2019][232666.429524] Lustre: fir-OST0054: Connection restored to bd358c1a-07c6-3f9f-7c84-efdb04e29ef9 (at 10.8.21.1@o2ib6) [Wed Dec 11 22:54:46 2019][232666.439875] Lustre: Skipped 5 previous similar messages [Wed Dec 11 22:55:51 2019][232730.711982] Lustre: fir-OST0054: Connection restored to 26627d4d-9b72-83d5-02a3-73c7f9501a91 (at 10.8.22.26@o2ib6) [Wed Dec 11 22:55:51 2019][232730.722419] Lustre: Skipped 5 previous similar messages [Wed Dec 11 23:57:33 2019][236433.088011] LustreError: 67718:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005c: cli 8442b5f1-7da8-4 claims 3637248 GRANT, real grant 28672 [Thu Dec 12 00:00:44 2019][236623.856599] LustreError: 67971:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0056: cli cc645112-3584-d084-5d6b-c64af0bf19ce claims 299008 GRANT, real grant 0 [Thu Dec 12 00:10:47 2019][237227.255864] LNet: Service thread pid 67997 was inactive for 200.48s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 00:10:47 2019][237227.272892] Pid: 67997, comm: ll_ost_io00_031 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:10:47 2019][237227.283678] Call Trace: [Thu Dec 12 00:10:47 2019][237227.286234] [] bitmap_startwrite+0x1f5/0x210 [Thu Dec 12 00:10:47 2019][237227.292286] [] add_stripe_bio+0x451/0x7f0 [raid456] [Thu Dec 12 00:10:47 2019][237227.298953] [] raid5_make_request+0x1e4/0xca0 [raid456] [Thu Dec 12 00:10:47 2019][237227.305972] [] md_handle_request+0xd0/0x150 [Thu Dec 12 00:10:47 2019][237227.311936] [] md_make_request+0x79/0x190 [Thu Dec 12 00:10:47 2019][237227.317759] [] generic_make_request+0x147/0x380 [Thu Dec 12 00:10:47 2019][237227.324088] [] submit_bio+0x70/0x150 [Thu Dec 12 00:10:47 2019][237227.329487] [] osd_submit_bio+0x1c/0x60 [osd_ldiskfs] [Thu Dec 12 00:10:47 2019][237227.336326] [] osd_do_bio.isra.35+0x9b5/0xab0 [osd_ldiskfs] [Thu Dec 12 00:10:47 2019][237227.343723] [] osd_write_commit+0x3ec/0x8c0 [osd_ldiskfs] [Thu Dec 12 00:10:47 2019][237227.350903] [] ofd_commitrw_write+0xfbe/0x1d40 [ofd] [Thu Dec 12 00:10:47 2019][237227.357653] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:10:47 2019][237227.363835] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:10:47 2019][237227.370543] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:10:47 2019][237227.377579] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:10:47 2019][237227.385406] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:10:47 2019][237227.391836] [] kthread+0xd1/0xe0 [Thu Dec 12 00:10:47 2019][237227.396855] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:10:47 2019][237227.403431] [] 0xffffffffffffffff [Thu Dec 12 00:10:47 2019][237227.408592] LustreError: dumping log to /tmp/lustre-log.1576138247.67997 [Thu Dec 12 00:10:47 2019][237227.417351] Pid: 67875, comm: ll_ost_io01_019 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:10:47 2019][237227.428137] Call Trace: [Thu Dec 12 00:10:47 2019][237227.430693] [] osd_trans_stop+0x265/0x8e0 [osd_ldiskfs] [Thu Dec 12 00:10:47 
2019][237227.437702] [] ofd_trans_stop+0x25/0x60 [ofd] [Thu Dec 12 00:10:47 2019][237227.443864] [] ofd_commitrw_write+0x9d4/0x1d40 [ofd] [Thu Dec 12 00:10:47 2019][237227.450606] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:10:47 2019][237227.456750] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:10:47 2019][237227.463458] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:10:47 2019][237227.470523] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:10:47 2019][237227.478351] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:10:47 2019][237227.484804] [] kthread+0xd1/0xe0 [Thu Dec 12 00:10:47 2019][237227.489810] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:10:47 2019][237227.496411] [] 0xffffffffffffffff [Thu Dec 12 00:10:47 2019][237227.501547] Pid: 66127, comm: ll_ost_io01_000 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:10:47 2019][237227.512325] Call Trace: [Thu Dec 12 00:10:47 2019][237227.514878] [] bitmap_startwrite+0x1f5/0x210 [Thu Dec 12 00:10:47 2019][237227.520928] [] add_stripe_bio+0x451/0x7f0 [raid456] [Thu Dec 12 00:10:48 2019][237227.527599] [] raid5_make_request+0x1e4/0xca0 [raid456] [Thu Dec 12 00:10:48 2019][237227.534635] [] md_handle_request+0xd0/0x150 [Thu Dec 12 00:10:48 2019][237227.540635] [] md_make_request+0x79/0x190 [Thu Dec 12 00:10:48 2019][237227.546452] [] generic_make_request+0x147/0x380 [Thu Dec 12 00:10:48 2019][237227.552776] [] submit_bio+0x70/0x150 [Thu Dec 12 00:10:48 2019][237227.558158] [] osd_submit_bio+0x1c/0x60 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.565041] [] osd_do_bio.isra.35+0x489/0xab0 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.572395] [] osd_write_commit+0x3ec/0x8c0 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.579591] [] ofd_commitrw_write+0xfbe/0x1d40 [ofd] [Thu Dec 12 00:10:48 2019][237227.586343] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:10:48 2019][237227.592521] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.599219] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.606309] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.614163] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.620594] [] kthread+0xd1/0xe0 [Thu Dec 12 00:10:48 2019][237227.625610] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:10:48 2019][237227.632189] [] 0xffffffffffffffff [Thu Dec 12 00:10:48 2019][237227.637348] Pid: 67903, comm: ll_ost_io02_025 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:10:48 2019][237227.648139] Call Trace: [Thu Dec 12 00:10:48 2019][237227.650703] [] bitmap_startwrite+0x1f5/0x210 [Thu Dec 12 00:10:48 2019][237227.656754] [] add_stripe_bio+0x451/0x7f0 [raid456] [Thu Dec 12 00:10:48 2019][237227.663473] [] raid5_make_request+0x1e4/0xca0 [raid456] [Thu Dec 12 00:10:48 2019][237227.670548] [] md_handle_request+0xd0/0x150 [Thu Dec 12 00:10:48 2019][237227.676521] [] md_make_request+0x79/0x190 [Thu Dec 12 00:10:48 2019][237227.682317] [] generic_make_request+0x147/0x380 [Thu Dec 12 00:10:48 2019][237227.688659] [] submit_bio+0x70/0x150 [Thu Dec 12 00:10:48 2019][237227.694018] [] osd_submit_bio+0x1c/0x60 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.700923] [] osd_do_bio.isra.35+0x9b5/0xab0 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.708290] [] osd_write_commit+0x3ec/0x8c0 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.715539] [] ofd_commitrw_write+0xfbe/0x1d40 [ofd] 
[Thu Dec 12 00:10:48 2019][237227.722288] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:10:48 2019][237227.728448] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.735151] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.742187] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.750006] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.756489] [] kthread+0xd1/0xe0 [Thu Dec 12 00:10:48 2019][237227.761507] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:10:48 2019][237227.768122] [] 0xffffffffffffffff [Thu Dec 12 00:10:48 2019][237227.773261] LNet: Service thread pid 67874 was inactive for 201.01s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 00:10:48 2019][237227.790313] LNet: Skipped 3 previous similar messages [Thu Dec 12 00:10:48 2019][237227.795467] Pid: 67874, comm: ll_ost_io02_019 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:10:48 2019][237227.806299] Call Trace: [Thu Dec 12 00:10:48 2019][237227.808854] [] bitmap_startwrite+0x1f5/0x210 [Thu Dec 12 00:10:48 2019][237227.814913] [] add_stripe_bio+0x451/0x7f0 [raid456] [Thu Dec 12 00:10:48 2019][237227.821567] [] raid5_make_request+0x1e4/0xca0 [raid456] [Thu Dec 12 00:10:48 2019][237227.828571] [] md_handle_request+0xd0/0x150 [Thu Dec 12 00:10:48 2019][237227.834545] [] md_make_request+0x79/0x190 [Thu Dec 12 00:10:48 2019][237227.840363] [] generic_make_request+0x147/0x380 [Thu Dec 12 00:10:48 2019][237227.846684] [] submit_bio+0x70/0x150 [Thu Dec 12 00:10:48 2019][237227.852096] [] osd_submit_bio+0x1c/0x60 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.858932] [] osd_do_bio.isra.35+0x9b5/0xab0 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.866313] [] osd_write_commit+0x3ec/0x8c0 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.873502] [] ofd_commitrw_write+0xfbe/0x1d40 [ofd] [Thu Dec 12 00:10:48 2019][237227.880275] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:10:48 2019][237227.886422] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.893190] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.900239] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.908063] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.914493] [] kthread+0xd1/0xe0 [Thu Dec 12 00:10:48 2019][237227.919539] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:10:48 2019][237227.926108] [] 0xffffffffffffffff [Thu Dec 12 00:10:48 2019][237227.931236] LNet: Service thread pid 66126 was inactive for 201.17s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 00:10:50 2019][237229.815918] LNet: Service thread pid 68032 was inactive for 200.37s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 00:10:50 2019][237229.828881] LustreError: dumping log to /tmp/lustre-log.1576138250.68032 [Thu Dec 12 00:10:51 2019][237230.839941] LNet: Service thread pid 67718 was inactive for 200.50s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
[Thu Dec 12 00:10:51 2019][237230.852901] LNet: Skipped 3 previous similar messages [Thu Dec 12 00:10:51 2019][237230.858054] LustreError: dumping log to /tmp/lustre-log.1576138251.67718 [Thu Dec 12 00:10:52 2019][237231.863959] LustreError: dumping log to /tmp/lustre-log.1576138252.67781 [Thu Dec 12 00:10:53 2019][237232.887976] LNet: Service thread pid 68043 was inactive for 200.48s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 00:10:53 2019][237232.900923] LNet: Skipped 2 previous similar messages [Thu Dec 12 00:10:53 2019][237232.906072] LustreError: dumping log to /tmp/lustre-log.1576138253.68043 [Thu Dec 12 00:10:54 2019][237234.424012] LustreError: dumping log to /tmp/lustre-log.1576138254.67827 [Thu Dec 12 00:10:56 2019][237235.960051] LustreError: dumping log to /tmp/lustre-log.1576138256.67824 [Thu Dec 12 00:10:58 2019][237238.008081] LNet: Service thread pid 67769 was inactive for 200.14s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 00:10:58 2019][237238.021049] LNet: Skipped 4 previous similar messages [Thu Dec 12 00:10:58 2019][237238.026194] LustreError: dumping log to /tmp/lustre-log.1576138258.67769 [Thu Dec 12 00:11:00 2019][237240.056127] LustreError: dumping log to /tmp/lustre-log.1576138260.67905 [Thu Dec 12 00:11:01 2019][237240.568137] LustreError: dumping log to /tmp/lustre-log.1576138261.67944 [Thu Dec 12 00:11:02 2019][237242.104166] LustreError: dumping log to /tmp/lustre-log.1576138262.67688 [Thu Dec 12 00:11:04 2019][237244.152206] LustreError: dumping log to /tmp/lustre-log.1576138264.67885 [Thu Dec 12 00:11:05 2019][237244.664220] LustreError: dumping log to /tmp/lustre-log.1576138265.67597 [Thu Dec 12 00:11:06 2019][237245.930444] INFO: task ll_ost_io00_000:66124 blocked for more than 120 seconds. [Thu Dec 12 00:11:06 2019][237245.937848] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:06 2019][237245.945773] ll_ost_io00_000 D ffff8912e629a080 0 66124 2 0x00000080 [Thu Dec 12 00:11:06 2019][237245.952969] Call Trace: [Thu Dec 12 00:11:06 2019][237245.955566] [] schedule+0x29/0x70 [Thu Dec 12 00:11:06 2019][237245.960650] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:11:06 2019][237245.967669] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:06 2019][237245.973595] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:11:06 2019][237245.980735] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:11:06 2019][237245.987362] [] ? osd_declare_xattr_set+0xf1/0x3a0 [osd_ldiskfs] [Thu Dec 12 00:11:06 2019][237245.995045] [] ? kmem_cache_alloc+0x1c2/0x1f0 [Thu Dec 12 00:11:06 2019][237246.001159] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:11:06 2019][237246.007874] [] ? osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:06 2019][237246.015146] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:11:06 2019][237246.022649] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:06 2019][237246.029726] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:11:06 2019][237246.035932] [] ofd_object_punch+0x73d/0xd30 [ofd] [Thu Dec 12 00:11:06 2019][237246.042401] [] ofd_punch_hdl+0x493/0xa30 [ofd] [Thu Dec 12 00:11:06 2019][237246.048642] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.055670] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.063354] [] ? 
ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:06 2019][237246.070549] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.078351] [] ? ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.085236] [] ? __wake_up+0x44/0x50 [Thu Dec 12 00:11:06 2019][237246.090603] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.097021] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.104504] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:06 2019][237246.109503] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:06 2019][237246.115704] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:06 2019][237246.122256] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:06 2019][237246.128463] INFO: task ll_ost_io00_002:66126 blocked for more than 120 seconds. [Thu Dec 12 00:11:06 2019][237246.135860] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:06 2019][237246.143801] ll_ost_io00_002 D ffff8903a9ada080 0 66126 2 0x00000080 [Thu Dec 12 00:11:06 2019][237246.150990] Call Trace: [Thu Dec 12 00:11:06 2019][237246.153538] [] schedule+0x29/0x70 [Thu Dec 12 00:11:06 2019][237246.158632] [] bitmap_startwrite+0x1f5/0x210 [Thu Dec 12 00:11:06 2019][237246.164662] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:06 2019][237246.170594] [] add_stripe_bio+0x451/0x7f0 [raid456] [Thu Dec 12 00:11:06 2019][237246.177235] [] raid5_make_request+0x1e4/0xca0 [raid456] [Thu Dec 12 00:11:06 2019][237246.184241] [] ? find_get_pages+0x180/0x1d0 [Thu Dec 12 00:11:06 2019][237246.190182] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:06 2019][237246.196135] [] ? mempool_alloc_slab+0x15/0x20 [Thu Dec 12 00:11:06 2019][237246.202251] [] md_handle_request+0xd0/0x150 [Thu Dec 12 00:11:06 2019][237246.208205] [] ? generic_make_request_checks+0x2a7/0x440 [Thu Dec 12 00:11:06 2019][237246.215287] [] md_make_request+0x79/0x190 [Thu Dec 12 00:11:06 2019][237246.221066] [] generic_make_request+0x147/0x380 [Thu Dec 12 00:11:06 2019][237246.227341] [] ? md_mergeable_bvec+0x46/0x50 [Thu Dec 12 00:11:06 2019][237246.233373] [] submit_bio+0x70/0x150 [Thu Dec 12 00:11:06 2019][237246.238725] [] ? lprocfs_oh_tally+0x17/0x40 [obdclass] [Thu Dec 12 00:11:06 2019][237246.245612] [] osd_submit_bio+0x1c/0x60 [osd_ldiskfs] [Thu Dec 12 00:11:06 2019][237246.252426] [] osd_do_bio.isra.35+0x489/0xab0 [osd_ldiskfs] [Thu Dec 12 00:11:06 2019][237246.259757] [] ? __find_get_page+0x1e/0xa0 [Thu Dec 12 00:11:06 2019][237246.265603] [] osd_write_commit+0x3ec/0x8c0 [osd_ldiskfs] [Thu Dec 12 00:11:06 2019][237246.272769] [] ? osd_trans_start+0x235/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:06 2019][237246.280027] [] ofd_commitrw_write+0xfbe/0x1d40 [ofd] [Thu Dec 12 00:11:06 2019][237246.286754] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:11:06 2019][237246.292923] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.299576] [] ? lustre_msg_buf+0x17/0x60 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.306182] [] ? __req_capsule_get+0x163/0x740 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.313180] [] ? class_handle2object+0xb9/0x1c0 [obdclass] [Thu Dec 12 00:11:06 2019][237246.320427] [] ? update_curr+0x14c/0x1e0 [Thu Dec 12 00:11:06 2019][237246.326092] [] ? account_entity_dequeue+0xae/0xd0 [Thu Dec 12 00:11:06 2019][237246.332569] [] ? 
target_send_reply_msg+0x170/0x170 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.339919] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.346937] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.354605] [] ? ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:06 2019][237246.361824] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.369605] [] ? ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.376489] [] ? wake_up_state+0x20/0x20 [Thu Dec 12 00:11:06 2019][237246.382187] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.388599] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.396088] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:06 2019][237246.401092] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:06 2019][237246.407284] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:06 2019][237246.413835] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:06 2019][237246.420038] INFO: task ll_ost_io01_000:66127 blocked for more than 120 seconds. [Thu Dec 12 00:11:06 2019][237246.427467] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:06 2019][237246.435405] ll_ost_io01_000 D ffff8912f2f12080 0 66127 2 0x00000080 [Thu Dec 12 00:11:06 2019][237246.442596] Call Trace: [Thu Dec 12 00:11:06 2019][237246.445142] [] schedule+0x29/0x70 [Thu Dec 12 00:11:06 2019][237246.450243] [] bitmap_startwrite+0x1f5/0x210 [Thu Dec 12 00:11:06 2019][237246.456264] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:06 2019][237246.462193] [] add_stripe_bio+0x451/0x7f0 [raid456] [Thu Dec 12 00:11:06 2019][237246.468826] [] raid5_make_request+0x1e4/0xca0 [raid456] [Thu Dec 12 00:11:06 2019][237246.475794] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:06 2019][237246.481740] [] ? mempool_alloc_slab+0x15/0x20 [Thu Dec 12 00:11:06 2019][237246.487842] [] md_handle_request+0xd0/0x150 [Thu Dec 12 00:11:06 2019][237246.493819] [] ? generic_make_request_checks+0x2a7/0x440 [Thu Dec 12 00:11:06 2019][237246.500868] [] md_make_request+0x79/0x190 [Thu Dec 12 00:11:06 2019][237246.506640] [] generic_make_request+0x147/0x380 [Thu Dec 12 00:11:06 2019][237246.512913] [] ? md_mergeable_bvec+0x46/0x50 [Thu Dec 12 00:11:06 2019][237246.518927] [] submit_bio+0x70/0x150 [Thu Dec 12 00:11:06 2019][237246.524274] [] ? lprocfs_oh_tally+0x17/0x40 [obdclass] [Thu Dec 12 00:11:06 2019][237246.531183] [] osd_submit_bio+0x1c/0x60 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237246.537985] [] osd_do_bio.isra.35+0x489/0xab0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237246.545318] [] ? __find_get_page+0x1e/0xa0 [Thu Dec 12 00:11:07 2019][237246.551159] [] osd_write_commit+0x3ec/0x8c0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237246.558307] [] ? osd_trans_start+0x235/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237246.565536] [] ofd_commitrw_write+0xfbe/0x1d40 [ofd] [Thu Dec 12 00:11:07 2019][237246.572276] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:11:07 2019][237246.578433] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.585121] [] ? lustre_msg_buf+0x17/0x60 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.591685] [] ? __req_capsule_get+0x163/0x740 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.598707] [] ? class_handle2object+0xb9/0x1c0 [obdclass] [Thu Dec 12 00:11:07 2019][237246.605937] [] ? update_curr+0x14c/0x1e0 [Thu Dec 12 00:11:07 2019][237246.611602] [] ? 
account_entity_dequeue+0xae/0xd0 [Thu Dec 12 00:11:07 2019][237246.618095] [] ? target_send_reply_msg+0x170/0x170 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.625443] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.632471] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.640132] [] ? ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:07 2019][237246.647314] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.655089] [] ? ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.661982] [] ? __wake_up+0x44/0x50 [Thu Dec 12 00:11:07 2019][237246.667343] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.673762] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.681285] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:07 2019][237246.686274] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237246.692473] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:07 2019][237246.699022] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237246.705209] INFO: task ll_ost_io03_002:66137 blocked for more than 120 seconds. [Thu Dec 12 00:11:07 2019][237246.712622] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:07 2019][237246.720553] ll_ost_io03_002 D ffff89125d170000 0 66137 2 0x00000080 [Thu Dec 12 00:11:07 2019][237246.727754] Call Trace: [Thu Dec 12 00:11:07 2019][237246.730299] [] schedule+0x29/0x70 [Thu Dec 12 00:11:07 2019][237246.735378] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:11:07 2019][237246.742396] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:07 2019][237246.748326] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:11:07 2019][237246.755505] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:11:07 2019][237246.762128] [] ? osd_declare_xattr_set+0xf1/0x3a0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237246.769803] [] ? kmem_cache_alloc+0x1c2/0x1f0 [Thu Dec 12 00:11:07 2019][237246.775906] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:11:07 2019][237246.782647] [] ? osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237246.789882] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:11:07 2019][237246.797411] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237246.804477] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:11:07 2019][237246.810721] [] ofd_commitrw_write+0xa31/0x1d40 [ofd] [Thu Dec 12 00:11:07 2019][237246.817431] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:11:07 2019][237246.823617] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.830265] [] ? lustre_msg_buf+0x17/0x60 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.836853] [] ? __req_capsule_get+0x163/0x740 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.843861] [] ? class_handle2object+0xb9/0x1c0 [obdclass] [Thu Dec 12 00:11:07 2019][237246.851092] [] ? update_curr+0x14c/0x1e0 [Thu Dec 12 00:11:07 2019][237246.856771] [] ? mutex_lock+0x12/0x2f [Thu Dec 12 00:11:07 2019][237246.862216] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.869251] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.876919] [] ? ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:07 2019][237246.884102] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.891882] [] ? 
ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.898797] [] ? __wake_up+0x44/0x50 [Thu Dec 12 00:11:07 2019][237246.904158] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.910586] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.918075] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:07 2019][237246.923079] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237246.929271] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:07 2019][237246.935851] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237246.942039] INFO: task jbd2/md2-8:66168 blocked for more than 120 seconds. [Thu Dec 12 00:11:07 2019][237246.949004] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:07 2019][237246.956928] jbd2/md2-8 D ffff8912f5c68000 0 66168 2 0x00000080 [Thu Dec 12 00:11:07 2019][237246.964142] Call Trace: [Thu Dec 12 00:11:07 2019][237246.966690] [] schedule+0x29/0x70 [Thu Dec 12 00:11:07 2019][237246.971790] [] jbd2_journal_commit_transaction+0x23c/0x19b0 [jbd2] [Thu Dec 12 00:11:07 2019][237246.979729] [] ? dequeue_task_fair+0x41e/0x660 [Thu Dec 12 00:11:07 2019][237246.985943] [] ? __switch_to+0xce/0x580 [Thu Dec 12 00:11:07 2019][237246.991525] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:07 2019][237246.997454] [] ? __schedule+0x42a/0x860 [Thu Dec 12 00:11:07 2019][237247.003050] [] ? try_to_del_timer_sync+0x5e/0x90 [Thu Dec 12 00:11:07 2019][237247.009427] [] kjournald2+0xc9/0x260 [jbd2] [Thu Dec 12 00:11:07 2019][237247.015366] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:07 2019][237247.021307] [] ? commit_timeout+0x10/0x10 [jbd2] [Thu Dec 12 00:11:07 2019][237247.027681] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:07 2019][237247.032649] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237247.038853] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:07 2019][237247.045381] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237247.051588] INFO: task ll_ost02_005:66899 blocked for more than 120 seconds. [Thu Dec 12 00:11:07 2019][237247.058743] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:07 2019][237247.066672] ll_ost02_005 D ffff8912c7616180 0 66899 2 0x00000080 [Thu Dec 12 00:11:07 2019][237247.073875] Call Trace: [Thu Dec 12 00:11:07 2019][237247.076430] [] schedule+0x29/0x70 [Thu Dec 12 00:11:07 2019][237247.081491] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:11:07 2019][237247.088472] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:07 2019][237247.094404] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:11:07 2019][237247.101577] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:11:07 2019][237247.108210] [] ? osd_declare_write+0x350/0x490 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237247.115646] [] ? kmem_cache_alloc+0x1c2/0x1f0 [Thu Dec 12 00:11:07 2019][237247.121746] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:11:07 2019][237247.128487] [] ? 
osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237247.135725] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:11:07 2019][237247.143248] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237247.150305] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:11:07 2019][237247.156495] [] ofd_attr_set+0x464/0xb60 [ofd] [Thu Dec 12 00:11:07 2019][237247.162595] [] ofd_setattr_hdl+0x31d/0x8e0 [ofd] [Thu Dec 12 00:11:07 2019][237247.169036] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.176040] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.183720] [] ? ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:07 2019][237247.190916] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.198730] [] ? ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.205626] [] ? __wake_up+0x44/0x50 [Thu Dec 12 00:11:07 2019][237247.210993] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.217392] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.224271] LNet: Service thread pid 67794 was inactive for 200.41s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 00:11:07 2019][237247.224273] LNet: Skipped 11 previous similar messages [Thu Dec 12 00:11:07 2019][237247.224275] LustreError: dumping log to /tmp/lustre-log.1576138267.67794 [Thu Dec 12 00:11:07 2019][237247.249902] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:07 2019][237247.254877] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237247.261095] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:07 2019][237247.267641] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237247.273844] INFO: task ll_ost_io00_003:67590 blocked for more than 120 seconds. [Thu Dec 12 00:11:07 2019][237247.281255] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:07 2019][237247.289197] ll_ost_io00_003 D ffff8903a99c8000 0 67590 2 0x00000080 [Thu Dec 12 00:11:07 2019][237247.296406] Call Trace: [Thu Dec 12 00:11:07 2019][237247.298950] [] schedule+0x29/0x70 [Thu Dec 12 00:11:07 2019][237247.304052] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:11:07 2019][237247.311037] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:07 2019][237247.316992] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:11:07 2019][237247.324151] [] ? crypto_mod_get+0x19/0x40 [Thu Dec 12 00:11:07 2019][237247.329921] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:11:07 2019][237247.336548] [] ? osd_declare_qid+0x200/0x4a0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237247.343793] [] ? kmem_cache_alloc+0x1c2/0x1f0 [Thu Dec 12 00:11:07 2019][237247.349907] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:11:07 2019][237247.356635] [] ? osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237247.363885] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:11:07 2019][237247.371405] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237247.378481] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:11:07 2019][237247.385205] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:11:07 2019][237247.391384] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.398040] [] ? lustre_msg_buf+0x17/0x60 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.404643] [] ? 
__req_capsule_get+0x163/0x740 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.411612] [] ? __enqueue_entity+0x78/0x80 [Thu Dec 12 00:11:07 2019][237247.417609] [] ? target_send_reply_msg+0x170/0x170 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.424957] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.431972] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.439650] [] ? ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:07 2019][237247.446832] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.454624] [] ? ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.461529] [] ? __wake_up+0x44/0x50 [Thu Dec 12 00:11:07 2019][237247.466896] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.473323] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.480864] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:07 2019][237247.485861] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237247.492049] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:07 2019][237247.498591] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237247.504796] INFO: task ll_ost_io01_003:67594 blocked for more than 120 seconds. [Thu Dec 12 00:11:07 2019][237247.512214] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:08 2019][237247.520135] ll_ost_io01_003 D ffff8912f1b12080 0 67594 2 0x00000080 [Thu Dec 12 00:11:08 2019][237247.527326] Call Trace: [Thu Dec 12 00:11:08 2019][237247.529870] [] schedule+0x29/0x70 [Thu Dec 12 00:11:08 2019][237247.534962] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:11:08 2019][237247.541940] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:08 2019][237247.547888] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:11:08 2019][237247.555045] [] ? crypto_mod_get+0x19/0x40 [Thu Dec 12 00:11:08 2019][237247.560821] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:11:08 2019][237247.567468] [] ? osd_declare_qid+0x200/0x4a0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237247.574710] [] ? kmem_cache_alloc+0x1c2/0x1f0 [Thu Dec 12 00:11:08 2019][237247.580829] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:11:08 2019][237247.587557] [] ? osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237247.594794] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:11:08 2019][237247.602325] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237247.609385] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:11:08 2019][237247.616127] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:11:08 2019][237247.622290] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.628940] [] ? lustre_msg_buf+0x17/0x60 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.635523] [] ? __req_capsule_get+0x163/0x740 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.642489] [] ? __enqueue_entity+0x78/0x80 [Thu Dec 12 00:11:08 2019][237247.648457] [] ? target_send_reply_msg+0x170/0x170 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.655804] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.662857] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.670521] [] ? ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:08 2019][237247.677729] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.685502] [] ? 
ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.692381] [] ? __wake_up+0x44/0x50 [Thu Dec 12 00:11:08 2019][237247.697762] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.704154] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.711655] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:08 2019][237247.716644] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:08 2019][237247.722848] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:08 2019][237247.729411] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:08 2019][237247.735620] INFO: task ll_ost_io01_004:67597 blocked for more than 120 seconds. [Thu Dec 12 00:11:08 2019][237247.736285] LustreError: dumping log to /tmp/lustre-log.1576138268.67917 [Thu Dec 12 00:11:08 2019][237247.749831] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:08 2019][237247.757760] ll_ost_io01_004 D ffff8912f4bd9040 0 67597 2 0x00000080 [Thu Dec 12 00:11:08 2019][237247.764963] Call Trace: [Thu Dec 12 00:11:08 2019][237247.767520] [] schedule+0x29/0x70 [Thu Dec 12 00:11:08 2019][237247.772586] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:11:08 2019][237247.779603] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:08 2019][237247.785536] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:11:08 2019][237247.792692] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:11:08 2019][237247.799318] [] ? osd_declare_xattr_set+0xf1/0x3a0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237247.807018] [] ? kmem_cache_alloc+0x1c2/0x1f0 [Thu Dec 12 00:11:08 2019][237247.813123] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:11:08 2019][237247.819871] [] ? osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237247.827108] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:11:08 2019][237247.834623] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237247.841698] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:11:08 2019][237247.847884] [] ofd_commitrw_write+0xa31/0x1d40 [ofd] [Thu Dec 12 00:11:08 2019][237247.854608] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:11:08 2019][237247.860778] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.867430] [] ? lustre_msg_buf+0x17/0x60 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.873991] [] ? __req_capsule_get+0x163/0x740 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.880973] [] ? mutex_lock+0x12/0x2f [Thu Dec 12 00:11:08 2019][237247.886432] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.893448] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.901127] [] ? ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:08 2019][237247.908295] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.916082] [] ? ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.922993] [] ? __wake_up+0x44/0x50 [Thu Dec 12 00:11:08 2019][237247.928341] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.934743] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.942231] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:08 2019][237247.947218] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:08 2019][237247.953433] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:08 2019][237247.959961] [] ? 
insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:08 2019][237247.966184] INFO: task ll_ost03_027:67684 blocked for more than 120 seconds. [Thu Dec 12 00:11:08 2019][237247.973326] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:08 2019][237247.981279] ll_ost03_027 D ffff8922cb341040 0 67684 2 0x00000080 [Thu Dec 12 00:11:08 2019][237247.988473] Call Trace: [Thu Dec 12 00:11:08 2019][237247.991025] [] ? fid_is_on_ost+0x3f4/0x420 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237247.998109] [] schedule+0x29/0x70 [Thu Dec 12 00:11:08 2019][237248.003179] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:11:08 2019][237248.010159] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:08 2019][237248.016098] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:11:08 2019][237248.023269] [] ? lprocfs_counter_add+0xf9/0x160 [obdclass] [Thu Dec 12 00:11:08 2019][237248.030529] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:11:08 2019][237248.037185] [] ? osd_declare_qid+0x200/0x4a0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237248.044415] [] ? kmem_cache_alloc+0x1c2/0x1f0 [Thu Dec 12 00:11:08 2019][237248.050545] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:11:08 2019][237248.057273] [] ? osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237248.064524] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:11:08 2019][237248.072039] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237248.079108] [] ofd_precreate_objects+0xa57/0x1d80 [ofd] [Thu Dec 12 00:11:08 2019][237248.086096] [] ofd_create_hdl+0x474/0x20e0 [ofd] [Thu Dec 12 00:11:08 2019][237248.092515] [] ? lustre_pack_reply_v2+0x135/0x290 [ptlrpc] [Thu Dec 12 00:11:08 2019][237248.099797] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:08 2019][237248.106809] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:08 2019][237248.114494] [] ? ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:08 2019][237248.121697] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:08 2019][237248.129494] [] ? ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:08 2019][237248.136392] [] ? __wake_up+0x44/0x50 [Thu Dec 12 00:11:08 2019][237248.141753] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:08 2019][237248.148156] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:08 2019][237248.155646] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:08 2019][237248.160645] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:08 2019][237248.166833] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:08 2019][237248.173395] [] ? 
insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:09 2019][237249.272305] LustreError: dumping log to /tmp/lustre-log.1576138269.67928 [Thu Dec 12 00:11:10 2019][237249.784318] LustreError: dumping log to /tmp/lustre-log.1576138270.68019 [Thu Dec 12 00:11:11 2019][237251.320348] LustreError: dumping log to /tmp/lustre-log.1576138271.68044 [Thu Dec 12 00:11:14 2019][237254.392416] LustreError: dumping log to /tmp/lustre-log.1576138274.67962 [Thu Dec 12 00:11:15 2019][237255.416430] LustreError: dumping log to /tmp/lustre-log.1576138275.68025 [Thu Dec 12 00:11:16 2019][237256.440464] LustreError: dumping log to /tmp/lustre-log.1576138276.67720 [Thu Dec 12 00:11:17 2019][237257.464471] LustreError: dumping log to /tmp/lustre-log.1576138277.68037 [Thu Dec 12 00:11:18 2019][237257.976484] LustreError: dumping log to /tmp/lustre-log.1576138278.68000 [Thu Dec 12 00:11:24 2019][237263.608604] LNet: Service thread pid 67732 was inactive for 200.10s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 00:11:24 2019][237263.621573] LNet: Skipped 20 previous similar messages [Thu Dec 12 00:11:24 2019][237263.626807] LustreError: dumping log to /tmp/lustre-log.1576138284.67732 [Thu Dec 12 00:11:28 2019][237267.704683] LustreError: dumping log to /tmp/lustre-log.1576138288.67971 [Thu Dec 12 00:11:32 2019][237272.312775] LustreError: dumping log to /tmp/lustre-log.1576138292.68046 [Thu Dec 12 00:11:34 2019][237273.848807] LustreError: dumping log to /tmp/lustre-log.1576138294.67728 [Thu Dec 12 00:11:35 2019][237274.872830] LustreError: dumping log to /tmp/lustre-log.1576138295.68018 [Thu Dec 12 00:11:36 2019][237275.896850] LustreError: dumping log to /tmp/lustre-log.1576138296.67959 [Thu Dec 12 00:11:39 2019][237278.968904] LustreError: dumping log to /tmp/lustre-log.1576138299.112549 [Thu Dec 12 00:11:44 2019][237284.089008] LustreError: dumping log to /tmp/lustre-log.1576138304.68001 [Thu Dec 12 00:11:46 2019][237285.625037] LustreError: dumping log to /tmp/lustre-log.1576138306.66124 [Thu Dec 12 00:11:47 2019][237286.649056] LustreError: dumping log to /tmp/lustre-log.1576138307.67880 [Thu Dec 12 00:11:50 2019][237290.233133] LustreError: dumping log to /tmp/lustre-log.1576138310.67770 [Thu Dec 12 00:11:59 2019][237299.449316] LNet: Service thread pid 67594 was inactive for 200.23s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
[Thu Dec 12 00:11:59 2019][237299.462289] LNet: Skipped 17 previous similar messages [Thu Dec 12 00:11:59 2019][237299.467526] LustreError: dumping log to /tmp/lustre-log.1576138319.67594 [Thu Dec 12 00:12:00 2019][237300.473336] LustreError: dumping log to /tmp/lustre-log.1576138320.67713 [Thu Dec 12 00:12:03 2019][237303.033394] LustreError: dumping log to /tmp/lustre-log.1576138323.67951 [Thu Dec 12 00:12:06 2019][237305.593441] LustreError: dumping log to /tmp/lustre-log.1576138326.67912 [Thu Dec 12 00:12:17 2019][237316.857665] LustreError: dumping log to /tmp/lustre-log.1576138337.67722 [Thu Dec 12 00:12:18 2019][237317.881688] LustreError: dumping log to /tmp/lustre-log.1576138338.67976 [Thu Dec 12 00:12:19 2019][237318.905707] LustreError: dumping log to /tmp/lustre-log.1576138339.67887 [Thu Dec 12 00:12:24 2019][237323.513803] LustreError: dumping log to /tmp/lustre-log.1576138343.66899 [Thu Dec 12 00:12:26 2019][237326.073852] LustreError: dumping log to /tmp/lustre-log.1576138346.87151 [Thu Dec 12 00:12:27 2019][237327.097865] LustreError: dumping log to /tmp/lustre-log.1576138347.67983 [Thu Dec 12 00:12:28 2019][237327.609877] LustreError: dumping log to /tmp/lustre-log.1576138348.67684 [Thu Dec 12 00:12:30 2019][237330.169932] LustreError: dumping log to /tmp/lustre-log.1576138350.67715 [Thu Dec 12 00:12:31 2019][237331.193955] LustreError: dumping log to /tmp/lustre-log.1576138351.67590 [Thu Dec 12 00:12:36 2019][237335.802048] LustreError: dumping log to /tmp/lustre-log.1576138356.67945 [Thu Dec 12 00:12:38 2019][237338.362100] LustreError: dumping log to /tmp/lustre-log.1576138358.68035 [Thu Dec 12 00:12:39 2019][237339.386117] LustreError: dumping log to /tmp/lustre-log.1576138359.68013 [Thu Dec 12 00:12:40 2019][237339.898126] LustreError: dumping log to /tmp/lustre-log.1576138360.67859 [Thu Dec 12 00:12:42 2019][237341.699169] LustreError: 67688:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576138062, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0056_UUID lock: ffff89018ca4ee40/0x7066c9c1908b2888 lrc: 3/0,1 mode: --/PW res: [0x1880000401:0x11129f:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67688 timeout: 0 lvb_type: 0 [Thu Dec 12 00:12:42 2019][237341.742968] LustreError: dumping log to /tmp/lustre-log.1576138362.67688 [Thu Dec 12 00:12:53 2019][237353.210396] LustreError: dumping log to /tmp/lustre-log.1576138373.67618 [Thu Dec 12 00:13:04 2019][237363.962609] LNet: Service thread pid 67892 was inactive for 236.60s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
[Thu Dec 12 00:13:04 2019][237363.975560] LNet: Skipped 21 previous similar messages [Thu Dec 12 00:13:04 2019][237363.980831] LustreError: dumping log to /tmp/lustre-log.1576138384.67892 [Thu Dec 12 00:13:05 2019][237364.986631] LustreError: dumping log to /tmp/lustre-log.1576138385.67904 [Thu Dec 12 00:13:18 2019][237378.279904] LustreError: 112549:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576138098, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0056_UUID lock: ffff8920d77c7740/0x7066c9c1908c0acd lrc: 3/0,1 mode: --/PW res: [0x1880000402:0x2efb73:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112549 timeout: 0 lvb_type: 0 [Thu Dec 12 00:13:18 2019][237378.323825] LustreError: 112549:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 1 previous similar message [Thu Dec 12 00:13:35 2019][237394.683221] LustreError: dumping log to /tmp/lustre-log.1576138415.67940 [Thu Dec 12 00:13:37 2019][237396.731277] LustreError: dumping log to /tmp/lustre-log.1576138417.67754 [Thu Dec 12 00:13:42 2019][237401.851365] LustreError: dumping log to /tmp/lustre-log.1576138422.67970 [Thu Dec 12 00:13:44 2019][237403.899424] LustreError: dumping log to /tmp/lustre-log.1576138424.67894 [Thu Dec 12 00:13:46 2019][237405.947447] LustreError: dumping log to /tmp/lustre-log.1576138426.67982 [Thu Dec 12 00:13:47 2019][237406.971470] LustreError: dumping log to /tmp/lustre-log.1576138427.66134 [Thu Dec 12 00:13:49 2019][237409.019513] LustreError: dumping log to /tmp/lustre-log.1576138429.68042 [Thu Dec 12 00:13:53 2019][237413.115598] LustreError: dumping log to /tmp/lustre-log.1576138433.68031 [Thu Dec 12 00:14:17 2019][237436.668067] LustreError: dumping log to /tmp/lustre-log.1576138457.67989 [Thu Dec 12 00:14:21 2019][237440.744152] LustreError: 67618:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576138161, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0056_UUID lock: ffff88f482c39f80/0x7066c9c1908d9f29 lrc: 3/0,1 mode: --/PW res: [0x1880000401:0x1112a1:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67618 timeout: 0 lvb_type: 0 [Thu Dec 12 00:14:48 2019][237468.412705] LustreError: dumping log to /tmp/lustre-log.1576138488.68023 [Thu Dec 12 00:14:58 2019][237477.628890] LustreError: dumping log to /tmp/lustre-log.1576138498.67963 [Thu Dec 12 00:15:08 2019][237487.869097] LustreError: dumping log to /tmp/lustre-log.1576138508.113352 [Thu Dec 12 00:15:16 2019][237496.061263] LNet: Service thread pid 112519 was inactive for 313.06s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
[Thu Dec 12 00:15:16 2019][237496.074297] LNet: Skipped 20 previous similar messages [Thu Dec 12 00:15:16 2019][237496.079557] LustreError: dumping log to /tmp/lustre-log.1576138516.112519 [Thu Dec 12 00:15:35 2019][237514.549421] Lustre: fir-OST0054: Connection restored to 687b1eea-b865-b791-9de5-a67096eac725 (at 10.8.23.26@o2ib6) [Thu Dec 12 00:15:35 2019][237514.559886] Lustre: Skipped 2 previous similar messages [Thu Dec 12 00:15:48 2019][237527.781354] Lustre: fir-OST0054: Connection restored to ca09bd61-a4b3-111c-b997-9c7823236764 (at 10.8.22.17@o2ib6) [Thu Dec 12 00:15:48 2019][237527.791798] Lustre: Skipped 4 previous similar messages [Thu Dec 12 00:15:50 2019][237530.102940] Lustre: fir-OST0054: Connection restored to 00850750-7463-78da-94ee-623be2781c44 (at 10.8.22.22@o2ib6) [Thu Dec 12 00:15:50 2019][237530.113398] Lustre: Skipped 4 previous similar messages [Thu Dec 12 00:16:01 2019][237541.118188] LNet: Service thread pid 67986 was inactive for 362.65s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 00:16:01 2019][237541.135213] Pid: 67986, comm: ll_ost_io01_039 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:16:01 2019][237541.145992] Call Trace: [Thu Dec 12 00:16:01 2019][237541.148566] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:16:01 2019][237541.155561] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:16:01 2019][237541.162728] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:16:01 2019][237541.169368] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:16:01 2019][237541.176102] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:16:01 2019][237541.183617] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:16:01 2019][237541.190700] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:16:01 2019][237541.197433] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:16:01 2019][237541.203561] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:16:01 2019][237541.210260] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:16:01 2019][237541.217289] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:16:01 2019][237541.225106] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:16:01 2019][237541.231519] [] kthread+0xd1/0xe0 [Thu Dec 12 00:16:01 2019][237541.236535] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:16:01 2019][237541.243098] [] 0xffffffffffffffff [Thu Dec 12 00:16:01 2019][237541.248197] LustreError: dumping log to /tmp/lustre-log.1576138561.67986 [Thu Dec 12 00:16:02 2019][237542.142185] Pid: 67591, comm: ll_ost_io02_003 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:16:02 2019][237542.152975] Call Trace: [Thu Dec 12 00:16:02 2019][237542.155545] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:16:02 2019][237542.162566] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:16:02 2019][237542.169736] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:16:02 2019][237542.176415] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:16:02 2019][237542.183151] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:16:02 2019][237542.190720] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:16:02 2019][237542.197808] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:16:02 2019][237542.204580] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:16:02 
2019][237542.210728] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:16:02 2019][237542.217443] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:16:02 2019][237542.224476] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:16:02 2019][237542.232306] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:16:02 2019][237542.238722] [] kthread+0xd1/0xe0 [Thu Dec 12 00:16:02 2019][237542.243738] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:16:02 2019][237542.250318] [] 0xffffffffffffffff [Thu Dec 12 00:16:02 2019][237542.255444] LustreError: dumping log to /tmp/lustre-log.1576138562.67591 [Thu Dec 12 00:16:07 2019][237546.903790] Lustre: fir-OST0054: Connection restored to a507eb44-8ff1-13e2-fab8-30d1823663f8 (at 10.8.22.24@o2ib6) [Thu Dec 12 00:16:07 2019][237546.914227] Lustre: Skipped 4 previous similar messages [Thu Dec 12 00:16:09 2019][237549.311321] LNet: Service thread pid 67746 was inactive for 362.01s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 00:16:09 2019][237549.328363] LNet: Skipped 1 previous similar message [Thu Dec 12 00:16:09 2019][237549.333426] Pid: 67746, comm: ll_ost02_036 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:16:09 2019][237549.343970] Call Trace: [Thu Dec 12 00:16:09 2019][237549.346539] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:16:09 2019][237549.353560] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:16:09 2019][237549.360748] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:16:09 2019][237549.367413] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:16:09 2019][237549.374169] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:16:09 2019][237549.381704] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:16:09 2019][237549.388816] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:16:09 2019][237549.395050] [] ofd_attr_set+0x464/0xb60 [ofd] [Thu Dec 12 00:16:09 2019][237549.401208] [] ofd_setattr_hdl+0x31d/0x8e0 [ofd] [Thu Dec 12 00:16:09 2019][237549.407622] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:16:09 2019][237549.414743] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:16:09 2019][237549.422570] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:16:09 2019][237549.429009] [] kthread+0xd1/0xe0 [Thu Dec 12 00:16:09 2019][237549.434011] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:16:09 2019][237549.440573] [] 0xffffffffffffffff [Thu Dec 12 00:16:09 2019][237549.445687] LustreError: dumping log to /tmp/lustre-log.1576138569.67746 [Thu Dec 12 00:16:13 2019][237553.406410] LNet: Service thread pid 67937 was inactive for 362.21s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 00:16:13 2019][237553.423451] Pid: 67937, comm: ll_ost_io03_025 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:16:13 2019][237553.434231] Call Trace: [Thu Dec 12 00:16:13 2019][237553.436808] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:16:13 2019][237553.443803] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:16:13 2019][237553.450988] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:16:13 2019][237553.457639] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:16:13 2019][237553.464381] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:16:13 2019][237553.471910] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:16:13 2019][237553.479017] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:16:13 2019][237553.485759] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:16:13 2019][237553.491940] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:16:13 2019][237553.498659] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:16:13 2019][237553.505703] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:16:13 2019][237553.513505] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:16:13 2019][237553.519944] [] kthread+0xd1/0xe0 [Thu Dec 12 00:16:13 2019][237553.524967] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:16:13 2019][237553.531529] [] 0xffffffffffffffff [Thu Dec 12 00:16:13 2019][237553.536641] LustreError: dumping log to /tmp/lustre-log.1576138573.67937 [Thu Dec 12 00:16:15 2019][237555.454451] Pid: 67964, comm: ll_ost_io02_043 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:16:15 2019][237555.465236] Call Trace: [Thu Dec 12 00:16:15 2019][237555.467815] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:16:15 2019][237555.474837] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:16:15 2019][237555.482026] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:16:15 2019][237555.488692] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:16:15 2019][237555.495444] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:16:15 2019][237555.502984] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:16:15 2019][237555.510090] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:16:15 2019][237555.516869] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:16:15 2019][237555.523014] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:16:15 2019][237555.529756] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:16:15 2019][237555.536842] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:16:16 2019][237555.544687] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:16:16 2019][237555.551136] [] kthread+0xd1/0xe0 [Thu Dec 12 00:16:16 2019][237555.556185] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:16:16 2019][237555.562777] [] 0xffffffffffffffff [Thu Dec 12 00:16:16 2019][237555.567895] LustreError: dumping log to /tmp/lustre-log.1576138576.67964 [Thu Dec 12 00:16:18 2019][237557.502496] LustreError: dumping log to /tmp/lustre-log.1576138577.67973 [Thu Dec 12 00:16:19 2019][237558.526510] LustreError: dumping log to /tmp/lustre-log.1576138578.67960 [Thu Dec 12 00:16:22 2019][237561.598572] LustreError: dumping log to /tmp/lustre-log.1576138582.68049 [Thu Dec 12 00:16:24 2019][237563.646619] LustreError: dumping log to /tmp/lustre-log.1576138584.113613 [Thu Dec 12 00:16:25 
2019][237565.308284] Lustre: fir-OST0056: Export ffff89036eda0800 already connecting from 10.8.23.26@o2ib6 [Thu Dec 12 00:16:38 2019][237578.501849] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 00:16:41 2019][237580.894384] Lustre: fir-OST0056: Export ffff88e42919e800 already connecting from 10.8.22.22@o2ib6 [Thu Dec 12 00:16:50 2019][237590.271148] LustreError: dumping log to /tmp/lustre-log.1576138610.68026 [Thu Dec 12 00:16:58 2019][237597.632231] Lustre: fir-OST0056: Export ffff8902f7fafc00 already connecting from 10.8.22.24@o2ib6 [Thu Dec 12 00:17:16 2019][237615.485143] Lustre: fir-OST0056: Export ffff89036eda0800 already connecting from 10.8.23.26@o2ib6 [Thu Dec 12 00:17:19 2019][237618.943722] LustreError: dumping log to /tmp/lustre-log.1576138639.68050 [Thu Dec 12 00:17:21 2019][237620.991766] LustreError: dumping log to /tmp/lustre-log.1576138641.67955 [Thu Dec 12 00:17:22 2019][237621.801792] Lustre: 67968:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:17:22 2019][237621.801792] req@ffff890f1b8c4850 x1652591481752896/t0(0) o4->0f4b0f7a-80c1-4@10.9.110.62@o2ib4:647/0 lens 1352/664 e 24 to 0 dl 1576138647 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:17:22 2019][237622.509804] Lustre: 68038:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:17:22 2019][237622.509804] req@ffff88e4f04ec050 x1652591481755520/t0(0) o4->0f4b0f7a-80c1-4@10.9.110.62@o2ib4:647/0 lens 1352/664 e 24 to 0 dl 1576138647 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:17:22 2019][237622.536983] Lustre: 68038:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 3 previous similar messages [Thu Dec 12 00:17:24 2019][237623.807832] Lustre: 67953:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:17:24 2019][237623.807832] req@ffff890712200050 x1648846345542000/t0(0) o4->75b6516e-d912-63bd-698a-8f68fc05bdf0@10.9.110.15@o2ib4:649/0 lens 488/448 e 24 to 0 dl 1576138649 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:17:24 2019][237623.836762] Lustre: 67953:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1 previous similar message [Thu Dec 12 00:17:25 2019][237625.087842] LustreError: dumping log to /tmp/lustre-log.1576138645.67994 [Thu Dec 12 00:17:27 2019][237627.203908] Lustre: 67877:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:17:27 2019][237627.203908] req@ffff89224f575050 x1649530969658688/t0(0) o4->1c192c26-6a2d-8fff-8f45-c6fac242e547@10.9.104.15@o2ib4:652/0 lens 488/448 e 24 to 0 dl 1576138652 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:17:27 2019][237627.232835] Lustre: 67877:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 4 previous similar messages [Thu Dec 12 00:17:28 2019][237628.030913] Lustre: fir-OST0056: Client 0f4b0f7a-80c1-4 (at 10.9.110.62@o2ib4) reconnecting [Thu Dec 12 00:17:28 2019][237628.039357] Lustre: Skipped 5 previous similar messages [Thu Dec 12 00:17:28 2019][237628.044707] Lustre: fir-OST0056: Connection restored to d849fafe-3a33-7fd6-08c1-09a87a8abd8b (at 10.9.110.62@o2ib4) [Thu Dec 12 00:17:29 2019][237628.678722] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 00:17:29 2019][237629.183925] LustreError: dumping log to /tmp/lustre-log.1576138649.67768 [Thu Dec 12 00:17:31 
2019][237631.231975] LustreError: dumping log to /tmp/lustre-log.1576138651.67995 [Thu Dec 12 00:17:33 2019][237633.050024] Lustre: 68022:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:17:33 2019][237633.050024] req@ffff8902fd4f3850 x1648532634593856/t0(0) o4->a5082367-d733-7058-3cc5-0eedec6c0c1c@10.8.30.16@o2ib6:658/0 lens 2488/448 e 24 to 0 dl 1576138658 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:17:33 2019][237633.078952] Lustre: 68022:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 6 previous similar messages [Thu Dec 12 00:17:33 2019][237633.280013] LustreError: dumping log to /tmp/lustre-log.1576138653.67952 [Thu Dec 12 00:17:34 2019][237634.304028] LustreError: dumping log to /tmp/lustre-log.1576138654.68005 [Thu Dec 12 00:17:35 2019][237635.328055] LustreError: dumping log to /tmp/lustre-log.1576138655.67897 [Thu Dec 12 00:17:37 2019][237637.376092] LustreError: dumping log to /tmp/lustre-log.1576138657.68039 [Thu Dec 12 00:17:38 2019][237638.085392] Lustre: fir-OST0056: Connection restored to (at 10.9.108.56@o2ib4) [Thu Dec 12 00:17:38 2019][237638.092791] Lustre: Skipped 9 previous similar messages [Thu Dec 12 00:17:39 2019][237639.424133] LustreError: dumping log to /tmp/lustre-log.1576138659.68006 [Thu Dec 12 00:17:42 2019][237642.232218] Lustre: 66135:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:17:42 2019][237642.232218] req@ffff891a4f63b050 x1649046890263936/t0(0) o10->7126efc2-9676-1db9-94d0-ae09c1520697@10.9.101.26@o2ib4:667/0 lens 440/432 e 17 to 0 dl 1576138667 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:17:42 2019][237642.261241] Lustre: 66135:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 10 previous similar messages [Thu Dec 12 00:17:43 2019][237642.629898] Lustre: fir-OST0056: deleting orphan objects from 0x1880000401:1118894 to 0x1880000401:1118913 [Thu Dec 12 00:17:46 2019][237645.568265] LustreError: dumping log to /tmp/lustre-log.1576138666.112531 [Thu Dec 12 00:17:48 2019][237647.809088] Lustre: fir-OST0056: Export ffff8902f7fafc00 already connecting from 10.8.22.24@o2ib6 [Thu Dec 12 00:17:48 2019][237647.818055] Lustre: Skipped 1 previous similar message [Thu Dec 12 00:17:55 2019][237654.608801] Lustre: fir-OST0056: Connection restored to 8d232f07-b6ab-bc70-4dd8-277e82f65db5 (at 10.9.107.58@o2ib4) [Thu Dec 12 00:17:55 2019][237654.619327] Lustre: Skipped 9 previous similar messages [Thu Dec 12 00:18:00 2019][237659.904550] LustreError: dumping log to /tmp/lustre-log.1576138680.67598 [Thu Dec 12 00:18:02 2019][237661.888598] Lustre: 67966:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:18:02 2019][237661.888598] req@ffff8903a9aa0850 x1649656820965840/t0(0) o4->d5b9405e-1c60-945f-2d9d-6a877d61380f@10.8.30.30@o2ib6:687/0 lens 1720/448 e 12 to 0 dl 1576138687 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:18:02 2019][237661.917553] Lustre: 67966:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 19 previous similar messages [Thu Dec 12 00:18:02 2019][237661.952589] LustreError: dumping log to /tmp/lustre-log.1576138682.68014 [Thu Dec 12 00:18:21 2019][237681.247793] Lustre: fir-OST0056: Export ffff88e42919e800 already connecting from 10.8.22.22@o2ib6 [Thu Dec 12 00:18:21 2019][237681.256757] Lustre: Skipped 2 previous similar messages [Thu Dec 12 00:18:27 2019][237687.211349] Lustre: fir-OST0056: Connection restored to 
ec8d663e-70c3-0c7c-9511-dfaaba3f32c1 (at 10.9.104.45@o2ib4) [Thu Dec 12 00:18:27 2019][237687.221878] Lustre: Skipped 7 previous similar messages [Thu Dec 12 00:18:34 2019][237693.697223] LustreError: dumping log to /tmp/lustre-log.1576138714.67884 [Thu Dec 12 00:18:34 2019][237694.529253] Lustre: 26939:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:18:35 2019][237694.529253] req@ffff8902569da050 x1649291658560656/t0(0) o4->89038d66-847b-1ff4-ff67-a551d70b6de8@10.9.110.70@o2ib4:719/0 lens 488/448 e 8 to 0 dl 1576138719 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:18:35 2019][237694.558093] Lustre: 26939:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 11 previous similar messages [Thu Dec 12 00:18:37 2019][237696.573443] Lustre: fir-OST0056: Client 72ec26e6-8490-9625-4bfc-aa584f79f189 (at 10.9.102.25@o2ib4) reconnecting [Thu Dec 12 00:18:37 2019][237696.583710] Lustre: Skipped 29 previous similar messages [Thu Dec 12 00:18:38 2019][237697.793305] LustreError: dumping log to /tmp/lustre-log.1576138718.67730 [Thu Dec 12 00:18:41 2019][237700.865378] LustreError: dumping log to /tmp/lustre-log.1576138721.67820 [Thu Dec 12 00:18:43 2019][237702.913411] LustreError: dumping log to /tmp/lustre-log.1576138723.67907 [Thu Dec 12 00:18:47 2019][237707.009489] LustreError: dumping log to /tmp/lustre-log.1576138727.68003 [Thu Dec 12 00:18:51 2019][237711.105570] LustreError: dumping log to /tmp/lustre-log.1576138731.67906 [Thu Dec 12 00:19:21 2019][237740.560319] Lustre: fir-OST0056: haven't heard from client 7e5bcac9-70c5-4 (at ) in 227 seconds. I think it's dead, and I am evicting it. exp ffff89036eda0800, cur 1576138761 expire 1576138611 last 1576138534 [Thu Dec 12 00:19:21 2019][237740.579755] Lustre: Skipped 5 previous similar messages [Thu Dec 12 00:19:28 2019][237748.163230] Lustre: fir-OST0056: Export ffff8902f7fafc00 already connecting from 10.8.22.24@o2ib6 [Thu Dec 12 00:19:28 2019][237748.172193] Lustre: Skipped 4 previous similar messages [Thu Dec 12 00:19:31 2019][237751.479667] Lustre: fir-OST0056: Connection restored to 860089bf-2de2-f0b4-c239-a266e1c756b4 (at 10.9.102.54@o2ib4) [Thu Dec 12 00:19:31 2019][237751.490187] Lustre: Skipped 19 previous similar messages [Thu Dec 12 00:19:40 2019][237760.386572] Lustre: 27017:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:19:40 2019][237760.386572] req@ffff89079a151850 x1648858224189808/t0(0) o4->8e4fe161-7440-1bc3-60cf-ef16452a7501@10.9.105.43@o2ib4:30/0 lens 6576/448 e 4 to 0 dl 1576138785 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:19:40 2019][237760.415403] Lustre: 27017:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 37 previous similar messages [Thu Dec 12 00:19:56 2019][237775.706527] Lustre: fir-OST0056: deleting orphan objects from 0x0:27479877 to 0x0:27479905 [Thu Dec 12 00:19:57 2019][237776.642884] LNet: Service thread pid 67991 was inactive for 512.09s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
[Thu Dec 12 00:19:57 2019][237776.655854] LNet: Skipped 42 previous similar messages [Thu Dec 12 00:19:57 2019][237776.661088] LustreError: dumping log to /tmp/lustre-log.1576138797.67991 [Thu Dec 12 00:20:07 2019][237786.883094] LustreError: dumping log to /tmp/lustre-log.1576138807.67753 [Thu Dec 12 00:20:09 2019][237788.931149] LustreError: dumping log to /tmp/lustre-log.1576138809.67822 [Thu Dec 12 00:20:29 2019][237809.411550] LustreError: dumping log to /tmp/lustre-log.1576138829.68008 [Thu Dec 12 00:20:31 2019][237811.459584] LustreError: dumping log to /tmp/lustre-log.1576138831.67931 [Thu Dec 12 00:20:46 2019][237826.287005] Lustre: fir-OST0056: Client 7520ece1-a22b-161c-9a9c-7f1c99e6d5c6 (at 10.9.108.37@o2ib4) reconnecting [Thu Dec 12 00:20:46 2019][237826.297268] Lustre: Skipped 29 previous similar messages [Thu Dec 12 00:21:21 2019][237860.612574] LNet: Service thread pid 68004 was inactive for 563.65s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 00:21:21 2019][237860.629597] LNet: Skipped 1 previous similar message [Thu Dec 12 00:21:21 2019][237860.634660] Pid: 68004, comm: ll_ost_io00_034 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:21:21 2019][237860.645456] Call Trace: [Thu Dec 12 00:21:21 2019][237860.648027] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:21:21 2019][237860.655041] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:21:21 2019][237860.662208] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:21:21 2019][237860.668886] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:21:21 2019][237860.675626] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:21:21 2019][237860.683185] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:21:21 2019][237860.690278] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:21:21 2019][237860.697014] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:21:21 2019][237860.703141] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:21:21 2019][237860.709830] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:21:21 2019][237860.716876] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:21:21 2019][237860.724678] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:21:21 2019][237860.731107] [] kthread+0xd1/0xe0 [Thu Dec 12 00:21:21 2019][237860.736109] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:21:21 2019][237860.742703] [] 0xffffffffffffffff [Thu Dec 12 00:21:21 2019][237860.747812] LustreError: dumping log to /tmp/lustre-log.1576138881.68004 [Thu Dec 12 00:21:31 2019][237870.852778] Pid: 67777, comm: ll_ost01_049 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:21:31 2019][237870.863330] Call Trace: [Thu Dec 12 00:21:31 2019][237870.865907] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:21:31 2019][237870.872901] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:21:31 2019][237870.880114] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:21:31 2019][237870.886779] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:21:31 2019][237870.893527] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:21:31 2019][237870.901052] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:21:31 2019][237870.908193] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:21:31 2019][237870.914407] [] ofd_destroy+0x5d0/0x960 [ofd] [Thu Dec 12 
00:21:31 2019][237870.920445] [] ofd_destroy_by_fid+0x1f4/0x4a0 [ofd] [Thu Dec 12 00:21:31 2019][237870.927094] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 00:21:31 2019][237870.933480] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:21:31 2019][237870.940528] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:21:31 2019][237870.948336] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:21:31 2019][237870.954762] [] kthread+0xd1/0xe0 [Thu Dec 12 00:21:31 2019][237870.959769] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:21:31 2019][237870.966329] [] 0xffffffffffffffff [Thu Dec 12 00:21:31 2019][237870.971437] LustreError: dumping log to /tmp/lustre-log.1576138891.67777 [Thu Dec 12 00:21:31 2019][237870.978842] Pid: 67725, comm: ll_ost01_041 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:21:31 2019][237870.989398] Call Trace: [Thu Dec 12 00:21:31 2019][237870.991974] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:21:31 2019][237870.998970] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:21:31 2019][237871.006135] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:21:31 2019][237871.012783] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:21:31 2019][237871.019533] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:21:31 2019][237871.027059] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:21:31 2019][237871.034147] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:21:31 2019][237871.040378] [] ofd_destroy+0x5d0/0x960 [ofd] [Thu Dec 12 00:21:31 2019][237871.046436] [] ofd_destroy_by_fid+0x1f4/0x4a0 [ofd] [Thu Dec 12 00:21:31 2019][237871.053092] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 00:21:31 2019][237871.059482] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:21:31 2019][237871.066575] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:21:31 2019][237871.074388] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:21:31 2019][237871.080810] [] kthread+0xd1/0xe0 [Thu Dec 12 00:21:31 2019][237871.085811] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:21:31 2019][237871.092398] [] 0xffffffffffffffff [Thu Dec 12 00:21:40 2019][237879.563699] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 00:21:40 2019][237879.572669] Lustre: Skipped 8 previous similar messages [Thu Dec 12 00:21:52 2019][237891.845210] Lustre: 27074:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:21:52 2019][237891.845210] req@ffff88f2fbd54050 x1650958282745904/t0(0) o4->0c302cf4-1147-d945-dfa2-e9bc796b3175@10.9.101.32@o2ib4:162/0 lens 7904/448 e 3 to 0 dl 1576138917 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:21:52 2019][237891.874134] Lustre: 27074:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 52 previous similar messages [Thu Dec 12 00:22:07 2019][237907.467083] Lustre: fir-OST0056: Connection restored to b4c9913c-f59e-b8ac-70a9-c2d8d6c39257 (at 10.9.101.34@o2ib4) [Thu Dec 12 00:22:07 2019][237907.477608] Lustre: Skipped 23 previous similar messages [Thu Dec 12 00:22:18 2019][237917.957722] LNet: Service thread pid 68041 was inactive for 612.09s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 00:22:18 2019][237917.974748] LNet: Skipped 2 previous similar messages [Thu Dec 12 00:22:18 2019][237917.979913] Pid: 68041, comm: ll_ost_io00_053 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:22:18 2019][237917.990726] Call Trace: [Thu Dec 12 00:22:18 2019][237917.993304] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:22:18 2019][237918.000304] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:22:18 2019][237918.007520] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:22:18 2019][237918.014166] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:22:18 2019][237918.020918] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:22:18 2019][237918.028459] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:22:18 2019][237918.035563] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:22:18 2019][237918.042325] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:22:18 2019][237918.048452] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:22:18 2019][237918.055153] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:22:18 2019][237918.062197] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:22:18 2019][237918.070008] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:22:18 2019][237918.076475] [] kthread+0xd1/0xe0 [Thu Dec 12 00:22:18 2019][237918.081484] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:22:18 2019][237918.088079] [] 0xffffffffffffffff [Thu Dec 12 00:22:18 2019][237918.093185] LustreError: dumping log to /tmp/lustre-log.1576138938.68041 [Thu Dec 12 00:22:20 2019][237920.005760] Pid: 67941, comm: ll_ost_io01_029 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:22:20 2019][237920.016543] Call Trace: [Thu Dec 12 00:22:20 2019][237920.019118] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:22:20 2019][237920.026145] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:22:20 2019][237920.033333] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:22:20 2019][237920.040001] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:22:20 2019][237920.046772] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:22:20 2019][237920.054298] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:22:20 2019][237920.061400] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:22:20 2019][237920.067637] [] ofd_object_punch+0x73d/0xd30 [ofd] [Thu Dec 12 00:22:20 2019][237920.074111] [] ofd_punch_hdl+0x493/0xa30 [ofd] [Thu Dec 12 00:22:20 2019][237920.080340] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:22:20 2019][237920.087405] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:22:20 2019][237920.095229] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:22:20 2019][237920.101681] [] kthread+0xd1/0xe0 [Thu Dec 12 00:22:20 2019][237920.106680] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:22:20 2019][237920.113267] [] 0xffffffffffffffff [Thu Dec 12 00:22:20 2019][237920.118406] LustreError: dumping log to /tmp/lustre-log.1576138940.67941 [Thu Dec 12 00:22:28 2019][237928.197929] LustreError: dumping log to /tmp/lustre-log.1576138948.67750 [Thu Dec 12 00:23:48 2019][238008.071530] LustreError: dumping log to /tmp/lustre-log.1576139028.112488 [Thu Dec 12 00:23:52 2019][238011.731862] Lustre: fir-OST0056: deleting orphan objects from 0x1880000400:11777327 to 0x1880000400:11777377 [Thu Dec 12 
00:23:54 2019][238014.215651] LustreError: dumping log to /tmp/lustre-log.1576139034.112522 [Thu Dec 12 00:24:49 2019][238069.512760] LustreError: dumping log to /tmp/lustre-log.1576139089.67896 [Thu Dec 12 00:24:56 2019][238075.656890] LustreError: dumping log to /tmp/lustre-log.1576139096.67930 [Thu Dec 12 00:25:07 2019][238087.489627] Lustre: fir-OST0056: Client dffc1cc0-26ab-9b78-f3a0-8d9b8d410b62 (at 10.9.108.46@o2ib4) reconnecting [Thu Dec 12 00:25:07 2019][238087.499887] Lustre: Skipped 17 previous similar messages [Thu Dec 12 00:26:02 2019][238142.343776] Lustre: fir-OST0056: Export ffff891d99820800 already connecting from 10.8.23.26@o2ib6 [Thu Dec 12 00:26:02 2019][238142.352760] Lustre: Skipped 26 previous similar messages [Thu Dec 12 00:26:14 2019][238153.610462] Lustre: 27127:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:26:14 2019][238153.610462] req@ffff88f2acc5c050 x1650958603467184/t0(0) o4->717fa73e-8071-a76f-931e-8957a8ca32aa@10.9.101.41@o2ib4:424/0 lens 2056/448 e 2 to 0 dl 1576139179 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:26:14 2019][238153.639401] Lustre: 27127:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 21 previous similar messages [Thu Dec 12 00:26:15 2019][238155.530496] LustreError: dumping log to /tmp/lustre-log.1576139175.67974 [Thu Dec 12 00:26:20 2019][238159.626567] LustreError: dumping log to /tmp/lustre-log.1576139180.67949 [Thu Dec 12 00:26:34 2019][238173.612853] Lustre: fir-OST0056: Connection restored to bb7d080c-8ae8-f7ed-5d33-d34ca54d93de (at 10.9.108.19@o2ib4) [Thu Dec 12 00:26:34 2019][238173.623374] Lustre: Skipped 16 previous similar messages [Thu Dec 12 00:26:38 2019][238178.058942] LNet: Service thread pid 67807 was inactive for 763.33s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 00:26:38 2019][238178.075968] LNet: Skipped 1 previous similar message [Thu Dec 12 00:26:38 2019][238178.081030] Pid: 67807, comm: ll_ost01_056 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:26:38 2019][238178.091578] Call Trace: [Thu Dec 12 00:26:38 2019][238178.094150] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:26:38 2019][238178.101149] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:26:38 2019][238178.108346] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:26:38 2019][238178.114996] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:26:38 2019][238178.121761] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:26:38 2019][238178.129290] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:26:38 2019][238178.136394] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:26:38 2019][238178.142611] [] ofd_destroy+0x5d0/0x960 [ofd] [Thu Dec 12 00:26:38 2019][238178.148665] [] ofd_destroy_by_fid+0x1f4/0x4a0 [ofd] [Thu Dec 12 00:26:38 2019][238178.155314] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 00:26:38 2019][238178.161718] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:26:38 2019][238178.168767] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:26:38 2019][238178.176590] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:26:38 2019][238178.183006] [] kthread+0xd1/0xe0 [Thu Dec 12 00:26:38 2019][238178.188023] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:26:38 2019][238178.194586] [] 0xffffffffffffffff [Thu Dec 12 00:26:38 2019][238178.199699] LustreError: dumping log to /tmp/lustre-log.1576139198.67807 [Thu Dec 12 00:27:21 2019][238221.067805] Pid: 87152, comm: ll_ost_io01_067 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:27:21 2019][238221.078590] Call Trace: [Thu Dec 12 00:27:21 2019][238221.081166] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:27:21 2019][238221.088162] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:27:21 2019][238221.095345] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:27:21 2019][238221.101992] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:27:21 2019][238221.108728] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:27:21 2019][238221.116254] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:27:21 2019][238221.123365] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:27:21 2019][238221.129581] [] ofd_commitrw_write+0xa31/0x1d40 [ofd] [Thu Dec 12 00:27:21 2019][238221.136316] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:27:21 2019][238221.142458] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:27:21 2019][238221.149174] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:27:21 2019][238221.156206] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:27:21 2019][238221.164013] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:27:21 2019][238221.170453] [] kthread+0xd1/0xe0 [Thu Dec 12 00:27:21 2019][238221.175456] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:27:21 2019][238221.182057] [] 0xffffffffffffffff [Thu Dec 12 00:27:21 2019][238221.187166] LustreError: dumping log to /tmp/lustre-log.1576139241.87152 [Thu Dec 12 00:27:23 2019][238223.115840] Pid: 67714, comm: ll_ost_io03_004 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:27:23 2019][238223.126625] Call Trace: [Thu Dec 
12 00:27:23 2019][238223.129215] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:27:23 2019][238223.136213] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:27:23 2019][238223.143412] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:27:23 2019][238223.150064] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:27:23 2019][238223.156814] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:27:23 2019][238223.164354] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:27:23 2019][238223.171463] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:27:23 2019][238223.178227] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:27:23 2019][238223.184378] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:27:23 2019][238223.191099] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:27:23 2019][238223.198145] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:27:23 2019][238223.205958] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:27:23 2019][238223.212387] [] kthread+0xd1/0xe0 [Thu Dec 12 00:27:23 2019][238223.217389] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:27:23 2019][238223.224000] [] 0xffffffffffffffff [Thu Dec 12 00:27:23 2019][238223.229100] LustreError: dumping log to /tmp/lustre-log.1576139243.67714 [Thu Dec 12 00:27:25 2019][238225.163886] Pid: 67657, comm: ll_ost02_023 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:27:25 2019][238225.174436] Call Trace: [Thu Dec 12 00:27:25 2019][238225.177012] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:27:25 2019][238225.184019] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:27:25 2019][238225.191282] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:27:25 2019][238225.197939] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:27:25 2019][238225.204725] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:27:25 2019][238225.212282] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:27:25 2019][238225.219407] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:27:25 2019][238225.225636] [] ofd_destroy+0x5d0/0x960 [ofd] [Thu Dec 12 00:27:25 2019][238225.231696] [] ofd_destroy_by_fid+0x1f4/0x4a0 [ofd] [Thu Dec 12 00:27:25 2019][238225.238360] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 00:27:25 2019][238225.244768] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:27:25 2019][238225.251844] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:27:25 2019][238225.259685] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:27:25 2019][238225.266158] [] kthread+0xd1/0xe0 [Thu Dec 12 00:27:25 2019][238225.271178] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:27:25 2019][238225.277742] [] 0xffffffffffffffff [Thu Dec 12 00:27:25 2019][238225.282870] LustreError: dumping log to /tmp/lustre-log.1576139245.67657 [Thu Dec 12 00:29:53 2019][238372.622843] LNet: Service thread pid 67943 was inactive for 912.97s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 00:29:53 2019][238372.639862] LNet: Skipped 3 previous similar messages [Thu Dec 12 00:29:53 2019][238372.645009] Pid: 67943, comm: ll_ost_io01_030 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:29:53 2019][238372.655808] Call Trace: [Thu Dec 12 00:29:53 2019][238372.658383] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:29:53 2019][238372.665382] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:29:53 2019][238372.672563] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:29:53 2019][238372.679214] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:29:53 2019][238372.685964] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:29:53 2019][238372.693489] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:29:53 2019][238372.700596] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:29:53 2019][238372.706810] [] ofd_object_punch+0x73d/0xd30 [ofd] [Thu Dec 12 00:29:53 2019][238372.713299] [] ofd_punch_hdl+0x493/0xa30 [ofd] [Thu Dec 12 00:29:53 2019][238372.719514] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:29:53 2019][238372.726575] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:29:53 2019][238372.734379] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:29:53 2019][238372.740806] [] kthread+0xd1/0xe0 [Thu Dec 12 00:29:53 2019][238372.745809] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:29:53 2019][238372.752386] [] 0xffffffffffffffff [Thu Dec 12 00:29:53 2019][238372.757485] LustreError: dumping log to /tmp/lustre-log.1576139393.67943 [Thu Dec 12 00:31:10 2019][238450.448399] LNet: Service thread pid 67762 was inactive for 964.37s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 00:31:10 2019][238450.461346] LNet: Skipped 13 previous similar messages [Thu Dec 12 00:31:10 2019][238450.466583] LustreError: dumping log to /tmp/lustre-log.1576139470.67762 [Thu Dec 12 00:31:19 2019][238458.640557] LustreError: dumping log to /tmp/lustre-log.1576139479.67745 [Thu Dec 12 00:32:20 2019][238520.081793] LNet: Service thread pid 66128 was inactive for 1015.91s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 00:32:20 2019][238520.098899] Pid: 66128, comm: ll_ost_io01_001 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:32:20 2019][238520.109680] Call Trace: [Thu Dec 12 00:32:20 2019][238520.112254] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:32:20 2019][238520.119251] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:32:20 2019][238520.126436] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:32:20 2019][238520.133085] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:32:20 2019][238520.139835] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:32:20 2019][238520.147359] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:32:20 2019][238520.154463] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:32:20 2019][238520.160680] [] ofd_commitrw_write+0xa31/0x1d40 [ofd] [Thu Dec 12 00:32:20 2019][238520.167431] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:32:20 2019][238520.173560] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:32:20 2019][238520.180275] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:32:20 2019][238520.187319] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:32:20 2019][238520.195137] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:32:20 2019][238520.201552] [] kthread+0xd1/0xe0 [Thu Dec 12 00:32:20 2019][238520.206553] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:32:20 2019][238520.213122] [] 0xffffffffffffffff [Thu Dec 12 00:32:20 2019][238520.218222] LustreError: dumping log to /tmp/lustre-log.1576139540.66128 [Thu Dec 12 00:32:28 2019][238528.273957] Pid: 112533, comm: ll_ost02_080 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:32:28 2019][238528.284564] Call Trace: [Thu Dec 12 00:32:28 2019][238528.287134] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:32:28 2019][238528.294131] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:32:28 2019][238528.301329] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:32:28 2019][238528.307979] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:32:28 2019][238528.314729] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:32:28 2019][238528.322255] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:32:28 2019][238528.329349] [] tgt_client_data_update+0x303/0x5e0 [ptlrpc] [Thu Dec 12 00:32:28 2019][238528.336660] [] tgt_client_new+0x41b/0x610 [ptlrpc] [Thu Dec 12 00:32:28 2019][238528.343270] [] ofd_obd_connect+0x3a3/0x4c0 [ofd] [Thu Dec 12 00:32:28 2019][238528.349658] [] target_handle_connect+0xecb/0x2b10 [ptlrpc] [Thu Dec 12 00:32:28 2019][238528.356953] [] tgt_request_handle+0x50a/0x1580 [ptlrpc] [Thu Dec 12 00:32:28 2019][238528.363985] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:32:28 2019][238528.371814] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:32:28 2019][238528.378225] [] kthread+0xd1/0xe0 [Thu Dec 12 00:32:28 2019][238528.383241] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:32:28 2019][238528.389805] [] 0xffffffffffffffff [Thu Dec 12 00:32:28 2019][238528.394917] LustreError: dumping log to /tmp/lustre-log.1576139548.112533 [Thu Dec 12 00:32:40 2019][238540.562197] Pid: 67868, comm: ll_ost01_068 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:32:40 2019][238540.572718] Call Trace: [Thu Dec 12 00:32:41 2019][238540.575287] [] 
wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:32:41 2019][238540.582297] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:32:41 2019][238540.589492] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:32:41 2019][238540.596161] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:32:41 2019][238540.602912] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:32:41 2019][238540.610438] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:32:41 2019][238540.617541] [] tgt_client_data_update+0x303/0x5e0 [ptlrpc] [Thu Dec 12 00:32:41 2019][238540.624850] [] tgt_client_new+0x41b/0x610 [ptlrpc] [Thu Dec 12 00:32:41 2019][238540.631461] [] ofd_obd_connect+0x3a3/0x4c0 [ofd] [Thu Dec 12 00:32:41 2019][238540.637849] [] target_handle_connect+0xecb/0x2b10 [ptlrpc] [Thu Dec 12 00:32:41 2019][238540.645143] [] tgt_request_handle+0x50a/0x1580 [ptlrpc] [Thu Dec 12 00:32:41 2019][238540.652178] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:32:41 2019][238540.660000] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:32:41 2019][238540.666434] [] kthread+0xd1/0xe0 [Thu Dec 12 00:32:41 2019][238540.671448] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:32:41 2019][238540.678013] [] 0xffffffffffffffff [Thu Dec 12 00:32:41 2019][238540.683124] LustreError: dumping log to /tmp/lustre-log.1576139561.67868 [Thu Dec 12 00:32:45 2019][238544.658296] Pid: 67671, comm: ll_ost00_024 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:32:45 2019][238544.668816] Call Trace: [Thu Dec 12 00:32:45 2019][238544.671388] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:32:45 2019][238544.678391] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:32:45 2019][238544.685574] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:32:45 2019][238544.692225] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:32:45 2019][238544.698987] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:32:45 2019][238544.706532] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:32:45 2019][238544.713636] [] tgt_client_data_update+0x303/0x5e0 [ptlrpc] [Thu Dec 12 00:32:45 2019][238544.720941] [] tgt_client_new+0x41b/0x610 [ptlrpc] [Thu Dec 12 00:32:45 2019][238544.727567] [] ofd_obd_connect+0x3a3/0x4c0 [ofd] [Thu Dec 12 00:32:45 2019][238544.733956] [] target_handle_connect+0xecb/0x2b10 [ptlrpc] [Thu Dec 12 00:32:45 2019][238544.741259] [] tgt_request_handle+0x50a/0x1580 [ptlrpc] [Thu Dec 12 00:32:45 2019][238544.748283] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:32:45 2019][238544.756097] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:32:45 2019][238544.762514] [] kthread+0xd1/0xe0 [Thu Dec 12 00:32:45 2019][238544.767543] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:32:45 2019][238544.774117] [] 0xffffffffffffffff [Thu Dec 12 00:32:45 2019][238544.779231] LustreError: dumping log to /tmp/lustre-log.1576139565.67671 [Thu Dec 12 00:33:05 2019][238565.138725] Pid: 26830, comm: ll_ost_io03_048 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:33:05 2019][238565.149503] Call Trace: [Thu Dec 12 00:33:05 2019][238565.152072] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:33:05 2019][238565.159070] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:33:05 2019][238565.166256] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:33:05 2019][238565.172919] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] 
[Thu Dec 12 00:33:05 2019][238565.179669] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:33:05 2019][238565.187195] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:33:05 2019][238565.194299] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:33:05 2019][238565.200515] [] ofd_object_punch+0x73d/0xd30 [ofd] [Thu Dec 12 00:33:05 2019][238565.207004] [] ofd_punch_hdl+0x493/0xa30 [ofd] [Thu Dec 12 00:33:05 2019][238565.213221] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:33:05 2019][238565.220281] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:33:05 2019][238565.228085] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:33:05 2019][238565.234513] [] kthread+0xd1/0xe0 [Thu Dec 12 00:33:05 2019][238565.239520] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:33:05 2019][238565.246118] [] 0xffffffffffffffff [Thu Dec 12 00:33:05 2019][238565.251220] LustreError: dumping log to /tmp/lustre-log.1576139585.26830 [Thu Dec 12 00:33:50 2019][238610.195609] LustreError: dumping log to /tmp/lustre-log.1576139630.112514 [Thu Dec 12 00:33:51 2019][238610.850728] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 00:33:51 2019][238610.860907] Lustre: Skipped 92 previous similar messages [Thu Dec 12 00:34:46 2019][238666.472729] Lustre: 112543:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/-150), not sending early reply [Thu Dec 12 00:34:46 2019][238666.472729] req@ffff88f2af871850 x1652161335594432/t0(0) o19->ae1d0080-04fa-5436-e145-ffdf0db9990d@10.0.10.3@o2ib7:181/0 lens 336/336 e 0 to 0 dl 1576139691 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:34:47 2019][238666.501801] Lustre: 112543:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 192 previous similar messages [Thu Dec 12 00:34:56 2019][238675.732910] LustreError: dumping log to /tmp/lustre-log.1576139696.68007 [Thu Dec 12 00:35:00 2019][238679.828995] LustreError: dumping log to /tmp/lustre-log.1576139700.67771 [Thu Dec 12 00:35:02 2019][238682.395663] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 00:35:02 2019][238682.404631] Lustre: Skipped 52 previous similar messages [Thu Dec 12 00:35:04 2019][238683.925084] LustreError: dumping log to /tmp/lustre-log.1576139704.67958 [Thu Dec 12 00:35:09 2019][238688.549482] Lustre: fir-OST0056: Connection restored to dffc1cc0-26ab-9b78-f3a0-8d9b8d410b62 (at 10.9.108.46@o2ib4) [Thu Dec 12 00:35:09 2019][238688.560020] Lustre: Skipped 91 previous similar messages [Thu Dec 12 00:36:05 2019][238745.366307] LustreError: dumping log to /tmp/lustre-log.1576139765.67936 [Thu Dec 12 00:36:09 2019][238749.462393] LustreError: dumping log to /tmp/lustre-log.1576139769.67813 [Thu Dec 12 00:36:14 2019][238753.558481] LustreError: dumping log to /tmp/lustre-log.1576139773.68021 [Thu Dec 12 00:36:18 2019][238757.654552] LustreError: dumping log to /tmp/lustre-log.1576139778.67946 [Thu Dec 12 00:36:22 2019][238761.750633] LustreError: dumping log to /tmp/lustre-log.1576139782.67662 [Thu Dec 12 00:36:26 2019][238765.846715] LustreError: dumping log to /tmp/lustre-log.1576139786.112537 [Thu Dec 12 00:37:07 2019][238806.807531] LustreError: dumping log to /tmp/lustre-log.1576139827.67957 [Thu Dec 12 00:37:31 2019][238831.384032] LNet: Service thread pid 68010 was inactive for 1201.76s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 00:37:31 2019][238831.401160] LNet: Skipped 4 previous similar messages [Thu Dec 12 00:37:31 2019][238831.406327] Pid: 68010, comm: ll_ost_io00_036 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:37:31 2019][238831.417115] Call Trace: [Thu Dec 12 00:37:31 2019][238831.419670] [] call_rwsem_down_read_failed+0x18/0x30 [Thu Dec 12 00:37:31 2019][238831.426409] [] osd_read_lock+0x5c/0xe0 [osd_ldiskfs] [Thu Dec 12 00:37:31 2019][238831.433181] [] ofd_preprw_write.isra.31+0xd3/0xea0 [ofd] [Thu Dec 12 00:37:31 2019][238831.440274] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:37:31 2019][238831.446328] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:37:31 2019][238831.452947] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:37:31 2019][238831.459990] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:37:31 2019][238831.467792] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:37:31 2019][238831.474222] [] kthread+0xd1/0xe0 [Thu Dec 12 00:37:31 2019][238831.479224] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:37:31 2019][238831.485800] [] 0xffffffffffffffff [Thu Dec 12 00:37:31 2019][238831.490910] LustreError: dumping log to /tmp/lustre-log.1576139851.68010 [Thu Dec 12 00:37:31 2019][238831.498312] Pid: 67948, comm: ll_ost_io03_026 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:37:31 2019][238831.509122] Call Trace: [Thu Dec 12 00:37:31 2019][238831.511670] [] __lock_page+0x74/0x90 [Thu Dec 12 00:37:31 2019][238831.517017] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:37:31 2019][238831.522805] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:37:31 2019][238831.528841] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:37:31 2019][238831.535684] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:37:31 2019][238831.542856] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:37:31 2019][238831.548910] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:37:31 2019][238831.555499] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:37:31 2019][238831.562550] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.570346] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.576776] [] kthread+0xd1/0xe0 [Thu Dec 12 00:37:32 2019][238831.581769] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:37:32 2019][238831.588337] [] 0xffffffffffffffff [Thu Dec 12 00:37:32 2019][238831.593430] Pid: 68051, comm: ll_ost_io01_063 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:37:32 2019][238831.604226] Call Trace: [Thu Dec 12 00:37:32 2019][238831.606770] [] __lock_page+0x74/0x90 [Thu Dec 12 00:37:32 2019][238831.612113] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:37:32 2019][238831.617900] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:37:32 2019][238831.623950] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:37:32 2019][238831.630785] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:37:32 2019][238831.637957] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:37:32 2019][238831.644016] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.650602] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.657639] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.665438] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:37:32 
2019][238831.671867] [] kthread+0xd1/0xe0 [Thu Dec 12 00:37:32 2019][238831.676863] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:37:32 2019][238831.683430] [] 0xffffffffffffffff [Thu Dec 12 00:37:32 2019][238831.688529] Pid: 26875, comm: ll_ost_io03_050 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:37:32 2019][238831.699317] Call Trace: [Thu Dec 12 00:37:32 2019][238831.701860] [] call_rwsem_down_read_failed+0x18/0x30 [Thu Dec 12 00:37:32 2019][238831.708584] [] osd_read_lock+0x5c/0xe0 [osd_ldiskfs] [Thu Dec 12 00:37:32 2019][238831.715335] [] ofd_preprw_write.isra.31+0xd3/0xea0 [ofd] [Thu Dec 12 00:37:32 2019][238831.722415] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:37:32 2019][238831.728474] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.735060] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.742095] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.749898] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.756328] [] kthread+0xd1/0xe0 [Thu Dec 12 00:37:32 2019][238831.761321] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:37:32 2019][238831.767887] [] 0xffffffffffffffff [Thu Dec 12 00:37:32 2019][238831.772981] Pid: 67980, comm: ll_ost_io00_027 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:37:32 2019][238831.783777] Call Trace: [Thu Dec 12 00:37:32 2019][238831.786323] [] __lock_page+0x74/0x90 [Thu Dec 12 00:37:32 2019][238831.791664] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:37:32 2019][238831.797452] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:37:32 2019][238831.803485] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:37:32 2019][238831.810322] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:37:32 2019][238831.817507] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:37:32 2019][238831.823563] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.830153] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.837187] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.844990] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.851418] [] kthread+0xd1/0xe0 [Thu Dec 12 00:37:32 2019][238831.856414] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:37:32 2019][238831.862967] [] 0xffffffffffffffff [Thu Dec 12 00:37:35 2019][238835.480122] LustreError: dumping log to /tmp/lustre-log.1576139855.26895 [Thu Dec 12 00:37:40 2019][238839.576203] LustreError: dumping log to /tmp/lustre-log.1576139859.26898 [Thu Dec 12 00:37:44 2019][238843.672276] LustreError: dumping log to /tmp/lustre-log.1576139864.67947 [Thu Dec 12 00:37:48 2019][238847.768367] LustreError: dumping log to /tmp/lustre-log.1576139868.26831 [Thu Dec 12 00:37:52 2019][238851.864439] LustreError: dumping log to /tmp/lustre-log.1576139872.68047 [Thu Dec 12 00:37:56 2019][238855.960522] LustreError: dumping log to /tmp/lustre-log.1576139876.67817 [Thu Dec 12 00:38:00 2019][238860.056602] LustreError: dumping log to /tmp/lustre-log.1576139880.67942 [Thu Dec 12 00:38:12 2019][238872.344849] LustreError: dumping log to /tmp/lustre-log.1576139892.26937 [Thu Dec 12 00:38:16 2019][238876.440930] LustreError: dumping log to /tmp/lustre-log.1576139896.67966 [Thu Dec 12 00:38:20 2019][238880.537017] LustreError: dumping log to /tmp/lustre-log.1576139900.26919 [Thu Dec 12 00:38:25 2019][238884.633101] 
LustreError: dumping log to /tmp/lustre-log.1576139905.26943 [Thu Dec 12 00:38:29 2019][238888.729178] LustreError: dumping log to /tmp/lustre-log.1576139909.26936 [Thu Dec 12 00:38:37 2019][238896.921342] LustreError: dumping log to /tmp/lustre-log.1576139917.26948 [Thu Dec 12 00:38:41 2019][238901.017427] LustreError: dumping log to /tmp/lustre-log.1576139921.67987 [Thu Dec 12 00:38:45 2019][238905.113505] LustreError: dumping log to /tmp/lustre-log.1576139925.26969 [Thu Dec 12 00:38:49 2019][238909.209608] LustreError: dumping log to /tmp/lustre-log.1576139929.26907 [Thu Dec 12 00:38:53 2019][238913.305672] LustreError: dumping log to /tmp/lustre-log.1576139933.113359 [Thu Dec 12 00:38:57 2019][238917.401754] LustreError: dumping log to /tmp/lustre-log.1576139937.26971 [Thu Dec 12 00:39:01 2019][238921.497840] LustreError: dumping log to /tmp/lustre-log.1576139941.26973 [Thu Dec 12 00:39:06 2019][238925.593923] LustreError: dumping log to /tmp/lustre-log.1576139946.67758 [Thu Dec 12 00:39:10 2019][238929.689999] LustreError: dumping log to /tmp/lustre-log.1576139950.67755 [Thu Dec 12 00:39:13 2019][238933.234077] LustreError: 67644:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576139653, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0054_UUID lock: ffff88e87d178480/0x7066c9c190adca24 lrc: 3/0,1 mode: --/PW res: [0x1800000402:0x110c27:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67644 timeout: 0 lvb_type: 0 [Thu Dec 12 00:39:18 2019][238937.882165] LustreError: dumping log to /tmp/lustre-log.1576139958.26985 [Thu Dec 12 00:39:22 2019][238941.978246] LustreError: dumping log to /tmp/lustre-log.1576139962.67881 [Thu Dec 12 00:39:26 2019][238946.074325] LustreError: dumping log to /tmp/lustre-log.1576139966.26952 [Thu Dec 12 00:39:30 2019][238950.170405] LustreError: dumping log to /tmp/lustre-log.1576139970.26989 [Thu Dec 12 00:39:34 2019][238954.266489] LustreError: dumping log to /tmp/lustre-log.1576139974.66132 [Thu Dec 12 00:39:51 2019][238970.650816] LustreError: dumping log to /tmp/lustre-log.1576139991.67834 [Thu Dec 12 00:39:59 2019][238978.842981] LustreError: dumping log to /tmp/lustre-log.1576139999.67698 [Thu Dec 12 00:40:03 2019][238982.939064] LustreError: dumping log to /tmp/lustre-log.1576140003.26966 [Thu Dec 12 00:40:07 2019][238987.035146] LustreError: dumping log to /tmp/lustre-log.1576140007.27043 [Thu Dec 12 00:40:11 2019][238991.131230] LustreError: dumping log to /tmp/lustre-log.1576140011.26987 [Thu Dec 12 00:40:15 2019][238995.227321] LustreError: dumping log to /tmp/lustre-log.1576140015.26997 [Thu Dec 12 00:40:32 2019][239011.611649] LustreError: dumping log to /tmp/lustre-log.1576140032.27049 [Thu Dec 12 00:40:36 2019][239015.707840] LustreError: dumping log to /tmp/lustre-log.1576140036.26916 [Thu Dec 12 00:40:40 2019][239019.803804] LustreError: dumping log to /tmp/lustre-log.1576140040.27044 [Thu Dec 12 00:40:44 2019][239023.899892] LustreError: dumping log to /tmp/lustre-log.1576140044.26918 [Thu Dec 12 00:40:48 2019][239027.995968] LustreError: dumping log to /tmp/lustre-log.1576140048.26944 [Thu Dec 12 00:41:00 2019][239040.284214] LustreError: dumping log to /tmp/lustre-log.1576140060.27021 [Thu Dec 12 00:41:09 2019][239048.476388] LustreError: dumping log to /tmp/lustre-log.1576140068.27004 [Thu Dec 12 00:41:33 2019][239072.928876] LustreError: 
67683:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576139793, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0054_UUID lock: ffff88fc261efbc0/0x7066c9c190add4e3 lrc: 3/0,1 mode: --/PW res: [0x1800000402:0x110c28:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67683 timeout: 0 lvb_type: 0 [Thu Dec 12 00:41:33 2019][239073.052881] LNet: Service thread pid 26990 was inactive for 1201.65s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 00:41:33 2019][239073.065932] LNet: Skipped 178 previous similar messages [Thu Dec 12 00:41:33 2019][239073.071248] LustreError: dumping log to /tmp/lustre-log.1576140093.26990 [Thu Dec 12 00:41:37 2019][239077.148952] LustreError: dumping log to /tmp/lustre-log.1576140097.27090 [Thu Dec 12 00:42:18 2019][239118.109781] LustreError: dumping log to /tmp/lustre-log.1576140138.112535 [Thu Dec 12 00:42:22 2019][239122.205857] LustreError: dumping log to /tmp/lustre-log.1576140142.27112 [Thu Dec 12 00:42:43 2019][239142.686281] Pid: 27079, comm: ll_ost_io03_076 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:42:43 2019][239142.697061] Call Trace: [Thu Dec 12 00:42:43 2019][239142.699631] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:42:43 2019][239142.706638] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:42:43 2019][239142.713838] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:42:43 2019][239142.720485] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:42:43 2019][239142.727235] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:42:43 2019][239142.734761] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:42:43 2019][239142.741866] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:42:43 2019][239142.748611] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:42:43 2019][239142.754752] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:42:43 2019][239142.761455] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:42:43 2019][239142.768511] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:42:43 2019][239142.776310] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:42:43 2019][239142.782738] [] kthread+0xd1/0xe0 [Thu Dec 12 00:42:43 2019][239142.787742] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:42:43 2019][239142.794317] [] 0xffffffffffffffff [Thu Dec 12 00:42:43 2019][239142.799419] LustreError: dumping log to /tmp/lustre-log.1576140163.27079 [Thu Dec 12 00:42:51 2019][239150.878428] Pid: 27113, comm: ll_ost_io03_080 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:42:51 2019][239150.889206] Call Trace: [Thu Dec 12 00:42:51 2019][239150.891762] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:42:51 2019][239150.898764] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:42:51 2019][239150.905958] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:42:51 2019][239150.912603] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:42:51 2019][239150.919352] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:42:51 2019][239150.926879] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:42:51 2019][239150.933990] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:42:51 2019][239150.940728] [] ofd_commitrw+0x48c/0x9e0 
[ofd] [Thu Dec 12 00:42:51 2019][239150.946884] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:42:51 2019][239150.953571] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:42:51 2019][239150.960606] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:42:51 2019][239150.968427] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:42:51 2019][239150.974839] [] kthread+0xd1/0xe0 [Thu Dec 12 00:42:51 2019][239150.979882] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:42:51 2019][239150.986438] [] 0xffffffffffffffff [Thu Dec 12 00:42:51 2019][239150.991541] LustreError: dumping log to /tmp/lustre-log.1576140171.27113 [Thu Dec 12 00:43:28 2019][239187.743171] Pid: 27093, comm: ll_ost_io02_095 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:43:28 2019][239187.753955] Call Trace: [Thu Dec 12 00:43:28 2019][239187.756530] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:43:28 2019][239187.763530] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:43:28 2019][239187.770712] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:43:28 2019][239187.777374] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:43:28 2019][239187.784126] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:43:28 2019][239187.791652] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:43:28 2019][239187.798756] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:43:28 2019][239187.804972] [] ofd_commitrw_write+0xa31/0x1d40 [ofd] [Thu Dec 12 00:43:28 2019][239187.811721] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:43:28 2019][239187.817851] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:43:28 2019][239187.824564] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:43:28 2019][239187.831588] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:43:28 2019][239187.839418] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:43:28 2019][239187.845838] [] kthread+0xd1/0xe0 [Thu Dec 12 00:43:28 2019][239187.850851] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:43:28 2019][239187.857418] [] 0xffffffffffffffff [Thu Dec 12 00:43:28 2019][239187.862528] LustreError: dumping log to /tmp/lustre-log.1576140208.27093 [Thu Dec 12 00:43:32 2019][239191.839250] Pid: 27066, comm: ll_ost_io03_073 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:43:32 2019][239191.850035] Call Trace: [Thu Dec 12 00:43:32 2019][239191.852596] [] __lock_page+0x74/0x90 [Thu Dec 12 00:43:32 2019][239191.857946] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:43:32 2019][239191.863747] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:43:32 2019][239191.869808] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:43:32 2019][239191.876656] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:43:32 2019][239191.883824] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:43:32 2019][239191.889878] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:43:32 2019][239191.896494] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:43:32 2019][239191.903529] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:43:32 2019][239191.911333] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:43:32 2019][239191.917762] [] kthread+0xd1/0xe0 [Thu Dec 12 00:43:32 2019][239191.922765] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:43:32 2019][239191.929341] [] 0xffffffffffffffff [Thu Dec 12 00:43:32 2019][239191.934464] LustreError: dumping log to 
/tmp/lustre-log.1576140212.27066 [Thu Dec 12 00:43:32 2019][239191.941836] Pid: 27089, comm: ll_ost_io00_088 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:43:32 2019][239191.952648] Call Trace: [Thu Dec 12 00:43:32 2019][239191.955194] [] __lock_page+0x74/0x90 [Thu Dec 12 00:43:32 2019][239191.960543] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:43:32 2019][239191.966329] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:43:32 2019][239191.972373] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:43:32 2019][239191.979210] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:43:32 2019][239191.986379] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:43:32 2019][239191.992435] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:43:32 2019][239191.999039] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:43:32 2019][239192.006076] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:43:32 2019][239192.013883] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:43:32 2019][239192.020316] [] kthread+0xd1/0xe0 [Thu Dec 12 00:43:32 2019][239192.025309] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:43:32 2019][239192.031887] [] 0xffffffffffffffff [Thu Dec 12 00:43:52 2019][239211.879472] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 00:43:52 2019][239211.889654] Lustre: Skipped 241 previous similar messages [Thu Dec 12 00:43:52 2019][239212.319668] LustreError: dumping log to /tmp/lustre-log.1576140232.27054 [Thu Dec 12 00:43:56 2019][239216.415741] LustreError: dumping log to /tmp/lustre-log.1576140236.27070 [Thu Dec 12 00:44:33 2019][239252.857485] LustreError: 67592:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576139973, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0054_UUID lock: ffff88f028303a80/0x7066c9c190addd87 lrc: 3/0,1 mode: --/PW res: [0x1800000401:0xb3d37e:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67592 timeout: 0 lvb_type: 0 [Thu Dec 12 00:44:46 2019][239265.568733] LustreError: dumping log to /tmp/lustre-log.1576140285.27053 [Thu Dec 12 00:44:47 2019][239266.586759] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:44:47 2019][239266.586759] req@ffff8912fbf3d050 x1651926262561472/t0(0) o4->360200e4-9bb2-dc52-b96e-5f48834c2e13@10.8.27.21@o2ib6:26/0 lens 488/0 e 1 to 0 dl 1576140291 ref 2 fl New:/0/ffffffff rc 0/-1 [Thu Dec 12 00:44:47 2019][239266.615399] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 530 previous similar messages [Thu Dec 12 00:45:04 2019][239284.519561] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 00:45:04 2019][239284.528524] Lustre: Skipped 59 previous similar messages [Thu Dec 12 00:45:09 2019][239288.861084] Lustre: fir-OST0058: Connection restored to c93954af-761b-f1eb-f651-9881322a7a72 (at 10.9.108.51@o2ib4) [Thu Dec 12 00:45:09 2019][239288.871607] Lustre: Skipped 286 previous similar messages [Thu Dec 12 00:45:10 2019][239290.145220] LustreError: dumping log to /tmp/lustre-log.1576140310.27073 [Thu Dec 12 00:45:31 2019][239310.625632] LustreError: dumping log to /tmp/lustre-log.1576140331.27186 [Thu Dec 12 00:45:35 2019][239314.721711] LustreError: dumping 
log to /tmp/lustre-log.1576140335.27126 [Thu Dec 12 00:45:48 2019][239328.033980] LustreError: dumping log to /tmp/lustre-log.1576140348.67696 [Thu Dec 12 00:46:17 2019][239356.626551] LustreError: 112566:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576140077, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005a_UUID lock: ffff8905d3ecc5c0/0x7066c9c190ade131 lrc: 3/0,1 mode: --/PW res: [0x1980000401:0xb4b138:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112566 timeout: 0 lvb_type: 0 [Thu Dec 12 00:46:24 2019][239363.874698] LustreError: dumping log to /tmp/lustre-log.1576140384.27076 [Thu Dec 12 00:46:36 2019][239376.162942] LustreError: dumping log to /tmp/lustre-log.1576140396.26988 [Thu Dec 12 00:46:48 2019][239388.451179] LustreError: dumping log to /tmp/lustre-log.1576140408.27185 [Thu Dec 12 00:46:49 2019][239389.256104] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117226 to 0x1800000402:1117281 [Thu Dec 12 00:47:05 2019][239404.835509] LustreError: dumping log to /tmp/lustre-log.1576140425.27219 [Thu Dec 12 00:47:27 2019][239427.358966] LustreError: 67696:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576140147, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005a_UUID lock: ffff88fcfae90000/0x7066c9c190ade2a4 lrc: 3/0,1 mode: --/PW res: [0x1980000402:0x2f245a:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67696 timeout: 0 lvb_type: 0 [Thu Dec 12 00:47:29 2019][239429.412004] LustreError: dumping log to /tmp/lustre-log.1576140449.27226 [Thu Dec 12 00:47:33 2019][239433.508086] LustreError: dumping log to /tmp/lustre-log.1576140453.27075 [Thu Dec 12 00:47:38 2019][239437.604174] LustreError: dumping log to /tmp/lustre-log.1576140458.27254 [Thu Dec 12 00:47:42 2019][239441.700246] LustreError: dumping log to /tmp/lustre-log.1576140462.27259 [Thu Dec 12 00:47:46 2019][239445.796336] LNet: Service thread pid 27255 was inactive for 1202.86s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 00:47:46 2019][239445.813444] LNet: Skipped 9 previous similar messages [Thu Dec 12 00:47:46 2019][239445.818595] Pid: 27255, comm: ll_ost_io00_103 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:47:46 2019][239445.829372] Call Trace: [Thu Dec 12 00:47:46 2019][239445.831922] [] __lock_page+0x74/0x90 [Thu Dec 12 00:47:46 2019][239445.837275] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:47:46 2019][239445.843064] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:47:46 2019][239445.849096] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:47:46 2019][239445.855943] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:47:46 2019][239445.863114] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:47:46 2019][239445.869166] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:47:46 2019][239445.875774] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:47:46 2019][239445.882816] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:47:46 2019][239445.890620] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:47:46 2019][239445.897049] [] kthread+0xd1/0xe0 [Thu Dec 12 00:47:46 2019][239445.902053] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:47:46 2019][239445.908636] [] 0xffffffffffffffff [Thu Dec 12 00:47:46 2019][239445.913744] LustreError: dumping log to /tmp/lustre-log.1576140466.27255 [Thu Dec 12 00:47:50 2019][239449.892412] Pid: 27231, comm: ll_ost_io01_097 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:47:50 2019][239449.903196] Call Trace: [Thu Dec 12 00:47:50 2019][239449.905748] [] __lock_page+0x74/0x90 [Thu Dec 12 00:47:50 2019][239449.911099] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:47:50 2019][239449.916898] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:47:50 2019][239449.922931] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:47:50 2019][239449.929775] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:47:50 2019][239449.936958] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:47:50 2019][239449.943015] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:47:50 2019][239449.949633] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:47:50 2019][239449.956667] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:47:50 2019][239449.964469] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:47:50 2019][239449.970900] [] kthread+0xd1/0xe0 [Thu Dec 12 00:47:50 2019][239449.975903] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:47:50 2019][239449.982477] [] 0xffffffffffffffff [Thu Dec 12 00:47:50 2019][239449.987588] LustreError: dumping log to /tmp/lustre-log.1576140470.27231 [Thu Dec 12 00:47:50 2019][239449.994976] Pid: 27258, comm: ll_ost_io00_106 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:47:50 2019][239450.005769] Call Trace: [Thu Dec 12 00:47:50 2019][239450.008313] [] __lock_page+0x74/0x90 [Thu Dec 12 00:47:50 2019][239450.013653] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:47:50 2019][239450.019440] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:47:50 2019][239450.025477] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:47:50 2019][239450.032320] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:47:50 2019][239450.039490] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:47:50 2019][239450.045546] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.052145] [] 
tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.059180] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.067000] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.073428] [] kthread+0xd1/0xe0 [Thu Dec 12 00:47:50 2019][239450.078422] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:47:50 2019][239450.084989] [] 0xffffffffffffffff [Thu Dec 12 00:47:50 2019][239450.090081] Pid: 27071, comm: ll_ost_io00_083 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:47:50 2019][239450.100868] Call Trace: [Thu Dec 12 00:47:50 2019][239450.103413] [] __lock_page+0x74/0x90 [Thu Dec 12 00:47:50 2019][239450.108755] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:47:50 2019][239450.114527] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:47:50 2019][239450.120583] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:47:50 2019][239450.127437] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:47:50 2019][239450.134623] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:47:50 2019][239450.140667] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.147270] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.154293] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.162105] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.168523] [] kthread+0xd1/0xe0 [Thu Dec 12 00:47:50 2019][239450.173529] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:47:50 2019][239450.180084] [] 0xffffffffffffffff [Thu Dec 12 00:47:50 2019][239450.185189] Pid: 27072, comm: ll_ost_io03_074 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:47:50 2019][239450.195977] Call Trace: [Thu Dec 12 00:47:50 2019][239450.198525] [] __lock_page+0x74/0x90 [Thu Dec 12 00:47:50 2019][239450.203867] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:47:50 2019][239450.209653] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:47:50 2019][239450.215688] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:47:50 2019][239450.222524] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:47:50 2019][239450.229694] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:47:50 2019][239450.235748] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.242338] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.249373] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.257191] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.263622] [] kthread+0xd1/0xe0 [Thu Dec 12 00:47:50 2019][239450.268617] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:47:50 2019][239450.275183] [] 0xffffffffffffffff [Thu Dec 12 00:47:54 2019][239453.988496] LustreError: dumping log to /tmp/lustre-log.1576140474.27250 [Thu Dec 12 00:47:57 2019][239457.037560] LustreError: 67629:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576140177, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0058_UUID lock: ffff890987068fc0/0x7066c9c190ade472 lrc: 3/0,1 mode: --/PW res: [0x1900000401:0xb402ed:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67629 timeout: 0 lvb_type: 0 [Thu Dec 12 00:47:58 2019][239458.084571] LustreError: dumping log to 
/tmp/lustre-log.1576140478.27279 [Thu Dec 12 00:48:02 2019][239462.180656] LustreError: dumping log to /tmp/lustre-log.1576140482.27263 [Thu Dec 12 00:48:10 2019][239470.372840] LustreError: dumping log to /tmp/lustre-log.1576140490.27260 [Thu Dec 12 00:48:19 2019][239478.564989] LustreError: dumping log to /tmp/lustre-log.1576140498.27264 [Thu Dec 12 00:48:27 2019][239486.757152] LustreError: dumping log to /tmp/lustre-log.1576140507.27275 [Thu Dec 12 00:48:31 2019][239490.853236] LustreError: dumping log to /tmp/lustre-log.1576140511.27261 [Thu Dec 12 00:48:35 2019][239494.949315] LustreError: dumping log to /tmp/lustre-log.1576140515.27225 [Thu Dec 12 00:48:43 2019][239503.141478] LustreError: dumping log to /tmp/lustre-log.1576140523.27278 [Thu Dec 12 00:48:47 2019][239507.237556] LustreError: dumping log to /tmp/lustre-log.1576140527.27310 [Thu Dec 12 00:48:51 2019][239511.333635] LustreError: dumping log to /tmp/lustre-log.1576140531.27312 [Thu Dec 12 00:48:55 2019][239515.429722] LustreError: dumping log to /tmp/lustre-log.1576140535.27085 [Thu Dec 12 00:48:59 2019][239519.525802] LustreError: dumping log to /tmp/lustre-log.1576140539.27272 [Thu Dec 12 00:49:04 2019][239523.621887] LustreError: dumping log to /tmp/lustre-log.1576140544.27232 [Thu Dec 12 00:49:08 2019][239527.717968] LustreError: dumping log to /tmp/lustre-log.1576140548.27298 [Thu Dec 12 00:49:12 2019][239531.814059] LustreError: dumping log to /tmp/lustre-log.1576140552.27314 [Thu Dec 12 00:49:32 2019][239552.294461] LustreError: dumping log to /tmp/lustre-log.1576140572.27324 [Thu Dec 12 00:49:36 2019][239556.390543] LustreError: dumping log to /tmp/lustre-log.1576140576.27322 [Thu Dec 12 00:49:53 2019][239572.671876] LustreError: 112528:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576140293, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005e_UUID lock: ffff890020f49440/0x7066c9c190ade74a lrc: 3/0,1 mode: --/PW res: [0x1a80000401:0x111eb5:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112528 timeout: 0 lvb_type: 0 [Thu Dec 12 00:50:09 2019][239589.159211] LustreError: dumping log to /tmp/lustre-log.1576140609.67861 [Thu Dec 12 00:50:17 2019][239597.351356] LustreError: dumping log to /tmp/lustre-log.1576140617.27336 [Thu Dec 12 00:50:21 2019][239601.447451] LustreError: dumping log to /tmp/lustre-log.1576140621.112525 [Thu Dec 12 00:50:25 2019][239605.543530] LustreError: dumping log to /tmp/lustre-log.1576140625.67921 [Thu Dec 12 00:50:34 2019][239613.735683] LustreError: dumping log to /tmp/lustre-log.1576140634.27339 [Thu Dec 12 00:50:42 2019][239621.927847] LustreError: dumping log to /tmp/lustre-log.1576140642.27357 [Thu Dec 12 00:51:35 2019][239675.176916] LNet: Service thread pid 67602 was inactive for 1201.88s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
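The recurring "LustreError: dumping log to /tmp/lustre-log.<seconds>.<pid>" records above appear to name each debug-log dump after the Unix epoch time of the dump and the PID of the stalled service thread (earlier in this log, Pid 66128 pairs with /tmp/lustre-log.1576139540.66128 the same way). A minimal standard-library Python sketch, not part of the console log, that decodes that suffix so it can be correlated with the bracketed wall-clock prefixes; the sample line is copied from the records above:

```python
# Illustrative helper: decode the <epoch>.<pid> suffix of a lustre-log dump line.
import re
from datetime import datetime, timezone

sample = "LustreError: dumping log to /tmp/lustre-log.1576140642.27357"

m = re.search(r"/tmp/lustre-log\.(\d+)\.(\d+)", sample)
if m:
    epoch, pid = int(m.group(1)), int(m.group(2))
    # 1576140642 -> 2019-12-12 08:50:42 UTC, i.e. 00:50:42 PST, matching the
    # bracketed console timestamp on that record; 27357 is the dumping thread.
    print(datetime.fromtimestamp(epoch, tz=timezone.utc), "pid", pid)
```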
[Thu Dec 12 00:51:35 2019][239675.189983] LNet: Skipped 118 previous similar messages [Thu Dec 12 00:51:35 2019][239675.195311] LustreError: dumping log to /tmp/lustre-log.1576140695.67602 [Thu Dec 12 00:52:09 2019][239709.333597] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785095 to 0x1800000401:11785185 [Thu Dec 12 00:52:16 2019][239716.137736] LustreError: dumping log to /tmp/lustre-log.1576140736.67843 [Thu Dec 12 00:52:37 2019][239736.618140] LustreError: dumping log to /tmp/lustre-log.1576140757.67614 [Thu Dec 12 00:53:52 2019][239811.969335] Lustre: fir-OST005a: Client 882378af-0b41-73ee-5c10-5cc51464645c (at 10.9.108.22@o2ib4) reconnecting [Thu Dec 12 00:53:52 2019][239811.979602] Lustre: Skipped 300 previous similar messages [Thu Dec 12 00:53:53 2019][239813.287543] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11841861 to 0x1980000401:11841953 [Thu Dec 12 00:53:55 2019][239815.080705] LustreError: 112521:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576140535, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005e_UUID lock: ffff890fa9aae9c0/0x7066c9c190adf5ba lrc: 3/0,1 mode: --/PW res: [0x1a31e13:0x0:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112521 timeout: 0 lvb_type: 0 [Thu Dec 12 00:54:15 2019][239834.924106] Pid: 67644, comm: ll_ost01_022 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:54:15 2019][239834.934621] Call Trace: [Thu Dec 12 00:54:15 2019][239834.937173] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 00:54:15 2019][239834.944215] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 00:54:15 2019][239834.951514] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 00:54:15 2019][239834.958160] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 00:54:15 2019][239834.964565] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:54:15 2019][239834.971602] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:54:15 2019][239834.979418] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:54:15 2019][239834.985834] [] kthread+0xd1/0xe0 [Thu Dec 12 00:54:15 2019][239834.990835] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:54:15 2019][239834.997402] [] 0xffffffffffffffff [Thu Dec 12 00:54:15 2019][239835.002510] LustreError: dumping log to /tmp/lustre-log.1576140855.67644 [Thu Dec 12 00:54:47 2019][239866.600723] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:54:47 2019][239866.600723] req@ffff88f2b0727850 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:626/0 lens 440/0 e 0 to 0 dl 1576140891 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 00:54:47 2019][239866.629355] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 710 previous similar messages [Thu Dec 12 00:54:52 2019][239871.788812] Pid: 66094, comm: ll_ost00_001 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:54:52 2019][239871.799329] Call Trace: [Thu Dec 12 00:54:52 2019][239871.801896] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:54:52 2019][239871.808894] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:54:52 2019][239871.816096] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:54:52 2019][239871.822745] [] 
jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:54:52 2019][239871.829494] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:54:52 2019][239871.837020] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 00:54:52 2019][239871.844115] [] dqget+0x3fa/0x450 [Thu Dec 12 00:54:52 2019][239871.849119] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 00:54:52 2019][239871.854927] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 00:54:52 2019][239871.862549] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 00:54:52 2019][239871.869040] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 00:54:52 2019][239871.875172] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:54:52 2019][239871.882215] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:54:52 2019][239871.890017] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:54:52 2019][239871.896446] [] kthread+0xd1/0xe0 [Thu Dec 12 00:54:52 2019][239871.901449] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:54:52 2019][239871.908025] [] 0xffffffffffffffff [Thu Dec 12 00:54:52 2019][239871.913127] LustreError: dumping log to /tmp/lustre-log.1576140892.66094 [Thu Dec 12 00:54:56 2019][239875.884892] Pid: 67782, comm: ll_ost00_045 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:54:56 2019][239875.895408] Call Trace: [Thu Dec 12 00:54:56 2019][239875.897977] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:54:56 2019][239875.904976] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:54:56 2019][239875.912158] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:54:56 2019][239875.918808] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:54:56 2019][239875.925557] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:54:56 2019][239875.933082] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 00:54:56 2019][239875.940179] [] dqget+0x3fa/0x450 [Thu Dec 12 00:54:56 2019][239875.945181] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 00:54:56 2019][239875.950971] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 00:54:56 2019][239875.958594] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 00:54:56 2019][239875.965075] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 00:54:56 2019][239875.971226] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:54:56 2019][239875.978263] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:54:56 2019][239875.986064] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:54:56 2019][239875.992490] [] kthread+0xd1/0xe0 [Thu Dec 12 00:54:56 2019][239875.997487] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:54:56 2019][239876.004062] [] 0xffffffffffffffff [Thu Dec 12 00:54:56 2019][239876.009155] LustreError: dumping log to /tmp/lustre-log.1576140896.67782 [Thu Dec 12 00:55:04 2019][239883.857031] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089507 to 0x1980000402:3089569 [Thu Dec 12 00:55:07 2019][239886.643484] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 00:55:07 2019][239886.652449] Lustre: Skipped 62 previous similar messages [Thu Dec 12 00:55:09 2019][239888.989286] Lustre: fir-OST005a: Connection restored to f5f4e1fb-09a1-cbb0-925c-f94d3727005b (at 10.9.101.44@o2ib4) [Thu Dec 12 00:55:09 2019][239888.999806] Lustre: Skipped 345 previous similar messages [Thu Dec 12 00:55:33 2019][239913.208552] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797241 to 
0x1900000401:11797281 [Thu Dec 12 00:55:37 2019][239916.845706] Pid: 67901, comm: ll_ost01_070 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:55:37 2019][239916.856230] Call Trace: [Thu Dec 12 00:55:37 2019][239916.858809] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:55:37 2019][239916.865806] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:55:37 2019][239916.872991] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:55:37 2019][239916.879638] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:55:37 2019][239916.886386] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:55:37 2019][239916.893911] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 00:55:37 2019][239916.901005] [] dqget+0x3fa/0x450 [Thu Dec 12 00:55:37 2019][239916.906011] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 00:55:37 2019][239916.911796] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 00:55:37 2019][239916.919410] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 00:55:37 2019][239916.925908] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 00:55:37 2019][239916.932051] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:55:37 2019][239916.939115] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:55:37 2019][239916.946919] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:55:37 2019][239916.953348] [] kthread+0xd1/0xe0 [Thu Dec 12 00:55:37 2019][239916.958349] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:55:37 2019][239916.964925] [] 0xffffffffffffffff [Thu Dec 12 00:55:37 2019][239916.970027] LustreError: dumping log to /tmp/lustre-log.1576140937.67901 [Thu Dec 12 00:55:41 2019][239920.941801] Pid: 67702, comm: ll_ost03_031 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:55:41 2019][239920.952318] Call Trace: [Thu Dec 12 00:55:41 2019][239920.954889] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:55:41 2019][239920.961886] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:55:41 2019][239920.969068] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:55:41 2019][239920.975717] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:55:41 2019][239920.982468] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:55:41 2019][239920.989991] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 00:55:41 2019][239920.997096] [] dqget+0x3fa/0x450 [Thu Dec 12 00:55:41 2019][239921.002099] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 00:55:41 2019][239921.007885] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 00:55:41 2019][239921.015498] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 00:55:41 2019][239921.021986] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 00:55:41 2019][239921.028131] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:55:41 2019][239921.035196] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:55:41 2019][239921.042998] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:55:41 2019][239921.049426] [] kthread+0xd1/0xe0 [Thu Dec 12 00:55:41 2019][239921.054430] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:55:41 2019][239921.061006] [] 0xffffffffffffffff [Thu Dec 12 00:55:41 2019][239921.066106] LustreError: dumping log to /tmp/lustre-log.1576140941.67702 [Thu Dec 12 00:55:49 2019][239929.133944] LustreError: dumping log to /tmp/lustre-log.1576140949.67900 [Thu Dec 12 00:56:19 2019][239959.099553] LustreError: 
67744:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576140679, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005a_UUID lock: ffff88e670859680/0x7066c9c190ae10c5 lrc: 3/0,1 mode: --/PW res: [0x1980000400:0x112c96:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67744 timeout: 0 lvb_type: 0 [Thu Dec 12 00:56:30 2019][239970.094745] LustreError: dumping log to /tmp/lustre-log.1576140990.67694 [Thu Dec 12 00:56:34 2019][239974.190821] LustreError: dumping log to /tmp/lustre-log.1576140994.67683 [Thu Dec 12 00:57:29 2019][240028.686582] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1121977 to 0x1a80000401:1122017 [Thu Dec 12 00:59:25 2019][240145.394789] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117286 to 0x1800000402:1117313 [Thu Dec 12 00:59:34 2019][240154.418384] LNet: Service thread pid 67592 was inactive for 1201.54s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 00:59:34 2019][240154.435493] LNet: Skipped 9 previous similar messages [Thu Dec 12 00:59:34 2019][240154.440636] Pid: 67592, comm: ll_ost02_009 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:59:34 2019][240154.451171] Call Trace: [Thu Dec 12 00:59:34 2019][240154.453730] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 00:59:34 2019][240154.460772] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 00:59:34 2019][240154.468068] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 00:59:34 2019][240154.474717] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 00:59:34 2019][240154.481120] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:59:34 2019][240154.488159] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:59:34 2019][240154.495974] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:59:34 2019][240154.502392] [] kthread+0xd1/0xe0 [Thu Dec 12 00:59:34 2019][240154.507404] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:59:34 2019][240154.513970] [] 0xffffffffffffffff [Thu Dec 12 00:59:34 2019][240154.519101] LustreError: dumping log to /tmp/lustre-log.1576141174.67592 [Thu Dec 12 01:00:52 2019][240232.243942] Pid: 112494, comm: ll_ost00_075 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:00:52 2019][240232.254546] Call Trace: [Thu Dec 12 01:00:52 2019][240232.257127] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:00:52 2019][240232.264131] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:00:52 2019][240232.271314] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:00:52 2019][240232.277961] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:00:52 2019][240232.284716] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:00:52 2019][240232.292237] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:00:52 2019][240232.299340] [] dqget+0x3fa/0x450 [Thu Dec 12 01:00:52 2019][240232.304366] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:00:52 2019][240232.310150] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:00:52 2019][240232.317777] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:00:52 2019][240232.324253] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:00:52 2019][240232.330395] [] tgt_request_handle+0xaea/0x1580 
[ptlrpc] [Thu Dec 12 01:00:52 2019][240232.337433] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:00:52 2019][240232.345257] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:00:52 2019][240232.351674] [] kthread+0xd1/0xe0 [Thu Dec 12 01:00:52 2019][240232.356674] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:00:52 2019][240232.363252] [] 0xffffffffffffffff [Thu Dec 12 01:00:52 2019][240232.368350] LustreError: dumping log to /tmp/lustre-log.1576141252.112494 [Thu Dec 12 01:00:56 2019][240236.340028] Pid: 67736, comm: ll_ost02_034 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:00:56 2019][240236.350548] Call Trace: [Thu Dec 12 01:00:56 2019][240236.353118] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:00:56 2019][240236.360121] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:00:56 2019][240236.367304] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:00:56 2019][240236.373954] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:00:56 2019][240236.380704] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:00:56 2019][240236.388229] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:00:56 2019][240236.395323] [] dqget+0x3fa/0x450 [Thu Dec 12 01:00:56 2019][240236.400344] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:00:56 2019][240236.406140] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:00:56 2019][240236.413754] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:00:56 2019][240236.420242] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:00:56 2019][240236.426372] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:00:56 2019][240236.433433] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:00:56 2019][240236.441235] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:00:56 2019][240236.447664] [] kthread+0xd1/0xe0 [Thu Dec 12 01:00:56 2019][240236.452667] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:00:56 2019][240236.459243] [] 0xffffffffffffffff [Thu Dec 12 01:00:56 2019][240236.464356] LustreError: dumping log to /tmp/lustre-log.1576141256.67736 [Thu Dec 12 01:01:00 2019][240240.436112] Pid: 67785, comm: ll_ost02_040 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:01:00 2019][240240.446628] Call Trace: [Thu Dec 12 01:01:00 2019][240240.449190] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:01:00 2019][240240.456184] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:01:00 2019][240240.463367] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:01:00 2019][240240.470018] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:01:00 2019][240240.476766] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:01:00 2019][240240.484291] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:01:00 2019][240240.491390] [] dqget+0x3fa/0x450 [Thu Dec 12 01:01:00 2019][240240.496391] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:01:00 2019][240240.502199] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:01:00 2019][240240.509799] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:01:00 2019][240240.516286] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:01:00 2019][240240.522417] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:01:00 2019][240240.529469] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:01:00 2019][240240.537271] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:01:00 
2019][240240.543699] [] kthread+0xd1/0xe0 [Thu Dec 12 01:01:01 2019][240240.548704] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:01:01 2019][240240.555269] [] 0xffffffffffffffff [Thu Dec 12 01:01:01 2019][240240.560377] LustreError: dumping log to /tmp/lustre-log.1576141260.67785 [Thu Dec 12 01:01:17 2019][240256.820444] Pid: 112566, comm: ll_ost02_086 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:01:17 2019][240256.831052] Call Trace: [Thu Dec 12 01:01:17 2019][240256.833606] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:01:17 2019][240256.840646] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:01:17 2019][240256.847941] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:01:17 2019][240256.854590] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:01:17 2019][240256.860991] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:01:17 2019][240256.868034] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:01:17 2019][240256.875847] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:01:17 2019][240256.882282] [] kthread+0xd1/0xe0 [Thu Dec 12 01:01:17 2019][240256.887282] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:01:17 2019][240256.893850] [] 0xffffffffffffffff [Thu Dec 12 01:01:17 2019][240256.898960] LustreError: dumping log to /tmp/lustre-log.1576141277.112566 [Thu Dec 12 01:01:20 2019][240260.434521] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125531 to 0x1980000400:1125569 [Thu Dec 12 01:01:31 2019][240270.998164] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467286 to 0x0:27467329 [Thu Dec 12 01:02:15 2019][240314.616600] LustreError: 67741:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576141035, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005a_UUID lock: ffff88e75d68f740/0x7066c9c190ae6a91 lrc: 3/0,1 mode: --/PW res: [0x1980000400:0x112c98:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67741 timeout: 0 lvb_type: 0 [Thu Dec 12 01:02:18 2019][240318.261695] LNet: Service thread pid 67675 was inactive for 1203.78s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
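The ldlm_expired_completion_wait records in this stretch carry their own timing data, e.g. "lock timed out (enqueued at 1576141035, 300s ago)" logged at 01:02:15. A small standard-library Python check of that arithmetic, assuming the bracketed console prefix is PST (UTC-8), which the December 2019 dates suggest; the abridged sample record is copied from the log above:

```python
# Sketch: confirm enqueue-epoch + elapsed seconds lands on the record's own
# wall-clock prefix (assumes the prefix is PST, i.e. UTC-8, no DST in December).
import re
from datetime import datetime, timedelta

record = ("[Thu Dec 12 01:02:15 2019][240314.616600] LustreError: 67741:0:"
          "(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out "
          "(enqueued at 1576141035, 300s ago); ...")  # abridged

logged = datetime.strptime(re.match(r"\[(.*?)\]", record).group(1),
                           "%a %b %d %H:%M:%S %Y")
enq, ago = map(int, re.search(r"enqueued at (\d+), (\d+)s ago", record).groups())

pst = lambda epoch: datetime(1970, 1, 1) + timedelta(seconds=epoch - 8 * 3600)

assert pst(enq + ago) == logged      # 00:57:15 + 300 s == 01:02:15
print(pst(enq), "->", pst(enq + ago))
```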
[Thu Dec 12 01:02:18 2019][240318.274731] LNet: Skipped 6 previous similar messages [Thu Dec 12 01:02:18 2019][240318.279879] LustreError: dumping log to /tmp/lustre-log.1576141338.67675 [Thu Dec 12 01:02:47 2019][240346.934237] LustreError: dumping log to /tmp/lustre-log.1576141367.112557 [Thu Dec 12 01:02:55 2019][240355.126422] LustreError: dumping log to /tmp/lustre-log.1576141375.67687 [Thu Dec 12 01:02:59 2019][240359.222485] LustreError: dumping log to /tmp/lustre-log.1576141379.67629 [Thu Dec 12 01:03:03 2019][240363.318573] LustreError: dumping log to /tmp/lustre-log.1576141383.67682 [Thu Dec 12 01:03:11 2019][240371.510737] LustreError: dumping log to /tmp/lustre-log.1576141391.67599 [Thu Dec 12 01:03:15 2019][240375.606814] LustreError: dumping log to /tmp/lustre-log.1576141395.112499 [Thu Dec 12 01:03:54 2019][240414.463197] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 01:03:54 2019][240414.473370] Lustre: Skipped 412 previous similar messages [Thu Dec 12 01:03:56 2019][240416.567635] LustreError: dumping log to /tmp/lustre-log.1576141436.112543 [Thu Dec 12 01:04:05 2019][240424.759800] LustreError: dumping log to /tmp/lustre-log.1576141445.67709 [Thu Dec 12 01:04:09 2019][240428.855879] LustreError: dumping log to /tmp/lustre-log.1576141449.67814 [Thu Dec 12 01:04:13 2019][240432.952007] LustreError: dumping log to /tmp/lustre-log.1576141453.112517 [Thu Dec 12 01:04:45 2019][240465.432406] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785188 to 0x1800000401:11785217 [Thu Dec 12 01:04:47 2019][240467.576679] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 01:04:47 2019][240467.576679] req@ffff88fd6e0f9050 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:472/0 lens 440/0 e 0 to 0 dl 1576141492 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 01:04:48 2019][240467.605315] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 700 previous similar messages [Thu Dec 12 01:04:54 2019][240473.912790] Pid: 112528, comm: ll_ost01_085 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:04:54 2019][240473.923403] Call Trace: [Thu Dec 12 01:04:54 2019][240473.925961] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:04:54 2019][240473.933002] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:04:54 2019][240473.940317] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:04:54 2019][240473.946979] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:04:54 2019][240473.953396] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:04:54 2019][240473.960439] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:04:54 2019][240473.968271] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:04:54 2019][240473.974688] [] kthread+0xd1/0xe0 [Thu Dec 12 01:04:54 2019][240473.979689] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:04:54 2019][240473.986265] [] 0xffffffffffffffff [Thu Dec 12 01:04:54 2019][240473.991373] LustreError: dumping log to /tmp/lustre-log.1576141494.112528 [Thu Dec 12 01:04:54 2019][240473.998923] Pid: 67595, comm: ll_ost00_008 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:04:54 2019][240474.009462] Call Trace: [Thu Dec 12 01:04:54 2019][240474.012021] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:04:54 2019][240474.019018] [] 
add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:04:54 2019][240474.026201] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:04:54 2019][240474.032851] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:04:54 2019][240474.039586] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:04:54 2019][240474.047123] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:04:54 2019][240474.054216] [] dqget+0x3fa/0x450 [Thu Dec 12 01:04:54 2019][240474.059231] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:04:54 2019][240474.065006] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:04:54 2019][240474.072629] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:04:54 2019][240474.079106] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:04:54 2019][240474.085249] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:04:54 2019][240474.092279] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:04:54 2019][240474.100094] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:04:54 2019][240474.106510] [] kthread+0xd1/0xe0 [Thu Dec 12 01:04:54 2019][240474.111503] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:04:54 2019][240474.118073] [] 0xffffffffffffffff [Thu Dec 12 01:05:09 2019][240488.767686] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 01:05:09 2019][240488.776652] Lustre: Skipped 71 previous similar messages [Thu Dec 12 01:05:10 2019][240489.986614] Lustre: fir-OST005c: Connection restored to cec884d3-ca4b-8127-2f6b-7762665aa5f8 (at 10.9.0.64@o2ib4) [Thu Dec 12 01:05:10 2019][240489.996967] Lustre: Skipped 365 previous similar messages [Thu Dec 12 01:05:14 2019][240494.393191] Pid: 67651, comm: ll_ost02_021 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:05:14 2019][240494.403725] Call Trace: [Thu Dec 12 01:05:14 2019][240494.406297] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:05:14 2019][240494.412703] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:05:14 2019][240494.419773] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:05:14 2019][240494.427610] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:05:14 2019][240494.434032] [] kthread+0xd1/0xe0 [Thu Dec 12 01:05:14 2019][240494.439037] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:05:14 2019][240494.445631] [] 0xffffffffffffffff [Thu Dec 12 01:05:14 2019][240494.450750] LustreError: dumping log to /tmp/lustre-log.1576141514.67651 [Thu Dec 12 01:06:30 2019][240569.618538] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11841959 to 0x1980000401:11841985 [Thu Dec 12 01:06:44 2019][240584.506993] Pid: 67889, comm: ll_ost02_059 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:06:44 2019][240584.517517] Call Trace: [Thu Dec 12 01:06:44 2019][240584.520095] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:06:44 2019][240584.527093] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:06:44 2019][240584.534273] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:06:44 2019][240584.540924] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:06:44 2019][240584.547672] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:06:44 2019][240584.555212] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:06:44 2019][240584.562310] [] dqget+0x3fa/0x450 [Thu Dec 12 01:06:44 2019][240584.567314] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:06:44 
2019][240584.573108] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:06:44 2019][240584.580721] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:06:44 2019][240584.587209] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:06:44 2019][240584.593339] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:06:45 2019][240584.600401] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:06:45 2019][240584.608205] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:06:45 2019][240584.614634] [] kthread+0xd1/0xe0 [Thu Dec 12 01:06:45 2019][240584.619649] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:06:45 2019][240584.626228] [] 0xffffffffffffffff [Thu Dec 12 01:06:45 2019][240584.631329] LustreError: dumping log to /tmp/lustre-log.1576141605.67889 [Thu Dec 12 01:06:53 2019][240592.699158] Pid: 66242, comm: ll_ost03_003 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:06:53 2019][240592.709677] Call Trace: [Thu Dec 12 01:06:53 2019][240592.712246] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:06:53 2019][240592.719242] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:06:53 2019][240592.726431] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:06:53 2019][240592.733078] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:06:53 2019][240592.739825] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:06:53 2019][240592.747365] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:06:53 2019][240592.754463] [] dqget+0x3fa/0x450 [Thu Dec 12 01:06:53 2019][240592.759467] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:06:53 2019][240592.765254] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:06:53 2019][240592.772865] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:06:53 2019][240592.779355] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:06:53 2019][240592.785485] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:06:53 2019][240592.792544] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:06:53 2019][240592.800350] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:06:53 2019][240592.806778] [] kthread+0xd1/0xe0 [Thu Dec 12 01:06:53 2019][240592.811794] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:06:53 2019][240592.818373] [] 0xffffffffffffffff [Thu Dec 12 01:06:53 2019][240592.823464] LustreError: dumping log to /tmp/lustre-log.1576141613.66242 [Thu Dec 12 01:06:57 2019][240596.795243] LustreError: dumping log to /tmp/lustre-log.1576141617.67716 [Thu Dec 12 01:07:25 2019][240625.467815] LustreError: dumping log to /tmp/lustre-log.1576141645.67677 [Thu Dec 12 01:07:29 2019][240629.563893] LustreError: dumping log to /tmp/lustre-log.1576141649.67878 [Thu Dec 12 01:07:34 2019][240633.659971] LustreError: dumping log to /tmp/lustre-log.1576141654.67778 [Thu Dec 12 01:07:38 2019][240637.756058] LustreError: dumping log to /tmp/lustre-log.1576141658.112501 [Thu Dec 12 01:07:40 2019][240640.052001] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089578 to 0x1980000402:3089601 [Thu Dec 12 01:08:09 2019][240669.116318] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797283 to 0x1900000401:11797313 [Thu Dec 12 01:08:56 2019][240715.581615] LustreError: dumping log to /tmp/lustre-log.1576141735.112521 [Thu Dec 12 01:09:08 2019][240727.869858] LustreError: dumping log to /tmp/lustre-log.1576141748.67607 [Thu Dec 12 01:09:13 2019][240733.302978] LustreError: 
112496:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576141453, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0058_UUID lock: ffff89134edb5580/0x7066c9c190ae820e lrc: 3/0,1 mode: --/PW res: [0x1a3b7cb:0x0:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112496 timeout: 0 lvb_type: 0 [Thu Dec 12 01:10:05 2019][240784.663581] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122023 to 0x1a80000401:1122049 [Thu Dec 12 01:11:21 2019][240861.033099] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125573 to 0x1980000400:1125601 [Thu Dec 12 01:11:23 2019][240863.040457] LNet: Service thread pid 67744 was inactive for 1203.92s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 01:11:23 2019][240863.057568] LNet: Skipped 9 previous similar messages [Thu Dec 12 01:11:23 2019][240863.062720] Pid: 67744, comm: ll_ost01_045 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:11:23 2019][240863.073256] Call Trace: [Thu Dec 12 01:11:23 2019][240863.075816] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:11:23 2019][240863.082855] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:11:23 2019][240863.090152] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:11:23 2019][240863.096808] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:11:23 2019][240863.103211] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:11:23 2019][240863.110268] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:11:23 2019][240863.118083] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:11:23 2019][240863.124499] [] kthread+0xd1/0xe0 [Thu Dec 12 01:11:23 2019][240863.129515] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:11:23 2019][240863.136080] [] 0xffffffffffffffff [Thu Dec 12 01:11:23 2019][240863.141201] LustreError: dumping log to /tmp/lustre-log.1576141883.67744 [Thu Dec 12 01:12:00 2019][240899.905137] Pid: 67865, comm: ll_ost02_057 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:12:00 2019][240899.915661] Call Trace: [Thu Dec 12 01:12:00 2019][240899.918230] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:12:00 2019][240899.925235] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:12:00 2019][240899.932424] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:12:00 2019][240899.939095] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:12:00 2019][240899.945845] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:12:00 2019][240899.953371] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 01:12:00 2019][240899.960483] [] tgt_client_data_update+0x303/0x5e0 [ptlrpc] [Thu Dec 12 01:12:00 2019][240899.967802] [] tgt_client_new+0x41b/0x610 [ptlrpc] [Thu Dec 12 01:12:00 2019][240899.974420] [] ofd_obd_connect+0x3a3/0x4c0 [ofd] [Thu Dec 12 01:12:00 2019][240899.980811] [] target_handle_connect+0xecb/0x2b10 [ptlrpc] [Thu Dec 12 01:12:00 2019][240899.988112] [] tgt_request_handle+0x50a/0x1580 [ptlrpc] [Thu Dec 12 01:12:00 2019][240899.995145] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:12:00 2019][240900.002973] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:12:00 2019][240900.009393] [] kthread+0xd1/0xe0 [Thu Dec 12 01:12:00 
2019][240900.014410] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:12:00 2019][240900.020973] [] 0xffffffffffffffff [Thu Dec 12 01:12:00 2019][240900.026094] LustreError: dumping log to /tmp/lustre-log.1576141920.67865 [Thu Dec 12 01:12:01 2019][240901.521950] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117319 to 0x1800000402:1117345 [Thu Dec 12 01:12:16 2019][240916.289468] Pid: 112529, comm: ll_ost00_086 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:12:16 2019][240916.300076] Call Trace: [Thu Dec 12 01:12:16 2019][240916.302647] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:12:16 2019][240916.309643] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:12:16 2019][240916.316830] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:12:16 2019][240916.323491] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:12:16 2019][240916.330243] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:12:16 2019][240916.337767] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:12:16 2019][240916.344864] [] dqget+0x3fa/0x450 [Thu Dec 12 01:12:16 2019][240916.349868] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:12:16 2019][240916.355638] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:12:16 2019][240916.363264] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:12:16 2019][240916.369739] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:12:16 2019][240916.375883] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:12:16 2019][240916.382922] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:12:16 2019][240916.390749] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:12:16 2019][240916.397162] [] kthread+0xd1/0xe0 [Thu Dec 12 01:12:16 2019][240916.402176] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:12:16 2019][240916.408739] [] 0xffffffffffffffff [Thu Dec 12 01:12:16 2019][240916.413853] LustreError: dumping log to /tmp/lustre-log.1576141936.112529 [Thu Dec 12 01:13:01 2019][240961.346362] Pid: 66097, comm: ll_ost01_000 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:13:01 2019][240961.356882] Call Trace: [Thu Dec 12 01:13:01 2019][240961.359434] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:13:01 2019][240961.365827] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:13:01 2019][240961.372875] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:13:01 2019][240961.380680] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:13:01 2019][240961.387113] [] kthread+0xd1/0xe0 [Thu Dec 12 01:13:01 2019][240961.392113] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:13:01 2019][240961.398689] [] 0xffffffffffffffff [Thu Dec 12 01:13:01 2019][240961.403796] LustreError: dumping log to /tmp/lustre-log.1576141981.66097 [Thu Dec 12 01:13:55 2019][241015.435044] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 01:13:55 2019][241015.445224] Lustre: Skipped 494 previous similar messages [Thu Dec 12 01:14:08 2019][241028.022137] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467286 to 0x0:27467361 [Thu Dec 12 01:14:29 2019][241048.732841] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1104956 to 0x1a00000402:1105313 [Thu Dec 12 01:14:48 2019][241068.036509] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 
01:14:48 2019][241068.036509] req@ffff88f8ca67e050 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:318/0 lens 440/0 e 0 to 0 dl 1576142093 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 01:14:48 2019][241068.065164] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 970 previous similar messages [Thu Dec 12 01:14:52 2019][241071.940600] Pid: 67604, comm: ll_ost00_010 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:14:52 2019][241071.951129] Call Trace: [Thu Dec 12 01:14:52 2019][241071.953697] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:14:52 2019][241071.960713] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:14:52 2019][241071.967898] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:14:52 2019][241071.974544] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:14:52 2019][241071.981293] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:14:52 2019][241071.988820] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:14:52 2019][241071.995916] [] dqget+0x3fa/0x450 [Thu Dec 12 01:14:52 2019][241072.000920] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:14:52 2019][241072.006692] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:14:52 2019][241072.014309] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:14:52 2019][241072.020783] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:14:52 2019][241072.026926] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:14:52 2019][241072.033966] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:14:52 2019][241072.041781] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:14:52 2019][241072.048197] [] kthread+0xd1/0xe0 [Thu Dec 12 01:14:52 2019][241072.053213] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:14:52 2019][241072.059790] [] 0xffffffffffffffff [Thu Dec 12 01:14:52 2019][241072.064906] LustreError: dumping log to /tmp/lustre-log.1576142092.67604 [Thu Dec 12 01:15:11 2019][241090.891516] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 01:15:11 2019][241090.900478] Lustre: Skipped 71 previous similar messages [Thu Dec 12 01:15:11 2019][241090.992913] Lustre: fir-OST005c: Connection restored to cec884d3-ca4b-8127-2f6b-7762665aa5f8 (at 10.9.0.64@o2ib4) [Thu Dec 12 01:15:11 2019][241091.003253] Lustre: Skipped 502 previous similar messages [Thu Dec 12 01:15:21 2019][241100.613140] LNet: Service thread pid 67739 was inactive for 1201.53s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
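
Each "Skipped N previous similar messages" line above means the console only shows a rate-limited sample of the reconnect, early-reply and connection-restored traffic. A rough accounting sketch under the same assumption (console output saved as the hypothetical console.log):

#!/usr/bin/env python3
# Count printed reconnect messages per client NID and the totals suppressed
# by console rate limiting. "console.log" is a hypothetical file name.
import re
from collections import Counter

text = re.sub(r"\s+", " ", open("console.log", errors="replace").read())

reconnects = Counter(
    re.findall(r"Client [\w-]+ \(at ([\w.@]+)\) reconnecting", text))
for nid, count in reconnects.most_common():
    print(f"{nid}: {count} reconnect messages printed")

# The skipped totals span every rate-limited message class, not just reconnects.
skipped = sum(int(n) for n in
              re.findall(r"Skipped (\d+) previous similar message", text))
print(f"plus {skipped} messages suppressed by console rate limiting")
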
[Thu Dec 12 01:15:21 2019][241100.626170] LNet: Skipped 22 previous similar messages [Thu Dec 12 01:15:21 2019][241100.631400] LustreError: dumping log to /tmp/lustre-log.1576142121.67739 [Thu Dec 12 01:16:49 2019][241188.704342] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506640 to 0x0:27506657 [Thu Dec 12 01:16:51 2019][241190.726934] Pid: 67743, comm: ll_ost03_040 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:16:51 2019][241190.737459] Call Trace: [Thu Dec 12 01:16:51 2019][241190.740019] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:16:51 2019][241190.746416] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:16:51 2019][241190.753481] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:16:51 2019][241190.761282] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:16:51 2019][241190.767717] [] kthread+0xd1/0xe0 [Thu Dec 12 01:16:51 2019][241190.772738] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:16:51 2019][241190.779333] [] 0xffffffffffffffff [Thu Dec 12 01:16:51 2019][241190.784484] LustreError: dumping log to /tmp/lustre-log.1576142211.67743 [Thu Dec 12 01:17:15 2019][241215.303418] Pid: 67741, comm: ll_ost01_044 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:17:15 2019][241215.313938] Call Trace: [Thu Dec 12 01:17:15 2019][241215.316490] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:17:15 2019][241215.323533] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:17:15 2019][241215.330828] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:17:15 2019][241215.337476] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:17:15 2019][241215.343878] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:17:15 2019][241215.350937] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:17:15 2019][241215.358753] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:17:15 2019][241215.365168] [] kthread+0xd1/0xe0 [Thu Dec 12 01:17:15 2019][241215.370167] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:17:15 2019][241215.376736] [] 0xffffffffffffffff [Thu Dec 12 01:17:15 2019][241215.381845] LustreError: dumping log to /tmp/lustre-log.1576142235.67741 [Thu Dec 12 01:17:21 2019][241220.735308] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785223 to 0x1800000401:11785249 [Thu Dec 12 01:17:27 2019][241227.591675] Pid: 112495, comm: ll_ost00_076 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:17:27 2019][241227.602275] Call Trace: [Thu Dec 12 01:17:27 2019][241227.604843] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:17:27 2019][241227.611842] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:17:28 2019][241227.619027] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:17:28 2019][241227.625674] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:17:28 2019][241227.632425] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:17:28 2019][241227.639949] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:17:28 2019][241227.647045] [] dqget+0x3fa/0x450 [Thu Dec 12 01:17:28 2019][241227.652048] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:17:28 2019][241227.657821] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:17:28 2019][241227.665445] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:17:28 2019][241227.671929] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:17:28 2019][241227.678072] [] 
tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:17:28 2019][241227.685112] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:17:28 2019][241227.692927] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:17:28 2019][241227.699342] [] kthread+0xd1/0xe0 [Thu Dec 12 01:17:28 2019][241227.704359] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:17:28 2019][241227.710923] [] 0xffffffffffffffff [Thu Dec 12 01:17:28 2019][241227.716033] LustreError: dumping log to /tmp/lustre-log.1576142248.112495 [Thu Dec 12 01:17:48 2019][241248.072071] Pid: 67793, comm: ll_ost02_042 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:17:48 2019][241248.082590] Call Trace: [Thu Dec 12 01:17:48 2019][241248.085141] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:17:48 2019][241248.091542] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:17:48 2019][241248.098620] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:17:48 2019][241248.106423] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:17:48 2019][241248.112852] [] kthread+0xd1/0xe0 [Thu Dec 12 01:17:48 2019][241248.117854] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:17:48 2019][241248.124429] [] 0xffffffffffffffff [Thu Dec 12 01:17:48 2019][241248.129539] LustreError: dumping log to /tmp/lustre-log.1576142268.67793 [Thu Dec 12 01:18:21 2019][241280.840714] Pid: 113357, comm: ll_ost02_095 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:18:21 2019][241280.851317] Call Trace: [Thu Dec 12 01:18:21 2019][241280.853885] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:18:21 2019][241280.860889] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:18:21 2019][241280.868082] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:18:21 2019][241280.874750] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:18:21 2019][241280.881501] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:18:21 2019][241280.889027] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:18:21 2019][241280.896121] [] dqget+0x3fa/0x450 [Thu Dec 12 01:18:21 2019][241280.901126] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:18:21 2019][241280.906899] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:18:21 2019][241280.914517] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:18:21 2019][241280.920991] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:18:21 2019][241280.927133] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:18:21 2019][241280.934194] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:18:21 2019][241280.942014] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:18:21 2019][241280.948429] [] kthread+0xd1/0xe0 [Thu Dec 12 01:18:21 2019][241280.953431] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:18:21 2019][241280.960000] [] 0xffffffffffffffff [Thu Dec 12 01:18:21 2019][241280.965098] LustreError: dumping log to /tmp/lustre-log.1576142301.113357 [Thu Dec 12 01:18:25 2019][241284.936832] LustreError: dumping log to /tmp/lustre-log.1576142305.67407 [Thu Dec 12 01:18:29 2019][241289.014892] LustreError: 67855:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576142009, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005a_UUID lock: ffff890f4781a640/0x7066c9c190aea1b8 lrc: 3/0,1 mode: --/PW res: [0x1980000400:0x112ce2:0x0].0x0 rrc: 3 type: EXT 
[0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67855 timeout: 0 lvb_type: 0 [Thu Dec 12 01:18:29 2019][241289.032891] LustreError: dumping log to /tmp/lustre-log.1576142309.67795 [Thu Dec 12 01:18:29 2019][241289.065418] LustreError: 67855:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 1 previous similar message [Thu Dec 12 01:19:06 2019][241326.198724] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11841995 to 0x1980000401:11842017 [Thu Dec 12 01:20:07 2019][241387.338834] LustreError: dumping log to /tmp/lustre-log.1576142407.67700 [Thu Dec 12 01:20:16 2019][241396.226912] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089607 to 0x1980000402:3089633 [Thu Dec 12 01:20:45 2019][241425.227585] LustreError: dumping log to /tmp/lustre-log.1576142445.67588 [Thu Dec 12 01:20:45 2019][241425.571294] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797320 to 0x1900000401:11797345 [Thu Dec 12 01:21:22 2019][241462.040565] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125604 to 0x1980000400:1125633 [Thu Dec 12 01:21:46 2019][241485.644788] LustreError: dumping log to /tmp/lustre-log.1576142506.67790 [Thu Dec 12 01:22:18 2019][241518.413451] LNet: Service thread pid 67699 was inactive for 1203.91s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 01:22:18 2019][241518.430559] LNet: Skipped 9 previous similar messages [Thu Dec 12 01:22:18 2019][241518.435707] Pid: 67699, comm: ll_ost00_034 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:22:18 2019][241518.446243] Call Trace: [Thu Dec 12 01:22:18 2019][241518.448817] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:22:18 2019][241518.455823] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:22:18 2019][241518.463006] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:22:18 2019][241518.469657] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:22:18 2019][241518.476405] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:22:18 2019][241518.483945] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:22:18 2019][241518.491050] [] dqget+0x3fa/0x450 [Thu Dec 12 01:22:18 2019][241518.496054] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:22:18 2019][241518.501826] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:22:18 2019][241518.509447] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:22:18 2019][241518.515929] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:22:18 2019][241518.522073] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:22:18 2019][241518.529109] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:22:18 2019][241518.536911] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:22:18 2019][241518.543345] [] kthread+0xd1/0xe0 [Thu Dec 12 01:22:18 2019][241518.548351] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:22:18 2019][241518.554920] [] 0xffffffffffffffff [Thu Dec 12 01:22:18 2019][241518.560034] LustreError: dumping log to /tmp/lustre-log.1576142538.67699 [Thu Dec 12 01:22:41 2019][241540.726529] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122052 to 0x1a80000401:1122081 [Thu Dec 12 01:23:56 2019][241616.406994] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 01:23:56 
2019][241616.417195] Lustre: Skipped 536 previous similar messages [Thu Dec 12 01:24:08 2019][241627.983609] Pid: 67775, comm: ll_ost03_044 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:24:08 2019][241627.994133] Call Trace: [Thu Dec 12 01:24:08 2019][241627.996685] [] ldlm_completion_ast+0x430/0x860 [ptlrpc] [Thu Dec 12 01:24:08 2019][241628.003726] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:24:08 2019][241628.011024] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:24:08 2019][241628.017680] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:24:08 2019][241628.024081] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:24:08 2019][241628.031122] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:24:08 2019][241628.038937] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:24:08 2019][241628.045351] [] kthread+0xd1/0xe0 [Thu Dec 12 01:24:08 2019][241628.050353] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:24:08 2019][241628.056928] [] 0xffffffffffffffff [Thu Dec 12 01:24:08 2019][241628.062037] LustreError: dumping log to /tmp/lustre-log.1576142648.67775 [Thu Dec 12 01:24:17 2019][241637.199792] Pid: 112496, comm: ll_ost02_068 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:24:17 2019][241637.210396] Call Trace: [Thu Dec 12 01:24:17 2019][241637.212948] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:24:17 2019][241637.219990] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:24:17 2019][241637.227285] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:24:17 2019][241637.233935] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:24:17 2019][241637.240335] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:24:17 2019][241637.247377] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:24:17 2019][241637.255191] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:24:17 2019][241637.261609] [] kthread+0xd1/0xe0 [Thu Dec 12 01:24:17 2019][241637.266608] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:24:17 2019][241637.273201] [] 0xffffffffffffffff [Thu Dec 12 01:24:17 2019][241637.278320] LustreError: dumping log to /tmp/lustre-log.1576142657.112496 [Thu Dec 12 01:24:29 2019][241649.488033] Pid: 67845, comm: ll_ost01_063 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:24:29 2019][241649.498553] Call Trace: [Thu Dec 12 01:24:29 2019][241649.501114] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:24:29 2019][241649.508154] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:24:29 2019][241649.515451] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:24:29 2019][241649.522108] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:24:29 2019][241649.528513] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:24:29 2019][241649.535568] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:24:29 2019][241649.543384] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:24:29 2019][241649.549799] [] kthread+0xd1/0xe0 [Thu Dec 12 01:24:29 2019][241649.554800] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:24:29 2019][241649.561377] [] 0xffffffffffffffff [Thu Dec 12 01:24:29 2019][241649.566486] LustreError: dumping log to /tmp/lustre-log.1576142669.67845 [Thu Dec 12 01:24:30 2019][241650.328678] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105317 to 0x1a00000402:1105345 [Thu Dec 12 01:24:37 
2019][241657.552831] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117354 to 0x1800000402:1117377 [Thu Dec 12 01:24:49 2019][241668.648423] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 01:24:49 2019][241668.648423] req@ffff88f298f11050 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:164/0 lens 440/0 e 0 to 0 dl 1576142694 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 01:24:49 2019][241668.677062] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1066 previous similar messages [Thu Dec 12 01:24:54 2019][241674.064536] Pid: 67692, comm: ll_ost00_032 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:24:54 2019][241674.075058] Call Trace: [Thu Dec 12 01:24:54 2019][241674.077624] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:24:54 2019][241674.084616] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:24:54 2019][241674.091819] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:24:54 2019][241674.098483] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:24:54 2019][241674.105234] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:24:54 2019][241674.112775] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:24:54 2019][241674.119871] [] dqget+0x3fa/0x450 [Thu Dec 12 01:24:54 2019][241674.124874] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:24:54 2019][241674.130647] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:24:54 2019][241674.138271] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:24:54 2019][241674.144757] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:24:54 2019][241674.150899] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:24:54 2019][241674.157938] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:24:54 2019][241674.165755] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:24:54 2019][241674.172170] [] kthread+0xd1/0xe0 [Thu Dec 12 01:24:54 2019][241674.177185] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:24:54 2019][241674.183748] [] 0xffffffffffffffff [Thu Dec 12 01:24:54 2019][241674.188862] LustreError: dumping log to /tmp/lustre-log.1576142694.67692 [Thu Dec 12 01:25:12 2019][241691.962836] Lustre: fir-OST005c: Connection restored to cec884d3-ca4b-8127-2f6b-7762665aa5f8 (at 10.9.0.64@o2ib4) [Thu Dec 12 01:25:12 2019][241691.973189] Lustre: Skipped 546 previous similar messages [Thu Dec 12 01:25:13 2019][241693.015409] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 01:25:13 2019][241693.024375] Lustre: Skipped 71 previous similar messages [Thu Dec 12 01:25:39 2019][241719.121448] LNet: Service thread pid 67918 was inactive for 1203.27s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
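
Every "LustreError: dumping log to /tmp/lustre-log.<epoch>.<pid>" line refers to a binary Lustre debug dump written on this OSS at that moment; such dumps are normally decoded on the server with lctl debug_file. A small sketch, again assuming the console output is saved as the hypothetical console.log, that lists the referenced dumps with wall-clock time and owning pid so the right files can be collected:

#!/usr/bin/env python3
# List the binary debug dumps referenced above with their wall-clock time.
# "console.log" is a hypothetical name for the saved console output; the
# dump files themselves live on the OSS under /tmp.
import re
from datetime import datetime, timezone

text = re.sub(r"\s+", " ", open("console.log", errors="replace").read())

for path, epoch, pid in re.findall(
        r"dumping log to (/tmp/lustre-log\.(\d+)\.(\d+))", text):
    when = datetime.fromtimestamp(int(epoch), tz=timezone.utc)
    print(f"{when:%Y-%m-%d %H:%M:%S} UTC  pid {pid:>6}  {path}")
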
[Thu Dec 12 01:25:39 2019][241719.134481] LNet: Skipped 6 previous similar messages [Thu Dec 12 01:25:39 2019][241719.139625] LustreError: dumping log to /tmp/lustre-log.1576142739.67918 [Thu Dec 12 01:26:45 2019][241784.612142] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467363 to 0x0:27467393 [Thu Dec 12 01:27:30 2019][241829.715609] Pid: 67652, comm: ll_ost00_020 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:27:30 2019][241829.726126] Call Trace: [Thu Dec 12 01:27:30 2019][241829.728695] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:27:30 2019][241829.735694] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:27:30 2019][241829.742896] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:27:30 2019][241829.749543] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:27:30 2019][241829.756296] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:27:30 2019][241829.763816] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:27:30 2019][241829.770912] [] dqget+0x3fa/0x450 [Thu Dec 12 01:27:30 2019][241829.775929] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:27:30 2019][241829.781733] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:27:30 2019][241829.789340] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:27:30 2019][241829.795828] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:27:30 2019][241829.801975] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:27:30 2019][241829.809029] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:27:30 2019][241829.816831] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:27:30 2019][241829.823258] [] kthread+0xd1/0xe0 [Thu Dec 12 01:27:30 2019][241829.828255] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:27:30 2019][241829.834832] [] 0xffffffffffffffff [Thu Dec 12 01:27:30 2019][241829.839945] LustreError: dumping log to /tmp/lustre-log.1576142850.67652 [Thu Dec 12 01:27:58 2019][241858.388217] Pid: 112491, comm: ll_ost00_072 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:27:58 2019][241858.398823] Call Trace: [Thu Dec 12 01:27:58 2019][241858.401395] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:27:58 2019][241858.408393] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:27:58 2019][241858.415577] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:27:58 2019][241858.422217] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:27:58 2019][241858.428966] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:27:58 2019][241858.436498] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:27:58 2019][241858.443580] [] dqget+0x3fa/0x450 [Thu Dec 12 01:27:58 2019][241858.448588] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:27:58 2019][241858.454362] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:27:58 2019][241858.461986] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:27:58 2019][241858.468462] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:27:58 2019][241858.474608] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:27:58 2019][241858.481643] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:27:58 2019][241858.489444] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:27:58 2019][241858.495874] [] kthread+0xd1/0xe0 [Thu Dec 12 01:27:58 2019][241858.500867] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:27:58 2019][241858.507435] [] 0xffffffffffffffff 
[Thu Dec 12 01:27:58 2019][241858.512539] LustreError: dumping log to /tmp/lustre-log.1576142878.112491 [Thu Dec 12 01:29:24 2019][241944.405877] Pid: 66337, comm: ll_ost03_004 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:29:24 2019][241944.416400] Call Trace: [Thu Dec 12 01:29:24 2019][241944.418953] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:29:24 2019][241944.425371] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:29:24 2019][241944.432432] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:29:24 2019][241944.440246] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:29:24 2019][241944.446677] [] kthread+0xd1/0xe0 [Thu Dec 12 01:29:24 2019][241944.451682] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:29:24 2019][241944.458276] [] 0xffffffffffffffff [Thu Dec 12 01:29:24 2019][241944.463397] LustreError: dumping log to /tmp/lustre-log.1576142964.66337 [Thu Dec 12 01:29:24 2019][241944.471028] Pid: 112544, comm: ll_ost03_072 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:29:24 2019][241944.481667] Call Trace: [Thu Dec 12 01:29:24 2019][241944.484218] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:29:24 2019][241944.491210] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:29:24 2019][241944.498399] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:29:24 2019][241944.505049] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:29:24 2019][241944.511815] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:29:24 2019][241944.519343] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:29:24 2019][241944.526450] [] dqget+0x3fa/0x450 [Thu Dec 12 01:29:24 2019][241944.531442] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:29:24 2019][241944.537216] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:29:24 2019][241944.544832] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:29:24 2019][241944.551304] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:29:24 2019][241944.557449] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:29:24 2019][241944.564470] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:29:24 2019][241944.572286] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:29:24 2019][241944.578701] [] kthread+0xd1/0xe0 [Thu Dec 12 01:29:24 2019][241944.583695] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:29:25 2019][241944.590277] [] 0xffffffffffffffff [Thu Dec 12 01:29:25 2019][241945.106104] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506661 to 0x0:27506689 [Thu Dec 12 01:29:29 2019][241948.940996] LustreError: 67630:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576142669, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0058_UUID lock: ffff88fcea0e69c0/0x7066c9c190aed28e lrc: 3/0,1 mode: --/PW res: [0x1900000400:0x1127a7:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67630 timeout: 0 lvb_type: 0 [Thu Dec 12 01:29:29 2019][241948.984752] LustreError: 67630:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 8 previous similar messages [Thu Dec 12 01:29:32 2019][241952.598043] Pid: 67707, comm: ll_ost03_033 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:29:32 2019][241952.608559] Call Trace: [Thu Dec 12 01:29:32 
2019][241952.611130] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:29:32 2019][241952.618128] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:29:32 2019][241952.625311] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:29:32 2019][241952.631959] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:29:33 2019][241952.638709] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:29:33 2019][241952.646234] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:29:33 2019][241952.653331] [] dqget+0x3fa/0x450 [Thu Dec 12 01:29:33 2019][241952.658333] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:29:33 2019][241952.664119] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:29:33 2019][241952.671732] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:29:33 2019][241952.678229] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:29:33 2019][241952.684359] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:29:33 2019][241952.691421] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:29:33 2019][241952.699223] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:29:33 2019][241952.705666] [] kthread+0xd1/0xe0 [Thu Dec 12 01:29:33 2019][241952.710672] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:29:33 2019][241952.717248] [] 0xffffffffffffffff [Thu Dec 12 01:29:33 2019][241952.722349] LustreError: dumping log to /tmp/lustre-log.1576142973.67707 [Thu Dec 12 01:29:37 2019][241956.694121] LustreError: dumping log to /tmp/lustre-log.1576142977.67791 [Thu Dec 12 01:29:45 2019][241964.886285] LustreError: dumping log to /tmp/lustre-log.1576142985.67628 [Thu Dec 12 01:29:57 2019][241976.707678] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785256 to 0x1800000401:11785281 [Thu Dec 12 01:30:01 2019][241980.782343] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3048832 to 0x1a00000401:3048865 [Thu Dec 12 01:30:05 2019][241985.366702] LustreError: dumping log to /tmp/lustre-log.1576143005.67871 [Thu Dec 12 01:30:26 2019][242005.847095] LustreError: dumping log to /tmp/lustre-log.1576143026.67797 [Thu Dec 12 01:31:23 2019][242063.108364] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125639 to 0x1980000400:1125665 [Thu Dec 12 01:31:42 2019][242082.336325] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842023 to 0x1980000401:11842049 [Thu Dec 12 01:32:16 2019][242116.441290] LustreError: dumping log to /tmp/lustre-log.1576143136.66715 [Thu Dec 12 01:32:41 2019][242141.017776] LNet: Service thread pid 67842 was inactive for 1200.82s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 01:32:41 2019][242141.034883] LNet: Skipped 9 previous similar messages [Thu Dec 12 01:32:41 2019][242141.040025] Pid: 67842, comm: ll_ost00_056 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:32:41 2019][242141.050578] Call Trace: [Thu Dec 12 01:32:41 2019][242141.053143] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:32:41 2019][242141.060138] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:32:41 2019][242141.067318] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:32:41 2019][242141.073971] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:32:41 2019][242141.080719] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:32:41 2019][242141.088245] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:32:41 2019][242141.095339] [] dqget+0x3fa/0x450 [Thu Dec 12 01:32:41 2019][242141.100342] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:32:41 2019][242141.106114] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:32:41 2019][242141.113742] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:32:41 2019][242141.120217] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:32:41 2019][242141.126371] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:32:41 2019][242141.133407] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:32:41 2019][242141.141220] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:32:41 2019][242141.147638] [] kthread+0xd1/0xe0 [Thu Dec 12 01:32:41 2019][242141.152639] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:32:41 2019][242141.159209] [] 0xffffffffffffffff [Thu Dec 12 01:32:41 2019][242141.164306] LustreError: dumping log to /tmp/lustre-log.1576143161.67842 [Thu Dec 12 01:32:41 2019][242141.171832] Pid: 67619, comm: ll_ost00_013 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:32:41 2019][242141.182378] Call Trace: [Thu Dec 12 01:32:41 2019][242141.184931] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:32:41 2019][242141.191924] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:32:41 2019][242141.199123] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:32:41 2019][242141.205779] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:32:41 2019][242141.212532] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:32:41 2019][242141.220049] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:32:41 2019][242141.227144] [] dqget+0x3fa/0x450 [Thu Dec 12 01:32:41 2019][242141.232144] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:32:41 2019][242141.237920] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:32:41 2019][242141.245538] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:32:41 2019][242141.252016] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:32:41 2019][242141.258180] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:32:41 2019][242141.265216] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:32:41 2019][242141.273022] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:32:41 2019][242141.279454] [] kthread+0xd1/0xe0 [Thu Dec 12 01:32:41 2019][242141.284445] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:32:41 2019][242141.291016] [] 0xffffffffffffffff [Thu Dec 12 01:32:52 2019][242151.665744] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089644 to 0x1980000402:3089665 [Thu Dec 12 01:33:21 2019][242180.690249] 
Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797355 to 0x1900000401:11797377 [Thu Dec 12 01:33:24 2019][242183.682367] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3085791 to 0x1900000402:3085857 [Thu Dec 12 01:33:30 2019][242190.170744] Pid: 67855, comm: ll_ost01_065 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:33:30 2019][242190.181262] Call Trace: [Thu Dec 12 01:33:30 2019][242190.183819] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:33:30 2019][242190.190863] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:33:30 2019][242190.198161] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:33:30 2019][242190.204816] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:33:30 2019][242190.211220] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:33:30 2019][242190.218258] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:33:30 2019][242190.226084] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:33:30 2019][242190.232514] [] kthread+0xd1/0xe0 [Thu Dec 12 01:33:30 2019][242190.237516] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:33:30 2019][242190.244093] [] 0xffffffffffffffff [Thu Dec 12 01:33:30 2019][242190.249202] LustreError: dumping log to /tmp/lustre-log.1576143210.67855 [Thu Dec 12 01:33:34 2019][242194.266835] Pid: 67672, comm: ll_ost03_024 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:33:34 2019][242194.277357] Call Trace: [Thu Dec 12 01:33:34 2019][242194.279928] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:33:34 2019][242194.286932] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:33:34 2019][242194.294120] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:33:34 2019][242194.300765] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:33:34 2019][242194.307515] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:33:34 2019][242194.315041] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:33:34 2019][242194.322161] [] dqget+0x3fa/0x450 [Thu Dec 12 01:33:34 2019][242194.327165] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:33:34 2019][242194.332936] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:33:34 2019][242194.340568] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:33:34 2019][242194.347047] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:33:34 2019][242194.353189] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:33:34 2019][242194.360236] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:33:34 2019][242194.368054] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:33:34 2019][242194.374484] [] kthread+0xd1/0xe0 [Thu Dec 12 01:33:34 2019][242194.379501] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:33:34 2019][242194.386064] [] 0xffffffffffffffff [Thu Dec 12 01:33:34 2019][242194.391177] LustreError: dumping log to /tmp/lustre-log.1576143214.67672 [Thu Dec 12 01:33:38 2019][242198.362921] Pid: 67052, comm: ll_ost02_007 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:33:38 2019][242198.373438] Call Trace: [Thu Dec 12 01:33:38 2019][242198.376007] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:33:38 2019][242198.383005] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:33:38 2019][242198.390186] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:33:38 2019][242198.396840] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] 
[Thu Dec 12 01:33:38 2019][242198.403578] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:33:38 2019][242198.411118] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:33:38 2019][242198.418225] [] dqget+0x3fa/0x450 [Thu Dec 12 01:33:38 2019][242198.423227] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:33:38 2019][242198.429015] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:33:38 2019][242198.436626] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:33:38 2019][242198.443115] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:33:38 2019][242198.449247] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:33:38 2019][242198.456307] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:33:38 2019][242198.464110] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:33:38 2019][242198.470551] [] kthread+0xd1/0xe0 [Thu Dec 12 01:33:38 2019][242198.475548] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:33:38 2019][242198.482125] [] 0xffffffffffffffff [Thu Dec 12 01:33:38 2019][242198.487227] LustreError: dumping log to /tmp/lustre-log.1576143218.67052 [Thu Dec 12 01:33:42 2019][242202.459007] LustreError: dumping log to /tmp/lustre-log.1576143222.67784 [Thu Dec 12 01:33:57 2019][242217.378925] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 01:33:57 2019][242217.389106] Lustre: Skipped 568 previous similar messages [Thu Dec 12 01:34:23 2019][242243.419813] LustreError: dumping log to /tmp/lustre-log.1576143263.113358 [Thu Dec 12 01:34:31 2019][242250.676570] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105356 to 0x1a00000402:1105377 [Thu Dec 12 01:34:50 2019][242270.612364] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 01:34:50 2019][242270.612364] req@ffff88f60b563050 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:10/0 lens 440/0 e 0 to 0 dl 1576143295 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 01:34:51 2019][242270.640911] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1185 previous similar messages [Thu Dec 12 01:34:52 2019][242272.092384] LustreError: dumping log to /tmp/lustre-log.1576143292.67804 [Thu Dec 12 01:35:04 2019][242284.380627] LustreError: dumping log to /tmp/lustre-log.1576143304.67883 [Thu Dec 12 01:35:12 2019][242292.487208] Lustre: fir-OST0058: Connection restored to a8841932-bc4a-ab11-1ace-8e1fdda46930 (at 10.8.23.23@o2ib6) [Thu Dec 12 01:35:12 2019][242292.497646] Lustre: Skipped 591 previous similar messages [Thu Dec 12 01:35:15 2019][242295.139417] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 01:35:15 2019][242295.148382] Lustre: Skipped 71 previous similar messages [Thu Dec 12 01:35:17 2019][242296.949481] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122085 to 0x1a80000401:1122113 [Thu Dec 12 01:36:14 2019][242354.014010] LNet: Service thread pid 67633 was inactive for 1200.75s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
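
The watchdog lines report how long each service thread has been inactive, and the ldlm_expired_completion_wait lines carry the exact enqueue epoch of the lock that never completed, so the start of each stall can be back-computed from the log itself. A timeline sketch under the same console.log assumption (bracketed console times are taken to be the server's local clock):

#!/usr/bin/env python3
# Rough stall timeline: watchdog stall start = bracketed console time minus
# the reported inactivity; expired lock waits print their enqueue epoch.
# "console.log" is a hypothetical name for the saved console output.
import re
from datetime import datetime, timedelta, timezone

text = re.sub(r"\s+", " ", open("console.log", errors="replace").read())

watchdog_re = re.compile(
    r"\[(\w{3} \w{3} +\d+ \d\d:\d\d:\d\d \d{4})\]\[ *[\d.]+\] "
    r"LNet: Service thread pid (\d+) was inactive for ([\d.]+)s")
for stamp, pid, secs in watchdog_re.findall(text):
    seen = datetime.strptime(stamp, "%a %b %d %H:%M:%S %Y")
    start = seen - timedelta(seconds=float(secs))
    print(f"pid {pid}: inactive {secs}s, stalled since ~{start} (server local time)")

timeout_re = re.compile(
    r"LustreError: (\d+):0:\(ldlm_request\.c:\d+:ldlm_expired_completion_wait\(\)\) "
    r"### lock timed out \(enqueued at (\d+), (\d+)s ago\)")
for pid, epoch, wait in timeout_re.findall(text):
    enq = datetime.fromtimestamp(int(epoch), tz=timezone.utc)
    print(f"pid {pid}: lock enqueued {enq:%Y-%m-%d %H:%M:%S} UTC, still waiting {wait}s later")
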
[Thu Dec 12 01:36:14 2019][242354.027048] LNet: Skipped 9 previous similar messages [Thu Dec 12 01:36:14 2019][242354.032195] LustreError: dumping log to /tmp/lustre-log.1576143374.67633 [Thu Dec 12 01:36:22 2019][242362.206172] LustreError: dumping log to /tmp/lustre-log.1576143382.67841 [Thu Dec 12 01:36:59 2019][242399.070925] LustreError: dumping log to /tmp/lustre-log.1576143419.112508 [Thu Dec 12 01:37:05 2019][242405.463637] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124270 to 0x1900000400:1124353 [Thu Dec 12 01:37:07 2019][242407.263092] LustreError: dumping log to /tmp/lustre-log.1576143427.67849 [Thu Dec 12 01:37:11 2019][242411.359167] LustreError: dumping log to /tmp/lustre-log.1576143431.66108 [Thu Dec 12 01:37:13 2019][242412.759801] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117382 to 0x1800000402:1117409 [Thu Dec 12 01:37:24 2019][242423.910203] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3081886 to 0x1a80000402:3081953 [Thu Dec 12 01:37:28 2019][242427.743515] LustreError: dumping log to /tmp/lustre-log.1576143448.67697 [Thu Dec 12 01:38:17 2019][242476.896501] Pid: 67037, comm: ll_ost01_006 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:38:17 2019][242476.907023] Call Trace: [Thu Dec 12 01:38:17 2019][242476.909577] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:38:17 2019][242476.915976] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:38:17 2019][242476.923040] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:38:17 2019][242476.930846] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:38:17 2019][242476.937273] [] kthread+0xd1/0xe0 [Thu Dec 12 01:38:17 2019][242476.942278] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:38:17 2019][242476.948852] [] 0xffffffffffffffff [Thu Dec 12 01:38:17 2019][242476.953963] LustreError: dumping log to /tmp/lustre-log.1576143497.67037 [Thu Dec 12 01:38:33 2019][242493.280855] Pid: 66105, comm: ll_ost01_001 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:38:33 2019][242493.291379] Call Trace: [Thu Dec 12 01:38:33 2019][242493.293954] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:38:33 2019][242493.300954] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:38:33 2019][242493.308138] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:38:33 2019][242493.314786] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:38:33 2019][242493.321534] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:38:33 2019][242493.329059] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:38:33 2019][242493.336156] [] dqget+0x3fa/0x450 [Thu Dec 12 01:38:33 2019][242493.341161] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:38:33 2019][242493.346931] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:38:33 2019][242493.354550] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:38:33 2019][242493.361024] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:38:33 2019][242493.367169] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:38:33 2019][242493.374214] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:38:33 2019][242493.382030] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:38:33 2019][242493.388447] [] kthread+0xd1/0xe0 [Thu Dec 12 01:38:33 2019][242493.393461] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:38:33 2019][242493.400026] [] 0xffffffffffffffff [Thu Dec 12 01:38:33 
2019][242493.405139] LustreError: dumping log to /tmp/lustre-log.1576143513.66105 [Thu Dec 12 01:38:51 2019][242511.352773] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800376 to 0x1a80000400:11800449 [Thu Dec 12 01:39:20 2019][242539.832472] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467396 to 0x0:27467425 [Thu Dec 12 01:39:39 2019][242558.818134] Pid: 67831, comm: ll_ost02_049 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:39:39 2019][242558.828654] Call Trace: [Thu Dec 12 01:39:39 2019][242558.831215] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:39:39 2019][242558.838253] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:39:39 2019][242558.845552] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:39:39 2019][242558.852200] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:39:39 2019][242558.858614] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:39:39 2019][242558.865652] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:39:39 2019][242558.873464] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:39:39 2019][242558.879881] [] kthread+0xd1/0xe0 [Thu Dec 12 01:39:39 2019][242558.884882] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:39:39 2019][242558.891451] [] 0xffffffffffffffff [Thu Dec 12 01:39:39 2019][242558.896558] LustreError: dumping log to /tmp/lustre-log.1576143579.67831 [Thu Dec 12 01:40:03 2019][242583.394627] Pid: 112493, comm: ll_ost00_074 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:40:03 2019][242583.405241] Call Trace: [Thu Dec 12 01:40:03 2019][242583.407815] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:40:03 2019][242583.414814] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:40:03 2019][242583.422015] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:40:03 2019][242583.428666] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:40:03 2019][242583.435414] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:40:03 2019][242583.442938] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:40:03 2019][242583.450034] [] dqget+0x3fa/0x450 [Thu Dec 12 01:40:03 2019][242583.455038] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:40:03 2019][242583.460818] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:40:03 2019][242583.468434] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:40:03 2019][242583.474911] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:40:03 2019][242583.481068] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:40:03 2019][242583.488118] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:40:03 2019][242583.495933] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:40:03 2019][242583.502349] [] kthread+0xd1/0xe0 [Thu Dec 12 01:40:03 2019][242583.507349] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:40:03 2019][242583.513927] [] 0xffffffffffffffff [Thu Dec 12 01:40:03 2019][242583.519027] LustreError: dumping log to /tmp/lustre-log.1576143603.112493 [Thu Dec 12 01:40:28 2019][242607.971110] Pid: 67805, comm: ll_ost02_045 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:40:28 2019][242607.981631] Call Trace: [Thu Dec 12 01:40:28 2019][242607.984183] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:40:28 2019][242607.990576] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:40:28 2019][242607.997652] [] 
ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:40:28 2019][242608.005455] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:40:28 2019][242608.011883] [] kthread+0xd1/0xe0 [Thu Dec 12 01:40:28 2019][242608.016887] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:40:28 2019][242608.023463] [] 0xffffffffffffffff [Thu Dec 12 01:40:28 2019][242608.028574] LustreError: dumping log to /tmp/lustre-log.1576143628.67805 [Thu Dec 12 01:40:32 2019][242612.067191] LustreError: dumping log to /tmp/lustre-log.1576143632.112505 [Thu Dec 12 01:40:48 2019][242627.572508] LustreError: 67625:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576143347, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005c_UUID lock: ffff8910b6c8a880/0x7066c9c190af09a7 lrc: 3/0,1 mode: --/PW res: [0x1a00000401:0x2e857d:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67625 timeout: 0 lvb_type: 0 [Thu Dec 12 01:40:48 2019][242627.616229] LustreError: 67625:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 3 previous similar messages [Thu Dec 12 01:40:52 2019][242632.547610] LustreError: dumping log to /tmp/lustre-log.1576143652.112539 [Thu Dec 12 01:41:24 2019][242664.408412] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125670 to 0x1980000400:1125697 [Thu Dec 12 01:42:01 2019][242701.134334] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506691 to 0x0:27506721 [Thu Dec 12 01:42:02 2019][242702.181039] LustreError: dumping log to /tmp/lustre-log.1576143722.67765 [Thu Dec 12 01:42:18 2019][242718.565377] LustreError: dumping log to /tmp/lustre-log.1576143738.67767 [Thu Dec 12 01:42:33 2019][242733.250818] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785289 to 0x1800000401:11785313 [Thu Dec 12 01:42:37 2019][242737.645523] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3048873 to 0x1a00000401:3048897 [Thu Dec 12 01:42:39 2019][242739.045809] LustreError: dumping log to /tmp/lustre-log.1576143759.67673 [Thu Dec 12 01:43:58 2019][242818.350993] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 01:43:58 2019][242818.361176] Lustre: Skipped 565 previous similar messages [Thu Dec 12 01:44:17 2019][242837.351756] LNet: Service thread pid 112532 was inactive for 1204.12s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 01:44:17 2019][242837.368972] LNet: Skipped 9 previous similar messages [Thu Dec 12 01:44:17 2019][242837.374125] Pid: 112532, comm: ll_ost01_086 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:44:17 2019][242837.384932] Call Trace: [Thu Dec 12 01:44:17 2019][242837.387513] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:44:17 2019][242837.394522] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:44:17 2019][242837.401822] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:44:17 2019][242837.408564] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:44:17 2019][242837.415305] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:44:17 2019][242837.422887] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:44:17 2019][242837.429981] [] dqget+0x3fa/0x450 [Thu Dec 12 01:44:17 2019][242837.435086] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:44:17 2019][242837.440882] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:44:17 2019][242837.448552] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:44:17 2019][242837.455035] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:44:17 2019][242837.461281] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:44:17 2019][242837.468357] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:44:17 2019][242837.476239] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:44:17 2019][242837.482674] [] kthread+0xd1/0xe0 [Thu Dec 12 01:44:17 2019][242837.487708] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:44:17 2019][242837.494294] [] 0xffffffffffffffff [Thu Dec 12 01:44:17 2019][242837.499501] LustreError: dumping log to /tmp/lustre-log.1576143857.112532 [Thu Dec 12 01:44:18 2019][242837.887424] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842056 to 0x1980000401:11842081 [Thu Dec 12 01:44:25 2019][242845.543923] Pid: 67620, comm: ll_ost01_014 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:44:25 2019][242845.554442] Call Trace: [Thu Dec 12 01:44:25 2019][242845.557016] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:44:25 2019][242845.564008] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:44:25 2019][242845.571198] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:44:25 2019][242845.577866] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:44:25 2019][242845.584618] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:44:25 2019][242845.592143] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:44:25 2019][242845.599236] [] dqget+0x3fa/0x450 [Thu Dec 12 01:44:25 2019][242845.604242] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:44:25 2019][242845.610027] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:44:25 2019][242845.617639] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:44:25 2019][242845.624127] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:44:25 2019][242845.630259] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:44:26 2019][242845.637333] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:44:26 2019][242845.645131] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:44:26 2019][242845.651558] [] kthread+0xd1/0xe0 [Thu Dec 12 01:44:26 2019][242845.656564] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:44:26 2019][242845.663147] [] 0xffffffffffffffff [Thu Dec 12 01:44:26 
2019][242845.668257] LustreError: dumping log to /tmp/lustre-log.1576143866.67620 [Thu Dec 12 01:44:30 2019][242849.640011] Pid: 67925, comm: ll_ost00_068 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:44:30 2019][242849.650531] Call Trace: [Thu Dec 12 01:44:30 2019][242849.653101] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:44:30 2019][242849.660099] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:44:30 2019][242849.667299] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:44:30 2019][242849.673949] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:44:30 2019][242849.680695] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:44:30 2019][242849.688225] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:44:30 2019][242849.695320] [] dqget+0x3fa/0x450 [Thu Dec 12 01:44:30 2019][242849.700321] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:44:30 2019][242849.706110] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:44:30 2019][242849.713720] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:44:30 2019][242849.720219] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:44:30 2019][242849.726349] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:44:30 2019][242849.733420] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:44:30 2019][242849.741229] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:44:30 2019][242849.747643] [] kthread+0xd1/0xe0 [Thu Dec 12 01:44:30 2019][242849.752659] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:44:30 2019][242849.759223] [] 0xffffffffffffffff [Thu Dec 12 01:44:30 2019][242849.764335] LustreError: dumping log to /tmp/lustre-log.1576143870.67925 [Thu Dec 12 01:44:30 2019][242849.771700] Pid: 67630, comm: ll_ost01_018 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:44:30 2019][242849.782223] Call Trace: [Thu Dec 12 01:44:30 2019][242849.784770] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:44:30 2019][242849.791792] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:44:30 2019][242849.799117] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:44:30 2019][242849.805763] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:44:30 2019][242849.812178] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:44:30 2019][242849.819204] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:44:30 2019][242849.827033] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:44:30 2019][242849.833445] [] kthread+0xd1/0xe0 [Thu Dec 12 01:44:30 2019][242849.838451] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:44:30 2019][242849.845007] [] 0xffffffffffffffff [Thu Dec 12 01:44:32 2019][242852.570711] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105381 to 0x1a00000402:1105409 [Thu Dec 12 01:44:51 2019][242871.236451] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 01:44:51 2019][242871.236451] req@ffff88f103e63850 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:611/0 lens 440/0 e 0 to 0 dl 1576143896 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 01:44:51 2019][242871.265086] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1183 previous similar messages [Thu Dec 12 01:44:54 2019][242874.216504] Pid: 67631, comm: ll_ost00_016 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 
PST 2019 [Thu Dec 12 01:44:54 2019][242874.227023] Call Trace: [Thu Dec 12 01:44:54 2019][242874.229600] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:44:54 2019][242874.236597] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:44:54 2019][242874.243781] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:44:54 2019][242874.250431] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:44:54 2019][242874.257179] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:44:54 2019][242874.264704] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:44:54 2019][242874.271799] [] dqget+0x3fa/0x450 [Thu Dec 12 01:44:54 2019][242874.276819] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:44:54 2019][242874.282607] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:44:54 2019][242874.290219] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:44:54 2019][242874.296708] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:44:54 2019][242874.302838] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:44:54 2019][242874.309882] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:44:54 2019][242874.317685] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:44:54 2019][242874.324111] [] kthread+0xd1/0xe0 [Thu Dec 12 01:44:54 2019][242874.329117] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:44:54 2019][242874.335708] [] 0xffffffffffffffff [Thu Dec 12 01:44:54 2019][242874.340826] LustreError: dumping log to /tmp/lustre-log.1576143894.67631 [Thu Dec 12 01:45:14 2019][242893.983421] Lustre: fir-OST005c: Connection restored to e2e512e9-5e98-1086-a71a-3e4545e26e0b (at 10.8.25.1@o2ib6) [Thu Dec 12 01:45:14 2019][242893.993788] Lustre: Skipped 625 previous similar messages [Thu Dec 12 01:45:17 2019][242897.264212] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 01:45:17 2019][242897.273187] Lustre: Skipped 71 previous similar messages [Thu Dec 12 01:45:19 2019][242898.792992] LustreError: dumping log to /tmp/lustre-log.1576143919.66096 [Thu Dec 12 01:45:28 2019][242907.918468] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089681 to 0x1980000402:3089697 [Thu Dec 12 01:45:57 2019][242936.801426] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797383 to 0x1900000401:11797409 [Thu Dec 12 01:46:00 2019][242939.953635] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3085869 to 0x1900000402:3085889 [Thu Dec 12 01:46:16 2019][242956.138147] LNet: Service thread pid 67663 was inactive for 1201.35s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
[Thu Dec 12 01:46:16 2019][242956.151186] LNet: Skipped 14 previous similar messages [Thu Dec 12 01:46:16 2019][242956.156422] LustreError: dumping log to /tmp/lustre-log.1576143976.67663 [Thu Dec 12 01:47:01 2019][243001.195060] LustreError: dumping log to /tmp/lustre-log.1576144021.67679 [Thu Dec 12 01:47:25 2019][243024.833392] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3081963 to 0x1a80000402:3081985 [Thu Dec 12 01:47:26 2019][243025.771548] LustreError: dumping log to /tmp/lustre-log.1576144046.67637 [Thu Dec 12 01:47:30 2019][243029.867631] LustreError: dumping log to /tmp/lustre-log.1576144050.67668 [Thu Dec 12 01:47:54 2019][243053.796775] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122117 to 0x1a80000401:1122145 [Thu Dec 12 01:49:29 2019][243149.189864] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3070575 to 0x1800000400:3070625 [Thu Dec 12 01:49:33 2019][243152.750103] Pid: 112506, comm: ll_ost02_072 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:49:33 2019][243152.760730] Call Trace: [Thu Dec 12 01:49:33 2019][243152.763303] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:49:33 2019][243152.770306] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:49:33 2019][243152.777498] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:49:33 2019][243152.784148] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:49:33 2019][243152.790894] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:49:33 2019][243152.798431] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 01:49:33 2019][243152.805568] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 01:49:33 2019][243152.811786] [] ofd_destroy+0x5d0/0x960 [ofd] [Thu Dec 12 01:49:33 2019][243152.817850] [] ofd_destroy_by_fid+0x1f4/0x4a0 [ofd] [Thu Dec 12 01:49:33 2019][243152.824499] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:49:33 2019][243152.830901] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:49:33 2019][243152.837951] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:49:33 2019][243152.845766] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:49:33 2019][243152.852181] [] kthread+0xd1/0xe0 [Thu Dec 12 01:49:33 2019][243152.857197] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:49:33 2019][243152.863762] [] 0xffffffffffffffff [Thu Dec 12 01:49:33 2019][243152.868887] LustreError: dumping log to /tmp/lustre-log.1576144173.112506 [Thu Dec 12 01:49:33 2019][243152.876872] Pid: 67639, comm: ll_ost02_017 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:49:33 2019][243152.887421] Call Trace: [Thu Dec 12 01:49:33 2019][243152.889970] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:49:33 2019][243152.896964] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:49:33 2019][243152.904147] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:49:33 2019][243152.910795] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:49:33 2019][243152.917546] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:49:33 2019][243152.925063] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 01:49:33 2019][243152.932157] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 01:49:33 2019][243152.938388] [] ofd_destroy+0x5d0/0x960 [ofd] [Thu Dec 12 01:49:33 2019][243152.944446] [] ofd_destroy_by_fid+0x1f4/0x4a0 [ofd] [Thu Dec 12 01:49:33 2019][243152.951098] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 
01:49:33 2019][243152.957499] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:49:33 2019][243152.964531] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:49:33 2019][243152.972355] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:49:33 2019][243152.978778] [] kthread+0xd1/0xe0 [Thu Dec 12 01:49:33 2019][243152.983794] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:49:33 2019][243152.990349] [] 0xffffffffffffffff [Thu Dec 12 01:49:41 2019][243161.526995] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124364 to 0x1900000400:1124385 [Thu Dec 12 01:49:49 2019][243169.631148] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117411 to 0x1800000402:1117441 [Thu Dec 12 01:50:05 2019][243185.518767] Pid: 67689, comm: ll_ost00_031 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:50:05 2019][243185.529287] Call Trace: [Thu Dec 12 01:50:05 2019][243185.531853] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:50:05 2019][243185.538854] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:50:05 2019][243185.546037] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:50:05 2019][243185.552688] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:50:05 2019][243185.559434] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:50:05 2019][243185.566959] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:50:05 2019][243185.574058] [] dqget+0x3fa/0x450 [Thu Dec 12 01:50:05 2019][243185.579060] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:50:05 2019][243185.584846] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:50:05 2019][243185.592457] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:50:05 2019][243185.598947] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:50:05 2019][243185.605077] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:50:05 2019][243185.612129] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:50:05 2019][243185.619932] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:50:05 2019][243185.626361] [] kthread+0xd1/0xe0 [Thu Dec 12 01:50:06 2019][243185.631363] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:50:06 2019][243185.637940] [] 0xffffffffffffffff [Thu Dec 12 01:50:06 2019][243185.643040] LustreError: dumping log to /tmp/lustre-log.1576144205.67689 [Thu Dec 12 01:50:18 2019][243197.807078] Pid: 67863, comm: ll_ost01_067 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:50:18 2019][243197.817596] Call Trace: [Thu Dec 12 01:50:18 2019][243197.820150] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:50:18 2019][243197.827189] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:50:18 2019][243197.834478] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:50:18 2019][243197.841127] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:50:18 2019][243197.847527] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:50:18 2019][243197.854569] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:50:18 2019][243197.862386] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:50:18 2019][243197.868800] [] kthread+0xd1/0xe0 [Thu Dec 12 01:50:18 2019][243197.873799] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:50:18 2019][243197.880378] [] 0xffffffffffffffff [Thu Dec 12 01:50:18 2019][243197.885485] LustreError: dumping log to /tmp/lustre-log.1576144218.67863 [Thu Dec 12 01:50:50 2019][243230.575679] 
Pid: 67666, comm: ll_ost01_030 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:50:50 2019][243230.586199] Call Trace: [Thu Dec 12 01:50:51 2019][243230.588756] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:50:51 2019][243230.595157] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:50:51 2019][243230.602227] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:50:51 2019][243230.610042] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:50:51 2019][243230.616474] [] kthread+0xd1/0xe0 [Thu Dec 12 01:50:51 2019][243230.621478] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:50:51 2019][243230.628053] [] 0xffffffffffffffff [Thu Dec 12 01:50:51 2019][243230.633161] LustreError: dumping log to /tmp/lustre-log.1576144250.67666 [Thu Dec 12 01:51:25 2019][243264.812787] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125699 to 0x1980000400:1125729 [Thu Dec 12 01:51:28 2019][243268.243254] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800455 to 0x1a80000400:11800481 [Thu Dec 12 01:51:56 2019][243296.482338] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467430 to 0x0:27467457 [Thu Dec 12 01:52:16 2019][243315.830430] LustreError: 67608:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576144036, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0058_UUID lock: ffff88f5fed1ad00/0x7066c9c190af3456 lrc: 3/0,1 mode: --/PW res: [0x1900000400:0x112809:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67608 timeout: 0 lvb_type: 0 [Thu Dec 12 01:52:16 2019][243315.874153] LustreError: 67608:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 9 previous similar messages [Thu Dec 12 01:52:16 2019][243316.593424] LustreError: dumping log to /tmp/lustre-log.1576144336.66093 [Thu Dec 12 01:52:41 2019][243341.169907] LustreError: dumping log to /tmp/lustre-log.1576144361.67678 [Thu Dec 12 01:53:06 2019][243365.746399] LustreError: dumping log to /tmp/lustre-log.1576144386.112520 [Thu Dec 12 01:53:10 2019][243369.842480] LustreError: dumping log to /tmp/lustre-log.1576144390.67726 [Thu Dec 12 01:53:49 2019][243408.828570] Lustre: fir-OST005c: deleting orphan objects from 0x0:27192412 to 0x0:27192449 [Thu Dec 12 01:53:59 2019][243419.323023] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 01:53:59 2019][243419.333200] Lustre: Skipped 710 previous similar messages [Thu Dec 12 01:54:23 2019][243443.571970] LustreError: dumping log to /tmp/lustre-log.1576144463.67649 [Thu Dec 12 01:54:32 2019][243451.764129] LustreError: dumping log to /tmp/lustre-log.1576144472.67818 [Thu Dec 12 01:54:34 2019][243453.852918] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105411 to 0x1a00000402:1105441 [Thu Dec 12 01:54:37 2019][243456.869525] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506724 to 0x0:27506753 [Thu Dec 12 01:54:40 2019][243459.956299] LNet: Service thread pid 67858 was inactive for 1201.49s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 01:54:40 2019][243459.973412] LNet: Skipped 9 previous similar messages [Thu Dec 12 01:54:40 2019][243459.978559] Pid: 67858, comm: ll_ost03_059 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:54:40 2019][243459.989099] Call Trace: [Thu Dec 12 01:54:40 2019][243459.991654] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:54:40 2019][243459.998055] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:54:40 2019][243460.005117] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:54:40 2019][243460.012934] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:54:40 2019][243460.019365] [] kthread+0xd1/0xe0 [Thu Dec 12 01:54:40 2019][243460.024369] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:54:40 2019][243460.030945] [] 0xffffffffffffffff [Thu Dec 12 01:54:40 2019][243460.036054] LustreError: dumping log to /tmp/lustre-log.1576144480.67858 [Thu Dec 12 01:54:52 2019][243471.880550] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 01:54:52 2019][243471.880550] req@ffff891237d13050 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:457/0 lens 440/0 e 0 to 0 dl 1576144497 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 01:54:52 2019][243471.909187] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1246 previous similar messages [Thu Dec 12 01:54:52 2019][243472.244537] Pid: 67670, comm: ll_ost00_023 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:54:52 2019][243472.255055] Call Trace: [Thu Dec 12 01:54:52 2019][243472.257632] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:54:52 2019][243472.264640] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:54:52 2019][243472.271823] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:54:52 2019][243472.278471] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:54:52 2019][243472.285223] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:54:52 2019][243472.292746] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:54:52 2019][243472.299840] [] dqget+0x3fa/0x450 [Thu Dec 12 01:54:52 2019][243472.304845] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:54:52 2019][243472.310640] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:54:52 2019][243472.318253] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:54:52 2019][243472.324740] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:54:52 2019][243472.330886] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:54:52 2019][243472.337942] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:54:52 2019][243472.345744] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:54:52 2019][243472.352170] [] kthread+0xd1/0xe0 [Thu Dec 12 01:54:52 2019][243472.357175] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:54:52 2019][243472.363751] [] 0xffffffffffffffff [Thu Dec 12 01:54:52 2019][243472.368852] LustreError: dumping log to /tmp/lustre-log.1576144492.67670 [Thu Dec 12 01:55:09 2019][243488.700547] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785320 to 0x1800000401:11785345 [Thu Dec 12 01:55:13 2019][243493.140727] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3048914 to 0x1a00000401:3048929 [Thu Dec 12 01:55:15 2019][243494.959161] Lustre: fir-OST005a: Connection restored to 
f5f4e1fb-09a1-cbb0-925c-f94d3727005b (at 10.9.101.44@o2ib4) [Thu Dec 12 01:55:15 2019][243494.969719] Lustre: Skipped 658 previous similar messages [Thu Dec 12 01:55:17 2019][243496.821040] Pid: 67641, comm: ll_ost00_019 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:55:17 2019][243496.831560] Call Trace: [Thu Dec 12 01:55:17 2019][243496.834130] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:55:17 2019][243496.841127] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:55:17 2019][243496.848311] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:55:17 2019][243496.854962] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:55:17 2019][243496.861711] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:55:17 2019][243496.869252] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:55:17 2019][243496.876347] [] dqget+0x3fa/0x450 [Thu Dec 12 01:55:17 2019][243496.881352] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:55:17 2019][243496.887136] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:55:17 2019][243496.894749] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:55:17 2019][243496.901239] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:55:17 2019][243496.907368] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:55:17 2019][243496.914413] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:55:17 2019][243496.922218] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:55:17 2019][243496.928645] [] kthread+0xd1/0xe0 [Thu Dec 12 01:55:17 2019][243496.933662] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:55:17 2019][243496.940241] [] 0xffffffffffffffff [Thu Dec 12 01:55:17 2019][243496.945340] LustreError: dumping log to /tmp/lustre-log.1576144517.67641 [Thu Dec 12 01:55:19 2019][243499.387679] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 01:55:19 2019][243499.396651] Lustre: Skipped 71 previous similar messages [Thu Dec 12 01:55:49 2019][243529.589697] Pid: 67625, comm: ll_ost03_015 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:55:49 2019][243529.600223] Call Trace: [Thu Dec 12 01:55:49 2019][243529.602789] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:55:49 2019][243529.609831] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:55:49 2019][243529.617168] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:55:49 2019][243529.623835] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:55:49 2019][243529.630239] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:55:50 2019][243529.637270] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:55:50 2019][243529.645103] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:55:50 2019][243529.651520] [] kthread+0xd1/0xe0 [Thu Dec 12 01:55:50 2019][243529.656518] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:55:50 2019][243529.663088] [] 0xffffffffffffffff [Thu Dec 12 01:55:50 2019][243529.668198] LustreError: dumping log to /tmp/lustre-log.1576144549.67625 [Thu Dec 12 01:56:54 2019][243594.438729] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842091 to 0x1980000401:11842113 [Thu Dec 12 01:56:55 2019][243595.127017] Pid: 66114, comm: ll_ost03_001 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:56:55 2019][243595.137533] Call Trace: [Thu Dec 12 01:56:55 2019][243595.140111] [] 
ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:56:55 2019][243595.147141] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:56:55 2019][243595.154464] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:56:55 2019][243595.161125] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:56:55 2019][243595.167565] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:56:55 2019][243595.174608] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:56:55 2019][243595.182430] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:56:55 2019][243595.188847] [] kthread+0xd1/0xe0 [Thu Dec 12 01:56:55 2019][243595.193847] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:56:55 2019][243595.200415] [] 0xffffffffffffffff [Thu Dec 12 01:56:55 2019][243595.205522] LustreError: dumping log to /tmp/lustre-log.1576144615.66114 [Thu Dec 12 01:56:55 2019][243595.213232] LNet: Service thread pid 67660 was inactive for 1202.61s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 01:56:55 2019][243595.226294] LNet: Skipped 9 previous similar messages [Thu Dec 12 01:57:24 2019][243623.799598] LustreError: dumping log to /tmp/lustre-log.1576144644.67655 [Thu Dec 12 01:57:26 2019][243625.898880] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3081992 to 0x1a80000402:3082017 [Thu Dec 12 01:57:28 2019][243627.895668] LustreError: dumping log to /tmp/lustre-log.1576144648.112554 [Thu Dec 12 01:57:32 2019][243631.991757] LustreError: dumping log to /tmp/lustre-log.1576144652.67731 [Thu Dec 12 01:57:36 2019][243636.087837] LustreError: dumping log to /tmp/lustre-log.1576144656.67615 [Thu Dec 12 01:57:52 2019][243652.472166] LustreError: dumping log to /tmp/lustre-log.1576144672.112492 [Thu Dec 12 01:57:56 2019][243656.568278] LustreError: dumping log to /tmp/lustre-log.1576144676.67686 [Thu Dec 12 01:58:04 2019][243663.760132] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089704 to 0x1980000402:3089729 [Thu Dec 12 01:58:33 2019][243692.946701] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797424 to 0x1900000401:11797441 [Thu Dec 12 01:58:36 2019][243696.104758] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3085896 to 0x1900000402:3085921 [Thu Dec 12 01:59:35 2019][243754.874206] LustreError: dumping log to /tmp/lustre-log.1576144775.67740 [Thu Dec 12 02:00:03 2019][243783.546787] Pid: 67611, comm: ll_ost00_012 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:00:03 2019][243783.557305] Call Trace: [Thu Dec 12 02:00:03 2019][243783.559881] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:00:03 2019][243783.566885] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:00:03 2019][243783.574083] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:00:03 2019][243783.580736] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:00:03 2019][243783.587484] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:00:03 2019][243783.595010] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:00:03 2019][243783.602104] [] dqget+0x3fa/0x450 [Thu Dec 12 02:00:03 2019][243783.607109] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:00:03 2019][243783.612882] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:00:03 2019][243783.620498] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:00:03 2019][243783.626975] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:00:03 2019][243783.633129] [] 
tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:00:03 2019][243783.640173] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:00:03 2019][243783.647987] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:00:03 2019][243783.654404] [] kthread+0xd1/0xe0 [Thu Dec 12 02:00:04 2019][243783.659419] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:00:04 2019][243783.665982] [] 0xffffffffffffffff [Thu Dec 12 02:00:04 2019][243783.671095] LustreError: dumping log to /tmp/lustre-log.1576144803.67611 [Thu Dec 12 02:00:31 2019][243811.012004] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122155 to 0x1a80000401:1122177 [Thu Dec 12 02:00:40 2019][243820.411516] Pid: 67840, comm: ll_ost01_062 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:00:40 2019][243820.422034] Call Trace: [Thu Dec 12 02:00:40 2019][243820.424595] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:00:40 2019][243820.431626] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:00:40 2019][243820.438923] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 02:00:40 2019][243820.445572] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:00:40 2019][243820.451973] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:00:40 2019][243820.459004] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:00:40 2019][243820.466820] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:00:40 2019][243820.473235] [] kthread+0xd1/0xe0 [Thu Dec 12 02:00:40 2019][243820.478238] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:00:40 2019][243820.484822] [] 0xffffffffffffffff [Thu Dec 12 02:00:40 2019][243820.489930] LustreError: dumping log to /tmp/lustre-log.1576144840.67840 [Thu Dec 12 02:01:17 2019][243857.276255] Pid: 67826, comm: ll_ost02_048 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:01:17 2019][243857.286773] Call Trace: [Thu Dec 12 02:01:17 2019][243857.289325] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:01:17 2019][243857.296365] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:01:17 2019][243857.303662] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 02:01:17 2019][243857.310309] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:01:17 2019][243857.316714] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:01:17 2019][243857.323753] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:01:17 2019][243857.331575] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:01:17 2019][243857.338028] [] kthread+0xd1/0xe0 [Thu Dec 12 02:01:17 2019][243857.343045] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:01:17 2019][243857.349638] [] 0xffffffffffffffff [Thu Dec 12 02:01:17 2019][243857.354764] LustreError: dumping log to /tmp/lustre-log.1576144877.67826 [Thu Dec 12 02:01:26 2019][243866.349098] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125734 to 0x1980000400:1125761 [Thu Dec 12 02:02:05 2019][243905.236943] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3070636 to 0x1800000400:3070657 [Thu Dec 12 02:02:17 2019][243916.958121] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124391 to 0x1900000400:1124417 [Thu Dec 12 02:02:19 2019][243918.717473] Pid: 112489, comm: ll_ost00_070 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:02:19 2019][243918.728081] Call Trace: [Thu Dec 12 02:02:19 2019][243918.730648] [] 
wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:02:19 2019][243918.737656] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:02:19 2019][243918.744856] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:02:19 2019][243918.751506] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:02:19 2019][243918.758254] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:02:19 2019][243918.765780] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:02:19 2019][243918.772876] [] dqget+0x3fa/0x450 [Thu Dec 12 02:02:19 2019][243918.777880] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:02:19 2019][243918.783651] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:02:19 2019][243918.791268] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:02:19 2019][243918.797743] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:02:19 2019][243918.803901] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:02:19 2019][243918.810944] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:02:19 2019][243918.818742] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:02:19 2019][243918.825157] [] kthread+0xd1/0xe0 [Thu Dec 12 02:02:19 2019][243918.830172] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:02:19 2019][243918.836737] [] 0xffffffffffffffff [Thu Dec 12 02:02:19 2019][243918.841847] LustreError: dumping log to /tmp/lustre-log.1576144939.112489 [Thu Dec 12 02:02:25 2019][243924.942253] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117450 to 0x1800000402:1117473 [Thu Dec 12 02:02:31 2019][243931.005736] Pid: 67695, comm: ll_ost01_037 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:02:31 2019][243931.016255] Call Trace: [Thu Dec 12 02:02:31 2019][243931.018813] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:02:31 2019][243931.025846] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:02:31 2019][243931.033145] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 02:02:31 2019][243931.039790] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:02:31 2019][243931.046192] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:02:31 2019][243931.053233] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:02:31 2019][243931.061048] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:02:31 2019][243931.067465] [] kthread+0xd1/0xe0 [Thu Dec 12 02:02:31 2019][243931.072465] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:02:31 2019][243931.079042] [] 0xffffffffffffffff [Thu Dec 12 02:02:31 2019][243931.084150] LustreError: dumping log to /tmp/lustre-log.1576144951.67695 [Thu Dec 12 02:02:41 2019][243940.790925] LustreError: 67642:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576144661, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005e_UUID lock: ffff890020f4f500/0x7066c9c190b32db0 lrc: 3/0,1 mode: --/PW res: [0x1a80000401:0x111f69:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67642 timeout: 0 lvb_type: 0 [Thu Dec 12 02:02:41 2019][243940.834649] LustreError: 67642:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 4 previous similar messages [Thu Dec 12 02:02:43 2019][243943.293967] LustreError: dumping log to /tmp/lustre-log.1576144963.67216 [Thu Dec 12 02:03:04 2019][243963.774379] LustreError: dumping log to 
/tmp/lustre-log.1576144984.67405 [Thu Dec 12 02:03:28 2019][243988.350855] LustreError: dumping log to /tmp/lustre-log.1576145008.67627 [Thu Dec 12 02:04:00 2019][244019.723529] Lustre: fir-OST0054: Client fe16bc49-4bbe-dc30-a069-fee92bf3e984 (at 10.9.104.23@o2ib4) reconnecting [Thu Dec 12 02:04:00 2019][244019.733802] Lustre: Skipped 743 previous similar messages [Thu Dec 12 02:04:04 2019][244023.943208] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800494 to 0x1a80000400:11800513 [Thu Dec 12 02:04:05 2019][244025.215578] LustreError: dumping log to /tmp/lustre-log.1576145045.67589 [Thu Dec 12 02:04:33 2019][244052.833348] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467461 to 0x0:27467489 [Thu Dec 12 02:04:35 2019][244055.592844] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105448 to 0x1a00000402:1105473 [Thu Dec 12 02:04:53 2019][244073.568554] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 02:04:53 2019][244073.568554] req@ffff890d6ff65050 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:303/0 lens 440/0 e 0 to 0 dl 1576145098 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 02:04:53 2019][244073.597191] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1551 previous similar messages [Thu Dec 12 02:04:54 2019][244074.368546] LustreError: dumping log to /tmp/lustre-log.1576145094.112526 [Thu Dec 12 02:05:16 2019][244096.361888] Lustre: fir-OST005c: Connection restored to cec884d3-ca4b-8127-2f6b-7762665aa5f8 (at 10.9.0.64@o2ib4) [Thu Dec 12 02:05:16 2019][244096.372243] Lustre: Skipped 765 previous similar messages [Thu Dec 12 02:05:19 2019][244098.945044] LNet: Service thread pid 112509 was inactive for 1204.01s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 02:05:19 2019][244098.962237] LNet: Skipped 9 previous similar messages [Thu Dec 12 02:05:19 2019][244098.967385] Pid: 112509, comm: ll_ost00_081 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:05:19 2019][244098.978009] Call Trace: [Thu Dec 12 02:05:19 2019][244098.980586] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:05:19 2019][244098.987583] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:05:19 2019][244098.994751] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:05:19 2019][244099.001416] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:05:19 2019][244099.008150] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:05:19 2019][244099.015704] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:05:19 2019][244099.022806] [] dqget+0x3fa/0x450 [Thu Dec 12 02:05:19 2019][244099.027820] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:05:19 2019][244099.033606] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:05:19 2019][244099.041229] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:05:19 2019][244099.047704] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:05:19 2019][244099.053847] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:05:19 2019][244099.060887] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:05:19 2019][244099.068712] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:05:19 2019][244099.075128] [] kthread+0xd1/0xe0 [Thu Dec 12 02:05:19 2019][244099.080129] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:05:19 2019][244099.086719] [] 0xffffffffffffffff [Thu Dec 12 02:05:19 2019][244099.091815] LustreError: dumping log to /tmp/lustre-log.1576145119.112509 [Thu Dec 12 02:05:21 2019][244101.511858] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 02:05:21 2019][244101.520826] Lustre: Skipped 71 previous similar messages [Thu Dec 12 02:05:43 2019][244123.521532] Pid: 67603, comm: ll_ost02_010 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:05:43 2019][244123.532052] Call Trace: [Thu Dec 12 02:05:43 2019][244123.534604] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 02:05:43 2019][244123.541002] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:05:43 2019][244123.548066] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:05:43 2019][244123.555866] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:05:43 2019][244123.562296] [] kthread+0xd1/0xe0 [Thu Dec 12 02:05:43 2019][244123.567300] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:05:43 2019][244123.573874] [] 0xffffffffffffffff [Thu Dec 12 02:05:43 2019][244123.578983] LustreError: dumping log to /tmp/lustre-log.1576145143.67603 [Thu Dec 12 02:05:47 2019][244127.617601] Pid: 67406, comm: ll_ost00_007 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:05:47 2019][244127.628123] Call Trace: [Thu Dec 12 02:05:47 2019][244127.630693] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:05:47 2019][244127.637692] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:05:47 2019][244127.644875] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:05:47 2019][244127.651524] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:05:47 2019][244127.658287] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:05:48 2019][244127.665814] [] 
ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:05:48 2019][244127.672911] [] dqget+0x3fa/0x450 [Thu Dec 12 02:05:48 2019][244127.677913] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:05:48 2019][244127.683701] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:05:48 2019][244127.691311] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:05:48 2019][244127.697801] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:05:48 2019][244127.703938] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:05:48 2019][244127.710979] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:05:48 2019][244127.718792] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:05:48 2019][244127.725227] [] kthread+0xd1/0xe0 [Thu Dec 12 02:05:48 2019][244127.730244] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:05:48 2019][244127.736806] [] 0xffffffffffffffff [Thu Dec 12 02:05:48 2019][244127.741919] LustreError: dumping log to /tmp/lustre-log.1576145148.67406 [Thu Dec 12 02:06:25 2019][244165.564579] Lustre: fir-OST005c: deleting orphan objects from 0x0:27192451 to 0x0:27192481 [Thu Dec 12 02:07:13 2019][244213.564536] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506755 to 0x0:27506785 [Thu Dec 12 02:07:18 2019][244217.731400] Pid: 67608, comm: ll_ost01_011 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:07:18 2019][244217.741917] Call Trace: [Thu Dec 12 02:07:18 2019][244217.744478] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:07:18 2019][244217.751519] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:07:18 2019][244217.758816] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 02:07:18 2019][244217.765472] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:07:18 2019][244217.771875] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:07:18 2019][244217.778905] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:07:18 2019][244217.786718] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:07:18 2019][244217.793136] [] kthread+0xd1/0xe0 [Thu Dec 12 02:07:18 2019][244217.798135] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:07:18 2019][244217.804706] [] 0xffffffffffffffff [Thu Dec 12 02:07:18 2019][244217.809824] LustreError: dumping log to /tmp/lustre-log.1576145238.67608 [Thu Dec 12 02:07:18 2019][244217.817430] Pid: 112565, comm: ll_ost03_078 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:07:18 2019][244217.828059] Call Trace: [Thu Dec 12 02:07:18 2019][244217.830606] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 02:07:18 2019][244217.836998] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:07:18 2019][244217.844034] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:07:18 2019][244217.851836] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:07:18 2019][244217.858256] [] kthread+0xd1/0xe0 [Thu Dec 12 02:07:18 2019][244217.863250] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:07:18 2019][244217.869803] [] 0xffffffffffffffff [Thu Dec 12 02:07:27 2019][244226.910827] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082024 to 0x1a80000402:3082049 [Thu Dec 12 02:07:30 2019][244230.019650] LNet: Service thread pid 67899 was inactive for 1203.17s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
[Thu Dec 12 02:07:30 2019][244230.032681] LNet: Skipped 15 previous similar messages [Thu Dec 12 02:07:30 2019][244230.037916] LustreError: dumping log to /tmp/lustre-log.1576145250.67899 [Thu Dec 12 02:07:45 2019][244245.307605] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785353 to 0x1800000401:11785377 [Thu Dec 12 02:07:49 2019][244249.307760] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3048936 to 0x1a00000401:3048961 [Thu Dec 12 02:07:54 2019][244254.596126] LustreError: dumping log to /tmp/lustre-log.1576145274.67867 [Thu Dec 12 02:08:05 2019][244264.732052] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089736 to 0x1980000402:3089761 [Thu Dec 12 02:09:30 2019][244350.109693] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842123 to 0x1980000401:11842145 [Thu Dec 12 02:10:06 2019][244385.670715] LustreError: dumping log to /tmp/lustre-log.1576145405.112513 [Thu Dec 12 02:10:30 2019][244410.247206] Pid: 112497, comm: ll_ost00_077 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:10:30 2019][244410.257812] Call Trace: [Thu Dec 12 02:10:30 2019][244410.260387] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:10:30 2019][244410.267386] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:10:30 2019][244410.274573] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:10:30 2019][244410.281219] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:10:30 2019][244410.287967] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:10:30 2019][244410.295494] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:10:30 2019][244410.302591] [] dqget+0x3fa/0x450 [Thu Dec 12 02:10:30 2019][244410.307609] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:10:30 2019][244410.313382] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:10:30 2019][244410.321008] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:10:30 2019][244410.327483] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:10:30 2019][244410.333625] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:10:30 2019][244410.340664] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:10:30 2019][244410.348481] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:10:30 2019][244410.354894] [] kthread+0xd1/0xe0 [Thu Dec 12 02:10:30 2019][244410.359914] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:10:30 2019][244410.366475] [] 0xffffffffffffffff [Thu Dec 12 02:10:30 2019][244410.371602] LustreError: dumping log to /tmp/lustre-log.1576145430.112497 [Thu Dec 12 02:10:30 2019][244410.379116] Pid: 112490, comm: ll_ost00_071 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:10:30 2019][244410.389754] Call Trace: [Thu Dec 12 02:10:30 2019][244410.392302] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:10:30 2019][244410.399311] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:10:30 2019][244410.406480] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:10:30 2019][244410.413144] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:10:30 2019][244410.419882] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:10:30 2019][244410.427418] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:10:30 2019][244410.434502] [] dqget+0x3fa/0x450 [Thu Dec 12 02:10:30 2019][244410.439529] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:10:30 2019][244410.445310] [] osd_acct_index_lookup+0x235/0x480 
[osd_ldiskfs] [Thu Dec 12 02:10:30 2019][244410.452923] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:10:30 2019][244410.459400] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:10:30 2019][244410.465544] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:10:30 2019][244410.472565] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:10:30 2019][244410.480381] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:10:30 2019][244410.486798] [] kthread+0xd1/0xe0 [Thu Dec 12 02:10:30 2019][244410.491802] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:10:30 2019][244410.498358] [] 0xffffffffffffffff [Thu Dec 12 02:11:09 2019][244449.103660] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797453 to 0x1900000401:11797473 [Thu Dec 12 02:11:12 2019][244452.007748] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3085933 to 0x1900000402:3085953 [Thu Dec 12 02:11:27 2019][244466.964684] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125767 to 0x1980000400:1125793 [Thu Dec 12 02:11:27 2019][244467.592340] Pid: 113354, comm: ll_ost02_092 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:11:27 2019][244467.602943] Call Trace: [Thu Dec 12 02:11:27 2019][244467.605494] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:11:27 2019][244467.612527] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:11:27 2019][244467.619814] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 02:11:27 2019][244467.626480] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:11:27 2019][244467.632881] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:11:27 2019][244467.639913] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:11:27 2019][244467.647729] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:11:27 2019][244467.654147] [] kthread+0xd1/0xe0 [Thu Dec 12 02:11:28 2019][244467.659148] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:11:28 2019][244467.665723] [] 0xffffffffffffffff [Thu Dec 12 02:11:28 2019][244467.670832] LustreError: dumping log to /tmp/lustre-log.1576145487.113354 [Thu Dec 12 02:11:48 2019][244488.072775] Pid: 112524, comm: ll_ost03_068 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:11:48 2019][244488.083381] Call Trace: [Thu Dec 12 02:11:48 2019][244488.085941] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:11:48 2019][244488.092980] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:11:48 2019][244488.100279] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 02:11:48 2019][244488.106927] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:11:48 2019][244488.113332] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:11:48 2019][244488.120370] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:11:48 2019][244488.128185] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:11:48 2019][244488.134599] [] kthread+0xd1/0xe0 [Thu Dec 12 02:11:48 2019][244488.139601] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:11:48 2019][244488.146177] [] 0xffffffffffffffff [Thu Dec 12 02:11:48 2019][244488.151286] LustreError: dumping log to /tmp/lustre-log.1576145508.112524 [Thu Dec 12 02:12:13 2019][244512.649214] Pid: 67654, comm: ll_ost02_022 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:12:13 2019][244512.659732] Call Trace: [Thu Dec 12 02:12:13 2019][244512.662292] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu 
Dec 12 02:12:13 2019][244512.668691] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:12:13 2019][244512.675754] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:12:13 2019][244512.683555] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:12:13 2019][244512.689983] [] kthread+0xd1/0xe0 [Thu Dec 12 02:12:13 2019][244512.694996] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:12:13 2019][244512.701584] [] 0xffffffffffffffff [Thu Dec 12 02:12:13 2019][244512.706706] LustreError: dumping log to /tmp/lustre-log.1576145533.67654 [Thu Dec 12 02:12:17 2019][244516.745305] LustreError: dumping log to /tmp/lustre-log.1576145537.67815 [Thu Dec 12 02:12:41 2019][244541.321771] LustreError: dumping log to /tmp/lustre-log.1576145561.112563 [Thu Dec 12 02:14:01 2019][244621.267131] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 02:14:01 2019][244621.277306] Lustre: Skipped 789 previous similar messages [Thu Dec 12 02:14:15 2019][244635.401272] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122183 to 0x1a80000401:1122209 [Thu Dec 12 02:14:36 2019][244656.004751] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105448 to 0x1a00000402:1105505 [Thu Dec 12 02:14:41 2019][244661.355914] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3070661 to 0x1800000400:3070689 [Thu Dec 12 02:14:52 2019][244672.396368] LustreError: dumping log to /tmp/lustre-log.1576145692.67860 [Thu Dec 12 02:14:53 2019][244673.133159] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124428 to 0x1900000400:1124449 [Thu Dec 12 02:14:54 2019][244674.380427] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 02:14:54 2019][244674.380427] req@ffff890df37ee850 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:149/0 lens 440/0 e 0 to 0 dl 1576145699 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 02:14:54 2019][244674.409087] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1589 previous similar messages [Thu Dec 12 02:15:01 2019][244680.829213] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117480 to 0x1800000402:1117505 [Thu Dec 12 02:15:17 2019][244696.972864] LustreError: dumping log to /tmp/lustre-log.1576145717.112502 [Thu Dec 12 02:15:17 2019][244696.984894] Lustre: fir-OST005a: Connection restored to f5f4e1fb-09a1-cbb0-925c-f94d3727005b (at 10.9.101.44@o2ib4) [Thu Dec 12 02:15:17 2019][244696.995421] Lustre: Skipped 794 previous similar messages [Thu Dec 12 02:15:24 2019][244703.635682] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 02:15:24 2019][244703.644647] Lustre: Skipped 71 previous similar messages [Thu Dec 12 02:16:06 2019][244746.125835] LNet: Service thread pid 67779 was inactive for 1200.45s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 02:16:06 2019][244746.142959] LNet: Skipped 9 previous similar messages [Thu Dec 12 02:16:06 2019][244746.148106] Pid: 67779, comm: ll_ost01_050 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:16:06 2019][244746.158643] Call Trace: [Thu Dec 12 02:16:06 2019][244746.161193] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:16:06 2019][244746.168227] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:16:06 2019][244746.175513] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 02:16:06 2019][244746.182163] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:16:06 2019][244746.188565] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:16:06 2019][244746.195596] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:16:06 2019][244746.203411] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:16:06 2019][244746.209844] [] kthread+0xd1/0xe0 [Thu Dec 12 02:16:06 2019][244746.214859] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:16:06 2019][244746.221423] [] 0xffffffffffffffff [Thu Dec 12 02:16:06 2019][244746.226545] LustreError: dumping log to /tmp/lustre-log.1576145766.67779 [Thu Dec 12 02:16:06 2019][244746.234164] Pid: 112545, comm: ll_ost01_089 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:16:06 2019][244746.244796] Call Trace: [Thu Dec 12 02:16:06 2019][244746.247343] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 02:16:06 2019][244746.253733] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:16:06 2019][244746.260769] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:16:06 2019][244746.268588] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:16:06 2019][244746.275032] [] kthread+0xd1/0xe0 [Thu Dec 12 02:16:06 2019][244746.280038] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:16:06 2019][244746.286604] [] 0xffffffffffffffff [Thu Dec 12 02:16:26 2019][244766.606244] Pid: 66903, comm: ll_ost02_006 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:16:26 2019][244766.616760] Call Trace: [Thu Dec 12 02:16:26 2019][244766.619312] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:16:26 2019][244766.626355] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:16:26 2019][244766.633651] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 02:16:26 2019][244766.640310] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:16:26 2019][244766.646712] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:16:26 2019][244766.653768] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:16:26 2019][244766.661591] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:16:26 2019][244766.668010] [] kthread+0xd1/0xe0 [Thu Dec 12 02:16:26 2019][244766.673009] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:16:27 2019][244766.679578] [] 0xffffffffffffffff [Thu Dec 12 02:16:27 2019][244766.684687] LustreError: dumping log to /tmp/lustre-log.1576145786.66903 [Thu Dec 12 02:16:40 2019][244779.766220] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800522 to 0x1a80000400:11800545 [Thu Dec 12 02:16:56 2019][244796.436850] LustreError: 112547:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576145516, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005a_UUID lock: ffff88e9f0f4d340/0x7066c9c190b38bea lrc: 3/0,1 mode: 
--/PW res: [0x1a490d4:0x0:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112547 timeout: 0 lvb_type: 0 [Thu Dec 12 02:16:56 2019][244796.480062] LustreError: 112547:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 9 previous similar messages [Thu Dec 12 02:17:09 2019][244809.056343] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467491 to 0x0:27467521 [Thu Dec 12 02:17:28 2019][244827.882738] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082051 to 0x1a80000402:3082081 [Thu Dec 12 02:17:28 2019][244828.047456] Pid: 66384, comm: ll_ost00_004 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:17:28 2019][244828.057974] Call Trace: [Thu Dec 12 02:17:28 2019][244828.060544] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:17:28 2019][244828.067543] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:17:28 2019][244828.074733] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:17:28 2019][244828.081382] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:17:28 2019][244828.088118] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:17:28 2019][244828.095669] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:17:28 2019][244828.102765] [] dqget+0x3fa/0x450 [Thu Dec 12 02:17:28 2019][244828.107779] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:17:28 2019][244828.113553] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:17:28 2019][244828.121178] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:17:28 2019][244828.127663] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:17:28 2019][244828.133812] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:17:28 2019][244828.140859] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:17:28 2019][244828.148676] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:17:28 2019][244828.155122] [] kthread+0xd1/0xe0 [Thu Dec 12 02:17:28 2019][244828.160123] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:17:28 2019][244828.166708] [] 0xffffffffffffffff [Thu Dec 12 02:17:28 2019][244828.171807] LustreError: dumping log to /tmp/lustre-log.1576145848.66384 [Thu Dec 12 02:17:44 2019][244844.431793] Pid: 67642, comm: ll_ost01_020 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:17:44 2019][244844.442314] Call Trace: [Thu Dec 12 02:17:44 2019][244844.444868] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:17:44 2019][244844.451906] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:17:44 2019][244844.459202] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 02:17:44 2019][244844.465852] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:17:44 2019][244844.472255] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:17:44 2019][244844.479309] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:17:44 2019][244844.487144] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:17:44 2019][244844.493577] [] kthread+0xd1/0xe0 [Thu Dec 12 02:17:44 2019][244844.498577] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:17:44 2019][244844.505161] [] 0xffffffffffffffff [Thu Dec 12 02:17:44 2019][244844.510274] LustreError: dumping log to /tmp/lustre-log.1576145864.67642 [Thu Dec 12 02:17:52 2019][244852.623954] LNet: Service thread pid 112540 was inactive for 1201.63s. 
Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 02:17:52 2019][244852.637077] LNet: Skipped 6 previous similar messages [Thu Dec 12 02:17:52 2019][244852.642225] LustreError: dumping log to /tmp/lustre-log.1576145872.112540 [Thu Dec 12 02:18:06 2019][244866.327961] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089765 to 0x1980000402:3089793 [Thu Dec 12 02:18:13 2019][244873.104370] LustreError: dumping log to /tmp/lustre-log.1576145893.67681 [Thu Dec 12 02:18:17 2019][244877.200447] LustreError: dumping log to /tmp/lustre-log.1576145897.67776 [Thu Dec 12 02:18:21 2019][244881.296515] LustreError: dumping log to /tmp/lustre-log.1576145901.67811 [Thu Dec 12 02:19:01 2019][244921.210547] Lustre: fir-OST005c: deleting orphan objects from 0x0:27192484 to 0x0:27192513 [Thu Dec 12 02:19:06 2019][244926.353407] LustreError: dumping log to /tmp/lustre-log.1576145946.67613 [Thu Dec 12 02:19:10 2019][244930.449508] LustreError: dumping log to /tmp/lustre-log.1576145950.67704 [Thu Dec 12 02:19:14 2019][244934.545583] LustreError: dumping log to /tmp/lustre-log.1576145954.67857 [Thu Dec 12 02:19:35 2019][244955.025978] LustreError: dumping log to /tmp/lustre-log.1576145975.67909 [Thu Dec 12 02:19:43 2019][244963.218142] LustreError: dumping log to /tmp/lustre-log.1576145983.67854 [Thu Dec 12 02:19:49 2019][244968.739568] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506790 to 0x0:27506817 [Thu Dec 12 02:19:55 2019][244975.506392] LustreError: dumping log to /tmp/lustre-log.1576145995.67648 [Thu Dec 12 02:20:04 2019][244983.698547] LustreError: dumping log to /tmp/lustre-log.1576146003.67735 [Thu Dec 12 02:20:21 2019][245000.890591] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785385 to 0x1800000401:11785409 [Thu Dec 12 02:20:24 2019][245004.178952] LustreError: dumping log to /tmp/lustre-log.1576146024.67661 [Thu Dec 12 02:20:25 2019][245004.874736] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3048967 to 0x1a00000401:3048993 [Thu Dec 12 02:20:28 2019][245008.275039] LustreError: dumping log to /tmp/lustre-log.1576146028.67748 [Thu Dec 12 02:21:28 2019][245068.352651] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125800 to 0x1980000400:1125825 [Thu Dec 12 02:22:06 2019][245105.804728] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842158 to 0x1980000401:11842177 [Thu Dec 12 02:22:10 2019][245110.677064] Pid: 67149, comm: ll_ost01_007 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:22:10 2019][245110.687584] Call Trace: [Thu Dec 12 02:22:10 2019][245110.690153] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:22:10 2019][245110.697151] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:22:11 2019][245110.704337] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:22:11 2019][245110.710976] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:22:11 2019][245110.717701] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:22:11 2019][245110.725216] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:22:11 2019][245110.732310] [] dqget+0x3fa/0x450 [Thu Dec 12 02:22:11 2019][245110.737330] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:22:11 2019][245110.743115] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:22:11 2019][245110.750723] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:22:11 2019][245110.757209] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:22:11 
2019][245110.763341] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:22:11 2019][245110.770400] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:22:11 2019][245110.778206] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:22:11 2019][245110.784631] [] kthread+0xd1/0xe0 [Thu Dec 12 02:22:11 2019][245110.789634] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:22:11 2019][245110.796203] [] 0xffffffffffffffff [Thu Dec 12 02:22:11 2019][245110.801317] LustreError: dumping log to /tmp/lustre-log.1576146131.67149 [Thu Dec 12 02:22:15 2019][245114.773156] Pid: 67601, comm: ll_ost00_009 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:22:15 2019][245114.783692] Call Trace: [Thu Dec 12 02:22:15 2019][245114.786264] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:22:15 2019][245114.793260] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:22:15 2019][245114.800445] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:22:15 2019][245114.807092] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:22:15 2019][245114.813841] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:22:15 2019][245114.821365] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:22:15 2019][245114.828463] [] dqget+0x3fa/0x450 [Thu Dec 12 02:22:15 2019][245114.833481] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:22:15 2019][245114.839271] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:22:15 2019][245114.846902] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:22:15 2019][245114.853390] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:22:15 2019][245114.859548] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:22:15 2019][245114.866591] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:22:15 2019][245114.874420] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:22:15 2019][245114.880837] [] kthread+0xd1/0xe0 [Thu Dec 12 02:22:15 2019][245114.885854] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:22:15 2019][245114.892417] [] 0xffffffffffffffff [Thu Dec 12 02:22:15 2019][245114.897544] LustreError: dumping log to /tmp/lustre-log.1576146135.67601 [Thu Dec 12 02:22:19 2019][245118.869229] Pid: 67927, comm: ll_ost01_075 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:22:19 2019][245118.879747] Call Trace: [Thu Dec 12 02:22:19 2019][245118.882317] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:22:19 2019][245118.889312] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:22:19 2019][245118.896497] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:22:19 2019][245118.903146] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:22:19 2019][245118.909893] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:22:19 2019][245118.917421] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:22:19 2019][245118.924516] [] dqget+0x3fa/0x450 [Thu Dec 12 02:22:19 2019][245118.929535] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:22:19 2019][245118.935322] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:22:19 2019][245118.942935] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:22:19 2019][245118.949424] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:22:19 2019][245118.955553] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:22:19 2019][245118.962615] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:22:19 2019][245118.970419] [] 
ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:22:19 2019][245118.976845] [] kthread+0xd1/0xe0 [Thu Dec 12 02:22:19 2019][245118.981851] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:22:19 2019][245118.988424] [] 0xffffffffffffffff [Thu Dec 12 02:22:19 2019][245118.993539] LustreError: dumping log to /tmp/lustre-log.1576146139.67927 [Thu Dec 12 02:22:39 2019][245139.349639] Pid: 67852, comm: ll_ost00_059 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:22:39 2019][245139.360174] Call Trace: [Thu Dec 12 02:22:39 2019][245139.362753] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:22:39 2019][245139.369776] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:22:39 2019][245139.377005] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:22:39 2019][245139.383678] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:22:39 2019][245139.390454] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:22:39 2019][245139.398003] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:22:39 2019][245139.405123] [] dqget+0x3fa/0x450 [Thu Dec 12 02:22:39 2019][245139.410137] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:22:39 2019][245139.415917] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:22:39 2019][245139.423577] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:22:39 2019][245139.430078] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:22:39 2019][245139.436230] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:22:39 2019][245139.443270] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:22:39 2019][245139.451097] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:22:39 2019][245139.457509] [] kthread+0xd1/0xe0 [Thu Dec 12 02:22:39 2019][245139.462511] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:22:39 2019][245139.469079] [] 0xffffffffffffffff [Thu Dec 12 02:22:39 2019][245139.474179] LustreError: dumping log to /tmp/lustre-log.1576146159.67852 [Thu Dec 12 02:23:04 2019][245163.926129] Pid: 67634, comm: ll_ost00_018 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:23:04 2019][245163.936663] Call Trace: [Thu Dec 12 02:23:04 2019][245163.939239] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:23:04 2019][245163.946230] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:23:04 2019][245163.953406] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:23:04 2019][245163.960055] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:23:04 2019][245163.966804] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:23:04 2019][245163.974345] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:23:04 2019][245163.981442] [] dqget+0x3fa/0x450 [Thu Dec 12 02:23:04 2019][245163.986436] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:23:04 2019][245163.992223] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:23:04 2019][245163.999835] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:23:04 2019][245164.006323] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:23:04 2019][245164.012453] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:23:04 2019][245164.019498] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:23:04 2019][245164.027300] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:23:04 2019][245164.033744] [] kthread+0xd1/0xe0 [Thu Dec 12 02:23:04 2019][245164.038757] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:23:04 
2019][245164.045350] [] 0xffffffffffffffff [Thu Dec 12 02:23:04 2019][245164.050451] LustreError: dumping log to /tmp/lustre-log.1576146184.67634 [Thu Dec 12 02:23:08 2019][245168.022259] LustreError: dumping log to /tmp/lustre-log.1576146188.67821 [Thu Dec 12 02:23:45 2019][245205.326654] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797482 to 0x1900000401:11797505 [Thu Dec 12 02:23:48 2019][245207.798809] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3085959 to 0x1900000402:3085985 [Thu Dec 12 02:23:57 2019][245217.175196] LustreError: dumping log to /tmp/lustre-log.1576146237.112558 [Thu Dec 12 02:24:02 2019][245222.073473] Lustre: fir-OST005a: Client 717fa73e-8071-a76f-931e-8957a8ca32aa (at 10.9.101.41@o2ib4) reconnecting [Thu Dec 12 02:24:02 2019][245222.083737] Lustre: Skipped 792 previous similar messages [Thu Dec 12 02:24:32 2019][245252.465167] Lustre: fir-OST005a: deleting orphan objects from 0x0:27562200 to 0x0:27562241 [Thu Dec 12 02:24:37 2019][245256.976675] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105514 to 0x1a00000402:1105537 [Thu Dec 12 02:24:50 2019][245270.424257] LustreError: dumping log to /tmp/lustre-log.1576146290.67691 [Thu Dec 12 02:24:54 2019][245274.520335] LustreError: dumping log to /tmp/lustre-log.1576146294.67632 [Thu Dec 12 02:24:54 2019][245274.584357] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 02:24:54 2019][245274.584357] req@ffff88ebc457c050 x1648527343518128/t0(0) o10->1d4d1153-82cd-6bbc-4932-1e6a2a506ca0@10.8.30.27@o2ib6:749/0 lens 440/0 e 0 to 0 dl 1576146299 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 02:24:54 2019][245274.613172] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1630 previous similar messages [Thu Dec 12 02:25:18 2019][245297.957831] Lustre: fir-OST005a: Connection restored to f5f4e1fb-09a1-cbb0-925c-f94d3727005b (at 10.9.101.44@o2ib4) [Thu Dec 12 02:25:18 2019][245297.968354] Lustre: Skipped 811 previous similar messages [Thu Dec 12 02:25:19 2019][245299.096823] LustreError: dumping log to /tmp/lustre-log.1576146319.67916 [Thu Dec 12 02:25:26 2019][245305.759932] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 02:25:26 2019][245305.768898] Lustre: Skipped 71 previous similar messages [Thu Dec 12 02:25:43 2019][245323.082001] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122212 to 0x1a80000401:1122241 [Thu Dec 12 02:26:33 2019][245372.826286] LustreError: dumping log to /tmp/lustre-log.1576146393.112536 [Thu Dec 12 02:27:17 2019][245417.682987] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3070697 to 0x1800000400:3070721 [Thu Dec 12 02:27:29 2019][245428.780099] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124451 to 0x1900000400:1124481 [Thu Dec 12 02:27:29 2019][245428.958687] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082088 to 0x1a80000402:3082113 [Thu Dec 12 02:27:30 2019][245430.171442] LNet: Service thread pid 112567 was inactive for 1203.28s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 02:27:30 2019][245430.188635] LNet: Skipped 9 previous similar messages [Thu Dec 12 02:27:30 2019][245430.193785] Pid: 112567, comm: ll_ost00_097 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:27:30 2019][245430.204406] Call Trace: [Thu Dec 12 02:27:30 2019][245430.206974] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:27:30 2019][245430.213973] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:27:30 2019][245430.221158] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:27:30 2019][245430.227806] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:27:30 2019][245430.234554] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:27:30 2019][245430.242080] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:27:30 2019][245430.249176] [] dqget+0x3fa/0x450 [Thu Dec 12 02:27:30 2019][245430.254182] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:27:30 2019][245430.259970] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:27:30 2019][245430.267602] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:27:30 2019][245430.274080] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:27:30 2019][245430.280222] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:27:30 2019][245430.287261] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:27:30 2019][245430.295077] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:27:30 2019][245430.301493] [] kthread+0xd1/0xe0 [Thu Dec 12 02:27:30 2019][245430.306507] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:27:30 2019][245430.313070] [] 0xffffffffffffffff [Thu Dec 12 02:27:30 2019][245430.318184] LustreError: dumping log to /tmp/lustre-log.1576146450.112567 [Thu Dec 12 02:27:37 2019][245436.852255] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117514 to 0x1800000402:1117537 [Thu Dec 12 02:27:55 2019][245454.747927] Pid: 66253, comm: ll_ost00_003 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:27:55 2019][245454.758449] Call Trace: [Thu Dec 12 02:27:55 2019][245454.761018] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:27:55 2019][245454.768017] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:27:55 2019][245454.775203] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:27:55 2019][245454.781847] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:27:55 2019][245454.788597] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:27:55 2019][245454.796124] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:27:55 2019][245454.803218] [] dqget+0x3fa/0x450 [Thu Dec 12 02:27:55 2019][245454.808221] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:27:55 2019][245454.813992] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:27:55 2019][245454.821621] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:27:55 2019][245454.828094] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:27:55 2019][245454.834238] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:27:55 2019][245454.841276] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:27:55 2019][245454.849094] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:27:55 2019][245454.855508] [] kthread+0xd1/0xe0 [Thu Dec 12 02:27:55 2019][245454.860508] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:27:55 2019][245454.867070] [] 0xffffffffffffffff [Thu Dec 12 02:27:55 
2019][245454.872182] LustreError: dumping log to /tmp/lustre-log.1576146475.66253 [Thu Dec 12 02:28:07 2019][245466.795979] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089802 to 0x1980000402:3089825 [Thu Dec 12 02:28:44 2019][245503.900901] Pid: 67788, comm: ll_ost01_051 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:28:44 2019][245503.911421] Call Trace: [Thu Dec 12 02:28:44 2019][245503.913974] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 02:28:44 2019][245503.920372] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:28:44 2019][245503.927436] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:28:44 2019][245503.935237] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:28:44 2019][245503.941664] [] kthread+0xd1/0xe0 [Thu Dec 12 02:28:44 2019][245503.946669] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:28:44 2019][245503.953243] [] 0xffffffffffffffff [Thu Dec 12 02:28:44 2019][245503.958353] LustreError: dumping log to /tmp/lustre-log.1576146524.67788 [Thu Dec 12 02:29:16 2019][245535.965256] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800552 to 0x1a80000400:11800577 [Thu Dec 12 02:29:45 2019][245564.735392] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467524 to 0x0:27467553 [Thu Dec 12 02:30:06 2019][245585.822526] Pid: 112550, comm: ll_ost00_091 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:30:06 2019][245585.833129] Call Trace: [Thu Dec 12 02:30:06 2019][245585.835699] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:30:06 2019][245585.842697] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:30:06 2019][245585.849900] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:30:06 2019][245585.856547] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:30:06 2019][245585.863310] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:30:06 2019][245585.870832] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:30:06 2019][245585.877938] [] dqget+0x3fa/0x450 [Thu Dec 12 02:30:06 2019][245585.882927] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:30:06 2019][245585.888701] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:30:06 2019][245585.896319] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:30:06 2019][245585.902791] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:30:06 2019][245585.908935] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:30:06 2019][245585.915967] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:30:06 2019][245585.923796] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:30:06 2019][245585.930206] [] kthread+0xd1/0xe0 [Thu Dec 12 02:30:06 2019][245585.935213] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:30:06 2019][245585.941768] [] 0xffffffffffffffff [Thu Dec 12 02:30:06 2019][245585.946870] LustreError: dumping log to /tmp/lustre-log.1576146606.112550 [Thu Dec 12 02:30:30 2019][245610.399022] Pid: 112552, comm: ll_ost00_092 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:30:30 2019][245610.409630] Call Trace: [Thu Dec 12 02:30:30 2019][245610.412209] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:30:30 2019][245610.419214] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:30:30 2019][245610.426400] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:30:30 2019][245610.433071] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 
12 02:30:30 2019][245610.439845] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:30:30 2019][245610.447390] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:30:30 2019][245610.454489] [] dqget+0x3fa/0x450 [Thu Dec 12 02:30:30 2019][245610.459496] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:30:30 2019][245610.465268] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:30:30 2019][245610.472886] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:30:30 2019][245610.479361] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:30:30 2019][245610.485506] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:30:30 2019][245610.492544] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:30:30 2019][245610.500344] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:30:30 2019][245610.506772] [] kthread+0xd1/0xe0 [Thu Dec 12 02:30:30 2019][245610.511777] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:30:30 2019][245610.518351] [] 0xffffffffffffffff [Thu Dec 12 02:30:30 2019][245610.523444] LustreError: dumping log to /tmp/lustre-log.1576146630.112552 [Thu Dec 12 02:30:55 2019][245634.975512] LNet: Service thread pid 66233 was inactive for 1200.91s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 02:30:55 2019][245634.988547] LNet: Skipped 19 previous similar messages [Thu Dec 12 02:30:55 2019][245634.993785] LustreError: dumping log to /tmp/lustre-log.1576146655.66233 [Thu Dec 12 02:30:59 2019][245639.071601] LustreError: dumping log to /tmp/lustre-log.1576146659.67676 [Thu Dec 12 02:31:29 2019][245668.810889] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125800 to 0x1980000400:1125857 [Thu Dec 12 02:31:37 2019][245677.033595] Lustre: fir-OST005c: deleting orphan objects from 0x0:27192484 to 0x0:27192545 [Thu Dec 12 02:32:00 2019][245700.512816] LustreError: dumping log to /tmp/lustre-log.1576146720.112547 [Thu Dec 12 02:32:17 2019][245716.897172] LustreError: dumping log to /tmp/lustre-log.1576146737.112516 [Thu Dec 12 02:32:25 2019][245725.042542] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506790 to 0x0:27506849 [Thu Dec 12 02:32:33 2019][245733.281464] Pid: 67617, comm: ll_ost03_014 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:32:33 2019][245733.291986] Call Trace: [Thu Dec 12 02:32:33 2019][245733.294537] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 02:32:33 2019][245733.300927] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:32:33 2019][245733.307993] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:32:33 2019][245733.315807] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:32:33 2019][245733.322237] [] kthread+0xd1/0xe0 [Thu Dec 12 02:32:33 2019][245733.327240] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:32:33 2019][245733.333818] [] 0xffffffffffffffff [Thu Dec 12 02:32:33 2019][245733.338927] LustreError: dumping log to /tmp/lustre-log.1576146753.67617 [Thu Dec 12 02:32:41 2019][245741.473756] Pid: 67621, comm: ll_ost00_014 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:32:41 2019][245741.484276] Call Trace: [Thu Dec 12 02:32:41 2019][245741.486845] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:32:41 2019][245741.493852] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:32:41 2019][245741.501053] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:32:41 2019][245741.507702] [] 
jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:32:41 2019][245741.514450] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:32:41 2019][245741.521977] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:32:41 2019][245741.529072] [] dqget+0x3fa/0x450 [Thu Dec 12 02:32:41 2019][245741.534076] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:32:41 2019][245741.539856] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:32:41 2019][245741.547482] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:32:41 2019][245741.553957] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:32:41 2019][245741.560100] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:32:41 2019][245741.567157] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:32:41 2019][245741.575002] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:32:41 2019][245741.581422] [] kthread+0xd1/0xe0 [Thu Dec 12 02:32:41 2019][245741.586423] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:32:41 2019][245741.592990] [] 0xffffffffffffffff [Thu Dec 12 02:32:41 2019][245741.598093] LustreError: dumping log to /tmp/lustre-log.1576146761.67621 [Thu Dec 12 02:33:01 2019][245760.825819] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049000 to 0x1a00000401:3049025 [Thu Dec 12 02:33:06 2019][245766.050105] Pid: 29310, comm: ll_ost00_099 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:33:06 2019][245766.060628] Call Trace: [Thu Dec 12 02:33:06 2019][245766.063194] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:33:06 2019][245766.070193] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:33:06 2019][245766.077382] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:33:06 2019][245766.084032] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:33:06 2019][245766.090769] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:33:06 2019][245766.098309] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:33:06 2019][245766.105391] [] dqget+0x3fa/0x450 [Thu Dec 12 02:33:06 2019][245766.110406] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:33:06 2019][245766.116179] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:33:06 2019][245766.123804] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:33:06 2019][245766.130297] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:33:06 2019][245766.136440] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:33:06 2019][245766.143479] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:33:06 2019][245766.151304] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:33:06 2019][245766.157728] [] kthread+0xd1/0xe0 [Thu Dec 12 02:33:06 2019][245766.162745] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:33:06 2019][245766.169326] [] 0xffffffffffffffff [Thu Dec 12 02:33:06 2019][245766.174437] LustreError: dumping log to /tmp/lustre-log.1576146786.29310 [Thu Dec 12 02:33:23 2019][245782.805115] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785417 to 0x1800000401:11785441 [Thu Dec 12 02:34:03 2019][245822.898922] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 02:34:03 2019][245822.909101] Lustre: Skipped 768 previous similar messages [Thu Dec 12 02:34:38 2019][245858.540627] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105542 to 0x1a00000402:1105569 [Thu Dec 12 02:34:42 2019][245862.459717] Lustre: fir-OST005a: 
deleting orphan objects from 0x1980000401:11842192 to 0x1980000401:11842209 [Thu Dec 12 02:34:52 2019][245872.548215] Pid: 29376, comm: ll_ost00_102 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:34:52 2019][245872.558738] Call Trace: [Thu Dec 12 02:34:52 2019][245872.561311] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:34:52 2019][245872.568308] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:34:52 2019][245872.575506] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:34:52 2019][245872.582157] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:34:52 2019][245872.588905] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:34:52 2019][245872.596431] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:34:52 2019][245872.603528] [] dqget+0x3fa/0x450 [Thu Dec 12 02:34:52 2019][245872.608529] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:34:52 2019][245872.614301] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:34:52 2019][245872.621920] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:34:52 2019][245872.628402] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:34:52 2019][245872.634549] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:34:52 2019][245872.641584] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:34:52 2019][245872.649408] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:34:52 2019][245872.655824] [] kthread+0xd1/0xe0 [Thu Dec 12 02:34:52 2019][245872.660841] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:34:52 2019][245872.667418] [] 0xffffffffffffffff [Thu Dec 12 02:34:52 2019][245872.672535] LustreError: dumping log to /tmp/lustre-log.1576146892.29376 [Thu Dec 12 02:34:55 2019][245874.826272] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/-150), not sending early reply [Thu Dec 12 02:34:55 2019][245874.826272] req@ffff88f42c6cb050 x1651382341323616/t0(0) o4->4cd291bb-d8c0-256c-78b1-5ae56b16acd9@10.9.107.53@o2ib4:595/0 lens 6640/0 e 0 to 0 dl 1576146900 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 02:34:55 2019][245874.855451] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1556 previous similar messages [Thu Dec 12 02:35:17 2019][245897.124688] Pid: 67787, comm: ll_ost00_047 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:35:17 2019][245897.135230] Call Trace: [Thu Dec 12 02:35:17 2019][245897.137818] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:35:17 2019][245897.144825] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:35:17 2019][245897.152059] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:35:17 2019][245897.158708] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:35:17 2019][245897.165457] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:35:17 2019][245897.172981] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:35:17 2019][245897.180079] [] dqget+0x3fa/0x450 [Thu Dec 12 02:35:17 2019][245897.185081] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:35:17 2019][245897.190853] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:35:17 2019][245897.198478] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:35:17 2019][245897.204953] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:35:17 2019][245897.211096] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:35:17 2019][245897.218151] [] ptlrpc_server_handle_request+0x24b/0xab0 
[ptlrpc] [Thu Dec 12 02:35:17 2019][245897.225976] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:35:17 2019][245897.232394] [] kthread+0xd1/0xe0 [Thu Dec 12 02:35:17 2019][245897.237409] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:35:17 2019][245897.243973] [] 0xffffffffffffffff [Thu Dec 12 02:35:17 2019][245897.249084] LustreError: dumping log to /tmp/lustre-log.1576146917.67787 [Thu Dec 12 02:35:19 2019][245899.157848] Lustre: fir-OST005a: Connection restored to f5f4e1fb-09a1-cbb0-925c-f94d3727005b (at 10.9.101.44@o2ib4) [Thu Dec 12 02:35:19 2019][245899.168376] Lustre: Skipped 822 previous similar messages [Thu Dec 12 02:35:28 2019][245907.883442] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 02:35:28 2019][245907.892414] Lustre: Skipped 50 previous similar messages [Thu Dec 12 02:35:42 2019][245921.701149] LustreError: dumping log to /tmp/lustre-log.1576146941.29351 [Thu Dec 12 02:36:21 2019][245960.741777] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797513 to 0x1900000401:11797537 [Thu Dec 12 02:36:24 2019][245963.829873] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3085992 to 0x1900000402:3086017 [Thu Dec 12 02:37:08 2019][246008.504171] Lustre: fir-OST005a: deleting orphan objects from 0x0:27562243 to 0x0:27562273 [Thu Dec 12 02:37:28 2019][246028.199246] LustreError: dumping log to /tmp/lustre-log.1576147048.67719 [Thu Dec 12 02:37:30 2019][246030.266665] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082121 to 0x1a80000402:3082145 [Thu Dec 12 02:37:53 2019][246052.775736] LNet: Service thread pid 67830 was inactive for 1201.66s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 02:37:53 2019][246052.792876] LNet: Skipped 9 previous similar messages [Thu Dec 12 02:37:53 2019][246052.798018] Pid: 67830, comm: ll_ost00_054 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:37:53 2019][246052.808556] Call Trace: [Thu Dec 12 02:37:53 2019][246052.811129] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:37:53 2019][246052.818121] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:37:53 2019][246052.825305] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:37:53 2019][246052.831954] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:37:53 2019][246052.838703] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:37:53 2019][246052.846227] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:37:53 2019][246052.853333] [] dqget+0x3fa/0x450 [Thu Dec 12 02:37:53 2019][246052.858337] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:37:53 2019][246052.864125] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:37:53 2019][246052.871791] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:37:53 2019][246052.878287] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:37:53 2019][246052.884431] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:37:53 2019][246052.891479] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:37:53 2019][246052.899294] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:37:53 2019][246052.905711] [] kthread+0xd1/0xe0 [Thu Dec 12 02:37:53 2019][246052.910733] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:37:53 2019][246052.917297] [] 0xffffffffffffffff [Thu Dec 12 02:37:53 2019][246052.922402] LustreError: dumping log to 
/tmp/lustre-log.1576147073.67830 [Thu Dec 12 02:38:08 2019][246067.999841] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089828 to 0x1980000402:3089857 [Thu Dec 12 02:38:19 2019][246079.304980] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122243 to 0x1a80000401:1122273 [Thu Dec 12 02:38:25 2019][246084.824358] LustreError: 67733:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576146805, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0054_UUID lock: ffff89070c2b1d40/0x7066c9c190b583dc lrc: 3/0,1 mode: --/PW res: [0x1800000401:0xb3d4c6:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67733 timeout: 0 lvb_type: 0 [Thu Dec 12 02:38:25 2019][246084.868098] LustreError: 67733:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 6 previous similar messages [Thu Dec 12 02:39:53 2019][246173.105904] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3070728 to 0x1800000400:3070753 [Thu Dec 12 02:40:04 2019][246183.850312] Pid: 29356, comm: ll_ost00_101 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:40:04 2019][246183.860852] Call Trace: [Thu Dec 12 02:40:04 2019][246183.863430] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:40:04 2019][246183.870444] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:40:04 2019][246183.877641] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:40:04 2019][246183.884318] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:40:04 2019][246183.891076] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:40:04 2019][246183.898612] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:40:04 2019][246183.905715] [] dqget+0x3fa/0x450 [Thu Dec 12 02:40:04 2019][246183.910742] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:40:04 2019][246183.916517] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:40:04 2019][246183.924160] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:40:04 2019][246183.930635] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:40:04 2019][246183.936777] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:40:04 2019][246183.943865] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:40:04 2019][246183.951693] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:40:04 2019][246183.958125] [] kthread+0xd1/0xe0 [Thu Dec 12 02:40:04 2019][246183.963134] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:40:04 2019][246183.969720] [] 0xffffffffffffffff [Thu Dec 12 02:40:04 2019][246183.974839] LustreError: dumping log to /tmp/lustre-log.1576147204.29356 [Thu Dec 12 02:40:04 2019][246184.119327] LustreError: 112542:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576146904, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005e_UUID lock: ffff8906eebcec00/0x7066c9c190b5889e lrc: 3/0,1 mode: --/PW res: [0x1a31f22:0x0:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112542 timeout: 0 lvb_type: 0 [Thu Dec 12 02:40:05 2019][246184.899052] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124486 to 0x1900000400:1124513 [Thu Dec 12 02:40:13 2019][246192.931229] Lustre: fir-OST0054: deleting orphan objects from 
0x1800000402:1117544 to 0x1800000402:1117569 [Thu Dec 12 02:40:28 2019][246208.426796] Pid: 112568, comm: ll_ost00_098 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:40:28 2019][246208.437400] Call Trace: [Thu Dec 12 02:40:28 2019][246208.439967] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:40:28 2019][246208.446962] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:40:28 2019][246208.454154] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:40:28 2019][246208.460800] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:40:28 2019][246208.467551] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:40:28 2019][246208.475075] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:40:28 2019][246208.482171] [] dqget+0x3fa/0x450 [Thu Dec 12 02:40:28 2019][246208.487175] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:40:28 2019][246208.492947] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:40:28 2019][246208.500589] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:40:28 2019][246208.507063] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:40:28 2019][246208.513204] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:40:28 2019][246208.520246] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:40:28 2019][246208.528063] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:40:28 2019][246208.534477] [] kthread+0xd1/0xe0 [Thu Dec 12 02:40:28 2019][246208.539491] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:40:28 2019][246208.546056] [] 0xffffffffffffffff [Thu Dec 12 02:40:28 2019][246208.551167] LustreError: dumping log to /tmp/lustre-log.1576147228.112568 [Thu Dec 12 02:40:49 2019][246228.907208] Pid: 67600, comm: ll_ost03_009 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:40:49 2019][246228.917723] Call Trace: [Thu Dec 12 02:40:49 2019][246228.920277] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:40:49 2019][246228.927308] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:40:49 2019][246228.934609] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 02:40:49 2019][246228.941252] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:40:49 2019][246228.947655] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:40:49 2019][246228.954688] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:40:49 2019][246228.962504] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:40:49 2019][246228.968918] [] kthread+0xd1/0xe0 [Thu Dec 12 02:40:49 2019][246228.973918] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:40:49 2019][246228.980515] [] 0xffffffffffffffff [Thu Dec 12 02:40:49 2019][246228.985621] LustreError: dumping log to /tmp/lustre-log.1576147249.67600 [Thu Dec 12 02:40:49 2019][246228.993077] Pid: 67786, comm: ll_ost03_048 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:40:49 2019][246229.003628] Call Trace: [Thu Dec 12 02:40:49 2019][246229.006173] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:40:49 2019][246229.013212] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:40:49 2019][246229.020487] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 02:40:49 2019][246229.027147] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:40:49 2019][246229.033538] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:40:49 2019][246229.040575] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 
12 02:40:49 2019][246229.048408] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:40:49 2019][246229.054839] [] kthread+0xd1/0xe0 [Thu Dec 12 02:40:49 2019][246229.059833] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:40:49 2019][246229.066424] [] 0xffffffffffffffff [Thu Dec 12 02:41:21 2019][246261.675847] LNet: Service thread pid 112562 was inactive for 1203.66s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 02:41:22 2019][246261.688968] LNet: Skipped 11 previous similar messages [Thu Dec 12 02:41:22 2019][246261.694207] LustreError: dumping log to /tmp/lustre-log.1576147281.112562 [Thu Dec 12 02:41:30 2019][246270.288449] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125862 to 0x1980000400:1125889 [Thu Dec 12 02:41:52 2019][246292.084197] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800595 to 0x1a80000400:11800641 [Thu Dec 12 02:42:19 2019][246319.021009] LustreError: dumping log to /tmp/lustre-log.1576147339.29390 [Thu Dec 12 02:42:21 2019][246320.910380] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467558 to 0x0:27467585 [Thu Dec 12 02:42:43 2019][246343.597501] LustreError: dumping log to /tmp/lustre-log.1576147363.29525 [Thu Dec 12 02:43:04 2019][246364.077899] Pid: 29512, comm: ll_ost00_109 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:43:04 2019][246364.088416] Call Trace: [Thu Dec 12 02:43:04 2019][246364.090986] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:43:04 2019][246364.097983] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:43:04 2019][246364.105185] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:43:04 2019][246364.111850] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:43:04 2019][246364.118617] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:43:04 2019][246364.126144] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:43:04 2019][246364.133238] [] dqget+0x3fa/0x450 [Thu Dec 12 02:43:04 2019][246364.138259] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:43:04 2019][246364.144040] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:43:04 2019][246364.151663] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:43:04 2019][246364.158141] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:43:04 2019][246364.164282] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:43:04 2019][246364.171324] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:43:04 2019][246364.179152] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:43:04 2019][246364.185571] [] kthread+0xd1/0xe0 [Thu Dec 12 02:43:04 2019][246364.190587] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:43:04 2019][246364.197151] [] 0xffffffffffffffff [Thu Dec 12 02:43:04 2019][246364.202263] LustreError: dumping log to /tmp/lustre-log.1576147384.29512 [Thu Dec 12 02:43:33 2019][246392.750468] Pid: 67751, comm: ll_ost02_037 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:43:33 2019][246392.760987] Call Trace: [Thu Dec 12 02:43:33 2019][246392.763539] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 02:43:33 2019][246392.769929] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:43:33 2019][246392.776985] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:43:33 2019][246392.784811] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:43:33 2019][246392.791247] [] kthread+0xd1/0xe0 [Thu Dec 12 02:43:33 
2019][246392.796251] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:43:33 2019][246392.802835] [] 0xffffffffffffffff [Thu Dec 12 02:43:33 2019][246392.807954] LustreError: dumping log to /tmp/lustre-log.1576147413.67751 [Thu Dec 12 02:43:33 2019][246392.815433] Pid: 29402, comm: ll_ost00_105 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:43:33 2019][246392.825994] Call Trace: [Thu Dec 12 02:43:33 2019][246392.828546] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:43:33 2019][246392.835562] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:43:33 2019][246392.842741] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:43:33 2019][246392.849433] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:43:33 2019][246392.856165] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:43:33 2019][246392.863720] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:43:33 2019][246392.870819] [] dqget+0x3fa/0x450 [Thu Dec 12 02:43:33 2019][246392.875836] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:43:33 2019][246392.881613] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:43:33 2019][246392.889244] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:43:33 2019][246392.895720] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:43:33 2019][246392.901880] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:43:33 2019][246392.908928] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:43:33 2019][246392.916767] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:43:33 2019][246392.923186] [] kthread+0xd1/0xe0 [Thu Dec 12 02:43:33 2019][246392.928219] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:43:33 2019][246392.934773] [] 0xffffffffffffffff [Thu Dec 12 02:44:02 2019][246422.711081] LustreError: 112551:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576147142, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0058_UUID lock: ffff891d57f5d100/0x7066c9c190b59785 lrc: 3/0,1 mode: --/PW res: [0x1900000402:0x2f1685:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112551 timeout: 0 lvb_type: 0 [Thu Dec 12 02:44:04 2019][246423.926791] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 02:44:04 2019][246423.936960] Lustre: Skipped 970 previous similar messages [Thu Dec 12 02:44:14 2019][246433.812543] Lustre: fir-OST005c: deleting orphan objects from 0x0:27192548 to 0x0:27192577 [Thu Dec 12 02:44:39 2019][246458.920492] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105575 to 0x1a00000402:1105601 [Thu Dec 12 02:44:54 2019][246474.672103] Pid: 29401, comm: ll_ost00_104 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:44:54 2019][246474.682621] Call Trace: [Thu Dec 12 02:44:54 2019][246474.685198] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:44:54 2019][246474.692193] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:44:54 2019][246474.699370] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:44:54 2019][246474.706015] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:44:54 2019][246474.712769] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:44:55 2019][246474.720290] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:44:55 
2019][246474.727387] [] dqget+0x3fa/0x450 [Thu Dec 12 02:44:55 2019][246474.732390] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:44:55 2019][246474.738177] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:44:55 2019][246474.745790] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:44:55 2019][246474.752294] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:44:55 2019][246474.758426] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:44:55 2019][246474.765468] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:44:55 2019][246474.773271] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:44:55 2019][246474.779700] [] kthread+0xd1/0xe0 [Thu Dec 12 02:44:55 2019][246474.784696] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:44:55 2019][246474.791271] [] 0xffffffffffffffff [Thu Dec 12 02:44:55 2019][246474.796371] LustreError: dumping log to /tmp/lustre-log.1576147495.29401 [Thu Dec 12 02:44:55 2019][246474.900106] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 02:44:55 2019][246474.900106] req@ffff891136168850 x1648527343518128/t0(0) o10->1d4d1153-82cd-6bbc-4932-1e6a2a506ca0@10.8.30.27@o2ib6:440/0 lens 440/0 e 0 to 0 dl 1576147500 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 02:44:55 2019][246474.928936] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1785 previous similar messages [Thu Dec 12 02:45:01 2019][246481.169622] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506852 to 0x0:27506881 [Thu Dec 12 02:45:11 2019][246491.056498] Pid: 67667, comm: ll_ost03_022 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:45:11 2019][246491.067018] Call Trace: [Thu Dec 12 02:45:11 2019][246491.069568] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 02:45:11 2019][246491.075969] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:45:11 2019][246491.083035] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:45:11 2019][246491.090841] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:45:11 2019][246491.097256] [] kthread+0xd1/0xe0 [Thu Dec 12 02:45:11 2019][246491.102256] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:45:11 2019][246491.108849] [] 0xffffffffffffffff [Thu Dec 12 02:45:11 2019][246491.113959] LustreError: dumping log to /tmp/lustre-log.1576147511.67667 [Thu Dec 12 02:45:19 2019][246499.248574] LustreError: dumping log to /tmp/lustre-log.1576147519.29501 [Thu Dec 12 02:45:20 2019][246500.060606] Lustre: fir-OST005a: Connection restored to f5f4e1fb-09a1-cbb0-925c-f94d3727005b (at 10.9.101.44@o2ib4) [Thu Dec 12 02:45:20 2019][246500.071124] Lustre: Skipped 918 previous similar messages [Thu Dec 12 02:45:30 2019][246510.007219] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 02:45:30 2019][246510.016186] Lustre: Skipped 51 previous similar messages [Thu Dec 12 02:45:34 2019][246514.336620] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785450 to 0x1800000401:11785473 [Thu Dec 12 02:45:37 2019][246516.856686] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049034 to 0x1a00000401:3049057 [Thu Dec 12 02:46:29 2019][246568.881964] LustreError: dumping log to /tmp/lustre-log.1576147589.67836 [Thu Dec 12 02:46:33 2019][246572.978057] LustreError: dumping log to /tmp/lustre-log.1576147593.67693 [Thu Dec 12 02:47:18 2019][246617.946639] Lustre: fir-OST005a: deleting orphan objects from 
0x1980000401:11842213 to 0x1980000401:11842241 [Thu Dec 12 02:47:31 2019][246630.918468] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082149 to 0x1a80000402:3082177 [Thu Dec 12 02:48:09 2019][246668.891693] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089859 to 0x1980000402:3089889 [Thu Dec 12 02:48:58 2019][246718.044603] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797546 to 0x1900000401:11797569 [Thu Dec 12 02:49:00 2019][246719.892732] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086025 to 0x1900000402:3086049 [Thu Dec 12 02:49:44 2019][246764.463124] Lustre: fir-OST005a: deleting orphan objects from 0x0:27562243 to 0x0:27562305 [Thu Dec 12 02:50:06 2019][246785.974255] LNet: Service thread pid 67799 was inactive for 1203.85s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 02:50:06 2019][246785.991385] LNet: Skipped 9 previous similar messages [Thu Dec 12 02:50:06 2019][246785.996533] Pid: 67799, comm: ll_ost02_044 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:50:06 2019][246786.007076] Call Trace: [Thu Dec 12 02:50:06 2019][246786.009626] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 02:50:06 2019][246786.016019] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:50:06 2019][246786.023080] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:50:06 2019][246786.030882] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:50:06 2019][246786.037311] [] kthread+0xd1/0xe0 [Thu Dec 12 02:50:06 2019][246786.042330] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:50:06 2019][246786.048909] [] 0xffffffffffffffff [Thu Dec 12 02:50:06 2019][246786.054016] LustreError: dumping log to /tmp/lustre-log.1576147806.67799 [Thu Dec 12 02:50:55 2019][246834.975909] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122275 to 0x1a80000401:1122305 [Thu Dec 12 02:51:31 2019][246870.940342] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125891 to 0x1980000400:1125921 [Thu Dec 12 02:52:03 2019][246903.620592] LustreError: 112530:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576147623, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0054_UUID lock: ffff88fef6f3fbc0/0x7066c9c190b5adc7 lrc: 3/0,1 mode: --/PW res: [0x1a2f979:0x0:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112530 timeout: 0 lvb_type: 0 [Thu Dec 12 02:52:29 2019][246928.984866] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3070765 to 0x1800000400:3070785 [Thu Dec 12 02:52:41 2019][246941.306057] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124515 to 0x1900000400:1124545 [Thu Dec 12 02:52:49 2019][246949.090154] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117571 to 0x1800000402:1117601 [Thu Dec 12 02:53:26 2019][246986.682221] Pid: 67733, comm: ll_ost02_033 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:53:26 2019][246986.692745] Call Trace: [Thu Dec 12 02:53:26 2019][246986.695302] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:53:26 2019][246986.702333] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:53:26 2019][246986.709621] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 
02:53:26 2019][246986.716267] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:53:26 2019][246986.722670] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:53:27 2019][246986.729713] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:53:27 2019][246986.737542] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:53:27 2019][246986.743959] [] kthread+0xd1/0xe0 [Thu Dec 12 02:53:27 2019][246986.748961] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:53:27 2019][246986.755529] [] 0xffffffffffffffff [Thu Dec 12 02:53:27 2019][246986.760636] LustreError: dumping log to /tmp/lustre-log.1576148007.67733 [Thu Dec 12 02:53:55 2019][247015.354787] Pid: 112512, comm: ll_ost01_079 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:53:55 2019][247015.365398] Call Trace: [Thu Dec 12 02:53:55 2019][247015.367960] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 02:53:55 2019][247015.374358] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:53:55 2019][247015.381426] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:53:55 2019][247015.389261] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:53:55 2019][247015.395693] [] kthread+0xd1/0xe0 [Thu Dec 12 02:53:55 2019][247015.400697] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:53:55 2019][247015.407271] [] 0xffffffffffffffff [Thu Dec 12 02:53:55 2019][247015.412380] LustreError: dumping log to /tmp/lustre-log.1576148035.112512 [Thu Dec 12 02:54:04 2019][247024.547823] Lustre: fir-OST0058: Client ec8d663e-70c3-0c7c-9511-dfaaba3f32c1 (at 10.9.104.45@o2ib4) reconnecting [Thu Dec 12 02:54:04 2019][247024.558087] Lustre: Skipped 1002 previous similar messages [Thu Dec 12 02:54:29 2019][247049.059147] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800646 to 0x1a80000400:11800673 [Thu Dec 12 02:54:40 2019][247060.276417] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105603 to 0x1a00000402:1105633 [Thu Dec 12 02:54:56 2019][247076.236003] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 02:54:56 2019][247076.236003] req@ffff88f75c15f850 x1648527343518128/t0(0) o10->1d4d1153-82cd-6bbc-4932-1e6a2a506ca0@10.8.30.27@o2ib6:286/0 lens 440/0 e 0 to 0 dl 1576148101 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 02:54:56 2019][247076.264812] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1998 previous similar messages [Thu Dec 12 02:54:57 2019][247076.781289] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467558 to 0x0:27467617 [Thu Dec 12 02:55:05 2019][247084.988154] Pid: 112542, comm: ll_ost02_082 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:55:05 2019][247084.998762] Call Trace: [Thu Dec 12 02:55:05 2019][247085.001322] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:55:05 2019][247085.008364] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:55:05 2019][247085.015660] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 02:55:05 2019][247085.022306] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:55:05 2019][247085.028710] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:55:05 2019][247085.035766] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:55:05 2019][247085.043582] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:55:05 2019][247085.049999] [] kthread+0xd1/0xe0 [Thu Dec 12 02:55:05 2019][247085.054999] [] 
ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:55:05 2019][247085.061566] [] 0xffffffffffffffff [Thu Dec 12 02:55:05 2019][247085.066675] LustreError: dumping log to /tmp/lustre-log.1576148105.112542 [Thu Dec 12 02:55:21 2019][247101.357552] Lustre: fir-OST005a: Connection restored to f5f4e1fb-09a1-cbb0-925c-f94d3727005b (at 10.9.101.44@o2ib4) [Thu Dec 12 02:55:21 2019][247101.368075] Lustre: Skipped 1025 previous similar messages [Thu Dec 12 02:55:32 2019][247112.131401] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 02:55:32 2019][247112.140368] Lustre: Skipped 49 previous similar messages [Thu Dec 12 02:56:10 2019][247150.525464] Pid: 67919, comm: ll_ost02_063 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:56:10 2019][247150.535984] Call Trace: [Thu Dec 12 02:56:10 2019][247150.538537] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 02:56:10 2019][247150.544925] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:56:10 2019][247150.551990] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:56:10 2019][247150.559798] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:56:10 2019][247150.566236] [] kthread+0xd1/0xe0 [Thu Dec 12 02:56:10 2019][247150.571265] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:56:10 2019][247150.577849] [] 0xffffffffffffffff [Thu Dec 12 02:56:10 2019][247150.582967] LustreError: dumping log to /tmp/lustre-log.1576148170.67919 [Thu Dec 12 02:56:51 2019][247191.071544] Lustre: fir-OST005c: deleting orphan objects from 0x0:27192579 to 0x0:27192609 [Thu Dec 12 02:57:32 2019][247232.706360] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082184 to 0x1a80000402:3082209 [Thu Dec 12 02:57:37 2019][247237.376466] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506884 to 0x0:27506913 [Thu Dec 12 02:57:44 2019][247244.735321] Pid: 67847, comm: ll_ost03_055 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:57:45 2019][247244.745840] Call Trace: [Thu Dec 12 02:57:45 2019][247244.748401] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 02:57:45 2019][247244.754797] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:57:45 2019][247244.761865] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:57:45 2019][247244.769672] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:57:45 2019][247244.776098] [] kthread+0xd1/0xe0 [Thu Dec 12 02:57:45 2019][247244.781101] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:57:45 2019][247244.787677] [] 0xffffffffffffffff [Thu Dec 12 02:57:45 2019][247244.792787] LustreError: dumping log to /tmp/lustre-log.1576148265.67847 [Thu Dec 12 02:58:10 2019][247270.703510] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785478 to 0x1800000401:11785505 [Thu Dec 12 02:58:11 2019][247270.711576] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089895 to 0x1980000402:3089921 [Thu Dec 12 02:58:13 2019][247272.791635] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049065 to 0x1a00000401:3049089 [Thu Dec 12 02:59:02 2019][247322.560844] Pid: 67685, comm: ll_ost02_028 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:59:02 2019][247322.571362] Call Trace: [Thu Dec 12 02:59:02 2019][247322.573938] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:59:02 2019][247322.580945] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:59:02 
2019][247322.588142] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:59:02 2019][247322.594796] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:59:02 2019][247322.601544] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:59:02 2019][247322.609070] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:59:02 2019][247322.616165] [] dqget+0x3fa/0x450 [Thu Dec 12 02:59:02 2019][247322.621169] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:59:02 2019][247322.626940] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:59:02 2019][247322.634558] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:59:02 2019][247322.641033] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:59:02 2019][247322.647188] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:59:02 2019][247322.654242] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:59:02 2019][247322.662065] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:59:02 2019][247322.668482] [] kthread+0xd1/0xe0 [Thu Dec 12 02:59:02 2019][247322.673495] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:59:02 2019][247322.680060] [] 0xffffffffffffffff [Thu Dec 12 02:59:02 2019][247322.685173] LustreError: dumping log to /tmp/lustre-log.1576148342.67685 [Thu Dec 12 02:59:06 2019][247326.656918] Pid: 112551, comm: ll_ost03_075 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:59:06 2019][247326.667521] Call Trace: [Thu Dec 12 02:59:06 2019][247326.670082] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:59:06 2019][247326.677121] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:59:06 2019][247326.684418] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 02:59:06 2019][247326.691066] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:59:06 2019][247326.697469] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:59:06 2019][247326.704508] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:59:06 2019][247326.712326] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:59:06 2019][247326.718740] [] kthread+0xd1/0xe0 [Thu Dec 12 02:59:06 2019][247326.723741] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:59:07 2019][247326.730311] [] 0xffffffffffffffff [Thu Dec 12 02:59:07 2019][247326.735419] LustreError: dumping log to /tmp/lustre-log.1576148346.112551 [Thu Dec 12 02:59:15 2019][247334.849075] Pid: 67910, comm: ll_ost02_061 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:59:15 2019][247334.859594] Call Trace: [Thu Dec 12 02:59:15 2019][247334.862164] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:59:15 2019][247334.869177] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:59:15 2019][247334.876368] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:59:15 2019][247334.883023] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:59:15 2019][247334.889794] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:59:15 2019][247334.897324] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:59:15 2019][247334.904435] [] dqget+0x3fa/0x450 [Thu Dec 12 02:59:15 2019][247334.909474] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:59:15 2019][247334.915277] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:59:15 2019][247334.922900] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:59:15 2019][247334.929395] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:59:15 2019][247334.935531] [] 
tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:59:15 2019][247334.942593] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:59:15 2019][247334.950405] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:59:15 2019][247334.956849] [] kthread+0xd1/0xe0 [Thu Dec 12 02:59:15 2019][247334.961853] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:59:15 2019][247334.968438] [] 0xffffffffffffffff [Thu Dec 12 02:59:15 2019][247334.973552] LustreError: dumping log to /tmp/lustre-log.1576148355.67910 [Thu Dec 12 02:59:39 2019][247359.666862] Lustre: fir-OST0054: deleting orphan objects from 0x0:27457914 to 0x0:27457953 [Thu Dec 12 02:59:47 2019][247367.617723] LNet: Service thread pid 67708 was inactive for 1202.29s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 02:59:47 2019][247367.630763] LNet: Skipped 5 previous similar messages [Thu Dec 12 02:59:47 2019][247367.635910] LustreError: dumping log to /tmp/lustre-log.1576148387.67708 [Thu Dec 12 02:59:54 2019][247374.265558] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842243 to 0x1980000401:11842273 [Thu Dec 12 03:01:33 2019][247473.132475] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125925 to 0x1980000400:1125953 [Thu Dec 12 03:01:34 2019][247474.491521] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797577 to 0x1900000401:11797601 [Thu Dec 12 03:01:36 2019][247475.835612] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086059 to 0x1900000402:3086081 [Thu Dec 12 03:02:20 2019][247520.054007] Lustre: fir-OST005a: deleting orphan objects from 0x0:27562243 to 0x0:27562337 [Thu Dec 12 03:02:23 2019][247522.773765] LustreError: 112569:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576148243, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0058_UUID lock: ffff891d57f5a880/0x7066c9c190b5cad1 lrc: 3/0,1 mode: --/PW res: [0x1900000402:0x2f16c8:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112569 timeout: 0 lvb_type: 0 [Thu Dec 12 03:02:23 2019][247522.817652] LustreError: 112569:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 2 previous similar messages [Thu Dec 12 03:02:39 2019][247539.653082] LNet: Service thread pid 113351 was inactive for 1200.50s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 03:02:39 2019][247539.670277] LNet: Skipped 8 previous similar messages [Thu Dec 12 03:02:39 2019][247539.675419] Pid: 113351, comm: ll_ost02_089 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:02:39 2019][247539.686038] Call Trace: [Thu Dec 12 03:02:39 2019][247539.688590] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:02:39 2019][247539.694975] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:02:39 2019][247539.702037] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:02:39 2019][247539.709855] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:02:39 2019][247539.716286] [] kthread+0xd1/0xe0 [Thu Dec 12 03:02:40 2019][247539.721288] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:02:40 2019][247539.727863] [] 0xffffffffffffffff [Thu Dec 12 03:02:40 2019][247539.732973] LustreError: dumping log to /tmp/lustre-log.1576148559.113351 [Thu Dec 12 03:03:31 2019][247590.926789] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122310 to 0x1a80000401:1122337 [Thu Dec 12 03:04:04 2019][247624.714508] Lustre: fir-OST005e: Client 7126efc2-9676-1db9-94d0-ae09c1520697 (at 10.9.101.26@o2ib4) reconnecting [Thu Dec 12 03:04:04 2019][247624.724769] Lustre: Skipped 990 previous similar messages [Thu Dec 12 03:04:41 2019][247661.424196] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105638 to 0x1a00000402:1105665 [Thu Dec 12 03:04:58 2019][247677.983820] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 03:04:58 2019][247677.983820] req@ffff8901ad32e850 x1648527343229280/t0(0) o4->1d4d1153-82cd-6bbc-4932-1e6a2a506ca0@10.8.30.27@o2ib6:133/0 lens 488/0 e 0 to 0 dl 1576148703 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 03:04:58 2019][247678.012544] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1962 previous similar messages [Thu Dec 12 03:05:05 2019][247684.935728] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3070796 to 0x1800000400:3070817 [Thu Dec 12 03:05:17 2019][247697.000874] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124549 to 0x1900000400:1124577 [Thu Dec 12 03:05:21 2019][247701.489301] Lustre: fir-OST0058: Connection restored to 635a05c8-c7a3-e96d-15e7-653531254cf2 (at 10.9.110.38@o2ib4) [Thu Dec 12 03:05:21 2019][247701.499823] Lustre: Skipped 999 previous similar messages [Thu Dec 12 03:05:25 2019][247705.321033] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117606 to 0x1800000402:1117633 [Thu Dec 12 03:05:53 2019][247733.385421] Lustre: fir-OST0056: Export ffff8902f7fafc00 already connecting from 10.8.22.24@o2ib6 [Thu Dec 12 03:05:53 2019][247733.394387] Lustre: Skipped 46 previous similar messages [Thu Dec 12 03:06:33 2019][247773.129679] Pid: 67766, comm: ll_ost01_048 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:06:33 2019][247773.140199] Call Trace: [Thu Dec 12 03:06:33 2019][247773.142751] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:06:33 2019][247773.149149] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:06:33 2019][247773.156214] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:06:33 2019][247773.164014] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:06:33 2019][247773.170441] [] kthread+0xd1/0xe0 [Thu Dec 12 03:06:33 2019][247773.175446] [] ret_from_fork_nospec_begin+0xe/0x21 
[Thu Dec 12 03:06:33 2019][247773.182020] [] 0xffffffffffffffff [Thu Dec 12 03:06:33 2019][247773.187132] LustreError: dumping log to /tmp/lustre-log.1576148793.67766 [Thu Dec 12 03:07:05 2019][247805.378019] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800675 to 0x1a80000400:11800705 [Thu Dec 12 03:07:06 2019][247805.898326] Pid: 112530, comm: ll_ost02_079 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:07:06 2019][247805.908938] Call Trace: [Thu Dec 12 03:07:06 2019][247805.911496] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 03:07:06 2019][247805.918529] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 03:07:06 2019][247805.925827] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 03:07:06 2019][247805.932474] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 03:07:06 2019][247805.938879] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:07:06 2019][247805.945924] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:07:06 2019][247805.953726] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:07:06 2019][247805.960154] [] kthread+0xd1/0xe0 [Thu Dec 12 03:07:06 2019][247805.965156] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:07:06 2019][247805.971717] [] 0xffffffffffffffff [Thu Dec 12 03:07:06 2019][247805.976838] LustreError: dumping log to /tmp/lustre-log.1576148826.112530 [Thu Dec 12 03:07:33 2019][247832.820181] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467620 to 0x0:27467649 [Thu Dec 12 03:07:34 2019][247833.886155] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082211 to 0x1a80000402:3082241 [Thu Dec 12 03:08:11 2019][247871.683423] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089923 to 0x1980000402:3089953 [Thu Dec 12 03:08:48 2019][247908.300344] Pid: 67645, comm: ll_ost02_019 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:08:48 2019][247908.310862] Call Trace: [Thu Dec 12 03:08:48 2019][247908.313422] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:08:48 2019][247908.319821] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:08:48 2019][247908.326884] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:08:48 2019][247908.334685] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:08:48 2019][247908.341114] [] kthread+0xd1/0xe0 [Thu Dec 12 03:08:48 2019][247908.346117] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:08:48 2019][247908.352677] [] 0xffffffffffffffff [Thu Dec 12 03:08:48 2019][247908.357786] LustreError: dumping log to /tmp/lustre-log.1576148928.67645 [Thu Dec 12 03:09:27 2019][247946.822431] Lustre: fir-OST005c: deleting orphan objects from 0x0:27192611 to 0x0:27192641 [Thu Dec 12 03:10:13 2019][247993.199331] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506916 to 0x0:27506945 [Thu Dec 12 03:10:22 2019][248002.510245] Pid: 67872, comm: ll_ost03_062 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:10:22 2019][248002.520766] Call Trace: [Thu Dec 12 03:10:22 2019][248002.523316] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:10:22 2019][248002.529716] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:10:22 2019][248002.536780] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:10:22 2019][248002.544579] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:10:22 2019][248002.551010] [] kthread+0xd1/0xe0 [Thu Dec 12 03:10:22 2019][248002.556011] [] 
ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:10:22 2019][248002.562596] [] 0xffffffffffffffff [Thu Dec 12 03:10:22 2019][248002.567706] LustreError: dumping log to /tmp/lustre-log.1576149022.67872 [Thu Dec 12 03:10:35 2019][248014.798469] Pid: 67580, comm: ll_ost03_006 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:10:35 2019][248014.808999] Call Trace: [Thu Dec 12 03:10:35 2019][248014.811570] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 03:10:35 2019][248014.818598] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 03:10:35 2019][248014.825933] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 03:10:35 2019][248014.832611] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 03:10:35 2019][248014.839015] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:10:35 2019][248014.846060] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:10:35 2019][248014.853911] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:10:35 2019][248014.860329] [] kthread+0xd1/0xe0 [Thu Dec 12 03:10:35 2019][248014.865347] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:10:35 2019][248014.871938] [] 0xffffffffffffffff [Thu Dec 12 03:10:35 2019][248014.877057] LustreError: dumping log to /tmp/lustre-log.1576149035.67580 [Thu Dec 12 03:10:46 2019][248025.958371] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785511 to 0x1800000401:11785537 [Thu Dec 12 03:10:49 2019][248029.062530] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049095 to 0x1a00000401:3049121 [Thu Dec 12 03:11:34 2019][248074.132080] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125956 to 0x1980000400:1125985 [Thu Dec 12 03:12:15 2019][248115.177715] Lustre: fir-OST0054: deleting orphan objects from 0x0:27457956 to 0x0:27457985 [Thu Dec 12 03:12:30 2019][248130.264457] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842276 to 0x1980000401:11842305 [Thu Dec 12 03:14:05 2019][248224.919314] Lustre: fir-OST005e: Client 7126efc2-9676-1db9-94d0-ae09c1520697 (at 10.9.101.26@o2ib4) reconnecting [Thu Dec 12 03:14:05 2019][248224.929576] Lustre: Skipped 1020 previous similar messages [Thu Dec 12 03:14:08 2019][248227.832679] LustreError: 67772:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576148948, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005c_UUID lock: ffff89134edb7740/0x7066c9c190b63a22 lrc: 3/0,1 mode: --/PW res: [0x1a00000401:0x2e8666:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67772 timeout: 0 lvb_type: 0 [Thu Dec 12 03:14:08 2019][248227.876400] LustreError: 67772:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 2 previous similar messages [Thu Dec 12 03:14:10 2019][248230.218434] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797607 to 0x1900000401:11797633 [Thu Dec 12 03:14:12 2019][248231.810529] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086093 to 0x1900000402:3086113 [Thu Dec 12 03:14:42 2019][248262.396088] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105668 to 0x1a00000402:1105697 [Thu Dec 12 03:14:56 2019][248276.436873] Lustre: fir-OST005a: deleting orphan objects from 0x0:27562339 to 0x0:27562369 [Thu Dec 12 03:14:58 2019][248278.331688] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any 
time (5/-150), not sending early reply [Thu Dec 12 03:14:58 2019][248278.331688] req@ffff88ffa5698850 x1648620021418976/t0(0) o4->4cd581b8-382e-0698-9aa0-2f501c687dc8@10.8.25.10@o2ib6:733/0 lens 520/0 e 0 to 0 dl 1576149303 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 03:14:58 2019][248278.360683] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 2027 previous similar messages [Thu Dec 12 03:15:17 2019][248297.428049] LNet: Service thread pid 112559 was inactive for 1201.25s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 03:15:17 2019][248297.445244] LNet: Skipped 5 previous similar messages [Thu Dec 12 03:15:17 2019][248297.450394] Pid: 112559, comm: ll_ost02_085 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:15:17 2019][248297.461016] Call Trace: [Thu Dec 12 03:15:17 2019][248297.463566] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:15:17 2019][248297.469960] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:15:17 2019][248297.477036] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:15:17 2019][248297.484840] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:15:17 2019][248297.491270] [] kthread+0xd1/0xe0 [Thu Dec 12 03:15:17 2019][248297.496272] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:15:17 2019][248297.502847] [] 0xffffffffffffffff [Thu Dec 12 03:15:17 2019][248297.507965] LustreError: dumping log to /tmp/lustre-log.1576149317.112559 [Thu Dec 12 03:15:23 2019][248302.812188] Lustre: fir-OST005a: Connection restored to f5f4e1fb-09a1-cbb0-925c-f94d3727005b (at 10.9.101.44@o2ib4) [Thu Dec 12 03:15:23 2019][248302.822722] Lustre: Skipped 1035 previous similar messages [Thu Dec 12 03:15:55 2019][248335.509313] Lustre: fir-OST0056: Export ffff8902f7fafc00 already connecting from 10.8.22.24@o2ib6 [Thu Dec 12 03:15:55 2019][248335.518281] Lustre: Skipped 56 previous similar messages [Thu Dec 12 03:16:07 2019][248347.149767] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122340 to 0x1a80000401:1122369 [Thu Dec 12 03:16:35 2019][248375.253580] Pid: 113355, comm: ll_ost02_093 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:16:35 2019][248375.264188] Call Trace: [Thu Dec 12 03:16:35 2019][248375.266749] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 03:16:35 2019][248375.273787] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 03:16:35 2019][248375.281087] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 03:16:35 2019][248375.287733] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 03:16:35 2019][248375.294136] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:16:35 2019][248375.301167] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:16:35 2019][248375.308983] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:16:35 2019][248375.315413] [] kthread+0xd1/0xe0 [Thu Dec 12 03:16:35 2019][248375.320415] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:16:35 2019][248375.326985] [] 0xffffffffffffffff [Thu Dec 12 03:16:35 2019][248375.332093] LustreError: dumping log to /tmp/lustre-log.1576149395.113355 [Thu Dec 12 03:17:24 2019][248424.406594] Pid: 112569, comm: ll_ost03_079 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:17:24 2019][248424.417202] Call Trace: [Thu Dec 12 03:17:24 2019][248424.419763] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 
03:17:24 2019][248424.426803] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 03:17:24 2019][248424.434114] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 03:17:24 2019][248424.440773] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 03:17:24 2019][248424.447175] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:17:24 2019][248424.454232] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:17:24 2019][248424.462048] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:17:24 2019][248424.468463] [] kthread+0xd1/0xe0 [Thu Dec 12 03:17:24 2019][248424.473480] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:17:24 2019][248424.480051] [] 0xffffffffffffffff [Thu Dec 12 03:17:24 2019][248424.485158] LustreError: dumping log to /tmp/lustre-log.1576149444.112569 [Thu Dec 12 03:17:35 2019][248434.962051] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082250 to 0x1a80000402:3082273 [Thu Dec 12 03:17:41 2019][248441.086639] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3070827 to 0x1800000400:3070849 [Thu Dec 12 03:17:53 2019][248453.489843] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124579 to 0x1900000400:1124609 [Thu Dec 12 03:18:01 2019][248460.880000] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117636 to 0x1800000402:1117665 [Thu Dec 12 03:18:09 2019][248469.463444] Pid: 67902, comm: ll_ost02_060 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:18:09 2019][248469.473964] Call Trace: [Thu Dec 12 03:18:09 2019][248469.476521] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 03:18:09 2019][248469.483571] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 03:18:09 2019][248469.490868] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 03:18:09 2019][248469.497532] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 03:18:09 2019][248469.503935] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:18:09 2019][248469.510966] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:18:09 2019][248469.518799] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:18:09 2019][248469.525230] [] kthread+0xd1/0xe0 [Thu Dec 12 03:18:09 2019][248469.530233] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:18:09 2019][248469.536801] [] 0xffffffffffffffff [Thu Dec 12 03:18:09 2019][248469.541911] LustreError: dumping log to /tmp/lustre-log.1576149489.67902 [Thu Dec 12 03:18:12 2019][248472.015269] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089960 to 0x1980000402:3089985 [Thu Dec 12 03:19:11 2019][248530.904667] Pid: 67605, comm: ll_ost01_010 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:19:11 2019][248530.915193] Call Trace: [Thu Dec 12 03:19:11 2019][248530.917771] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:19:11 2019][248530.924168] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:19:11 2019][248530.931247] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:19:11 2019][248530.939047] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:19:11 2019][248530.945477] [] kthread+0xd1/0xe0 [Thu Dec 12 03:19:11 2019][248530.950479] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:19:11 2019][248530.957054] [] 0xffffffffffffffff [Thu Dec 12 03:19:11 2019][248530.962174] LustreError: dumping log to /tmp/lustre-log.1576149551.67605 [Thu Dec 12 03:19:41 2019][248561.120970] Lustre: fir-OST005e: deleting orphan objects from 
0x1a80000400:11800711 to 0x1a80000400:11800737 [Thu Dec 12 03:20:09 2019][248589.691092] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467653 to 0x0:27467681 [Thu Dec 12 03:21:26 2019][248666.075326] Pid: 67914, comm: ll_ost02_062 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:21:26 2019][248666.085842] Call Trace: [Thu Dec 12 03:21:26 2019][248666.088392] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:21:26 2019][248666.094786] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:21:26 2019][248666.101848] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:21:26 2019][248666.109649] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:21:26 2019][248666.116076] [] kthread+0xd1/0xe0 [Thu Dec 12 03:21:26 2019][248666.121080] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:21:26 2019][248666.127664] [] 0xffffffffffffffff [Thu Dec 12 03:21:26 2019][248666.132776] LustreError: dumping log to /tmp/lustre-log.1576149686.67914 [Thu Dec 12 03:21:36 2019][248675.743947] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125990 to 0x1980000400:1126017 [Thu Dec 12 03:22:03 2019][248703.213365] Lustre: fir-OST005c: deleting orphan objects from 0x0:27192644 to 0x0:27192673 [Thu Dec 12 03:22:44 2019][248743.900878] Pid: 66110, comm: ll_ost03_000 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:22:44 2019][248743.911403] Call Trace: [Thu Dec 12 03:22:44 2019][248743.913983] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 03:22:44 2019][248743.920977] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 03:22:44 2019][248743.928161] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 03:22:44 2019][248743.934810] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 03:22:44 2019][248743.941559] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 03:22:44 2019][248743.949084] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 03:22:44 2019][248743.956186] [] tgt_client_data_update+0x303/0x5e0 [ptlrpc] [Thu Dec 12 03:22:44 2019][248743.963497] [] tgt_client_new+0x41b/0x610 [ptlrpc] [Thu Dec 12 03:22:44 2019][248743.970096] [] ofd_obd_connect+0x3a3/0x4c0 [ofd] [Thu Dec 12 03:22:44 2019][248743.976488] [] target_handle_connect+0xecb/0x2b10 [ptlrpc] [Thu Dec 12 03:22:44 2019][248743.983784] [] tgt_request_handle+0x50a/0x1580 [ptlrpc] [Thu Dec 12 03:22:44 2019][248743.990815] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:22:44 2019][248743.998630] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:22:44 2019][248744.005044] [] kthread+0xd1/0xe0 [Thu Dec 12 03:22:44 2019][248744.010073] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:22:44 2019][248744.016632] [] 0xffffffffffffffff [Thu Dec 12 03:22:44 2019][248744.021745] LustreError: dumping log to /tmp/lustre-log.1576149764.66110 [Thu Dec 12 03:22:50 2019][248750.113278] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506947 to 0x0:27506977 [Thu Dec 12 03:23:00 2019][248760.285186] Pid: 67800, comm: ll_ost03_051 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:23:00 2019][248760.295705] Call Trace: [Thu Dec 12 03:23:00 2019][248760.298248] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:23:00 2019][248760.304637] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:23:00 2019][248760.311698] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:23:00 2019][248760.319503] [] ptlrpc_main+0xb2c/0x1460 
[ptlrpc] [Thu Dec 12 03:23:00 2019][248760.325930] [] kthread+0xd1/0xe0 [Thu Dec 12 03:23:00 2019][248760.330924] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:23:00 2019][248760.337492] [] 0xffffffffffffffff [Thu Dec 12 03:23:00 2019][248760.342584] LustreError: dumping log to /tmp/lustre-log.1576149780.67800 [Thu Dec 12 03:23:22 2019][248782.005337] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785547 to 0x1800000401:11785569 [Thu Dec 12 03:23:25 2019][248784.821470] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049132 to 0x1a00000401:3049153 [Thu Dec 12 03:23:49 2019][248809.438173] Pid: 67922, comm: ll_ost02_065 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:23:49 2019][248809.448693] Call Trace: [Thu Dec 12 03:23:49 2019][248809.451246] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 03:23:49 2019][248809.458286] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 03:23:49 2019][248809.465596] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 03:23:49 2019][248809.472248] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 03:23:49 2019][248809.478665] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:23:49 2019][248809.485709] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:23:49 2019][248809.493523] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:23:49 2019][248809.499939] [] kthread+0xd1/0xe0 [Thu Dec 12 03:23:49 2019][248809.504939] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:23:49 2019][248809.511521] [] 0xffffffffffffffff [Thu Dec 12 03:23:49 2019][248809.516644] LustreError: dumping log to /tmp/lustre-log.1576149829.67922 [Thu Dec 12 03:24:05 2019][248825.036042] Lustre: fir-OST005c: Client 6fe05dcf-b9e2-99d7-33ce-acbd0a395824 (at 10.9.117.43@o2ib4) reconnecting [Thu Dec 12 03:24:05 2019][248825.046307] Lustre: Skipped 991 previous similar messages [Thu Dec 12 03:24:43 2019][248862.935916] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105702 to 0x1a00000402:1105729 [Thu Dec 12 03:24:51 2019][248871.200656] Lustre: fir-OST0054: deleting orphan objects from 0x0:27457989 to 0x0:27458017 [Thu Dec 12 03:24:59 2019][248878.857548] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/-150), not sending early reply [Thu Dec 12 03:24:59 2019][248878.857548] req@ffff88e71576d850 x1648846800564208/t0(0) o4->9a91b993-1399-1978-f4a8-fbbdfe7e9dbc@10.9.105.36@o2ib4:579/0 lens 488/0 e 0 to 0 dl 1576149904 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 03:24:59 2019][248878.886624] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1974 previous similar messages [Thu Dec 12 03:25:06 2019][248885.911390] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842313 to 0x1980000401:11842337 [Thu Dec 12 03:25:23 2019][248903.615297] Lustre: fir-OST0058: Connection restored to 60f9d14b-3c44-6ec6-f712-fe240a1f47a0 (at 10.9.104.30@o2ib4) [Thu Dec 12 03:25:23 2019][248903.625821] Lustre: Skipped 1007 previous similar messages [Thu Dec 12 03:26:03 2019][248943.591555] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 03:26:03 2019][248943.600519] Lustre: Skipped 58 previous similar messages [Thu Dec 12 03:26:46 2019][248986.713378] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797643 to 0x1900000401:11797665 [Thu Dec 12 03:26:48 2019][248987.913454] Lustre: fir-OST0058: deleting orphan objects from 
0x1900000402:3086122 to 0x1900000402:3086145 [Thu Dec 12 03:27:32 2019][249032.355816] Lustre: fir-OST005a: deleting orphan objects from 0x0:27562339 to 0x0:27562401 [Thu Dec 12 03:27:36 2019][249036.597929] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082282 to 0x1a80000402:3082305 [Thu Dec 12 03:27:55 2019][249055.203014] LNet: Service thread pid 93055 was inactive for 1201.99s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 03:27:55 2019][249055.220143] LNet: Skipped 8 previous similar messages [Thu Dec 12 03:27:55 2019][249055.225293] Pid: 93055, comm: ll_ost02_067 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:27:55 2019][249055.235854] Call Trace: [Thu Dec 12 03:27:55 2019][249055.238404] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:27:55 2019][249055.244806] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:27:55 2019][249055.251875] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:27:55 2019][249055.259682] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:27:55 2019][249055.266118] [] kthread+0xd1/0xe0 [Thu Dec 12 03:27:55 2019][249055.271122] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:27:55 2019][249055.277693] [] 0xffffffffffffffff [Thu Dec 12 03:27:55 2019][249055.282829] LustreError: dumping log to /tmp/lustre-log.1576150075.93055 [Thu Dec 12 03:28:15 2019][249074.987177] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089992 to 0x1980000402:3090017 [Thu Dec 12 03:28:43 2019][249102.988655] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122372 to 0x1a80000401:1122401 [Thu Dec 12 03:29:09 2019][249128.932472] Pid: 67772, comm: ll_ost03_042 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:29:09 2019][249128.942990] Call Trace: [Thu Dec 12 03:29:09 2019][249128.945548] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 03:29:09 2019][249128.952580] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 03:29:09 2019][249128.959860] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 03:29:09 2019][249128.966509] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 03:29:09 2019][249128.972887] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:29:09 2019][249128.979944] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:29:09 2019][249128.987758] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:29:09 2019][249128.994173] [] kthread+0xd1/0xe0 [Thu Dec 12 03:29:09 2019][249128.999190] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:29:09 2019][249129.005747] [] 0xffffffffffffffff [Thu Dec 12 03:29:09 2019][249129.010874] LustreError: dumping log to /tmp/lustre-log.1576150149.67772 [Thu Dec 12 03:29:13 2019][249133.028554] Pid: 67838, comm: ll_ost01_061 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:29:13 2019][249133.039079] Call Trace: [Thu Dec 12 03:29:13 2019][249133.041642] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:29:13 2019][249133.048038] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:29:13 2019][249133.055129] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:29:13 2019][249133.062928] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:29:13 2019][249133.069357] [] kthread+0xd1/0xe0 [Thu Dec 12 03:29:13 2019][249133.074360] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:29:13 
2019][249133.080935] [] 0xffffffffffffffff [Thu Dec 12 03:29:13 2019][249133.086053] LustreError: dumping log to /tmp/lustre-log.1576150153.67838 [Thu Dec 12 03:30:17 2019][249196.957637] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3070853 to 0x1800000400:3070881 [Thu Dec 12 03:30:30 2019][249209.766794] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124617 to 0x1900000400:1124641 [Thu Dec 12 03:30:37 2019][249216.814894] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117670 to 0x1800000402:1117697 [Thu Dec 12 03:31:36 2019][249276.235748] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126019 to 0x1980000400:1126049 [Thu Dec 12 03:32:09 2019][249309.160033] Pid: 67809, comm: ll_ost02_046 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:32:09 2019][249309.170550] Call Trace: [Thu Dec 12 03:32:09 2019][249309.173103] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 03:32:09 2019][249309.180155] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 03:32:09 2019][249309.187457] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 03:32:09 2019][249309.194124] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 03:32:09 2019][249309.200533] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:32:09 2019][249309.207588] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:32:09 2019][249309.215418] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:32:09 2019][249309.221846] [] kthread+0xd1/0xe0 [Thu Dec 12 03:32:09 2019][249309.226849] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:32:09 2019][249309.233425] [] 0xffffffffffffffff [Thu Dec 12 03:32:09 2019][249309.238534] LustreError: dumping log to /tmp/lustre-log.1576150329.67809 [Thu Dec 12 03:32:18 2019][249317.823933] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800747 to 0x1a80000400:11800769 [Thu Dec 12 03:32:45 2019][249344.866047] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467684 to 0x0:27467713 [Thu Dec 12 03:33:58 2019][249418.734208] LustreError: 113350:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576150138, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005e_UUID lock: ffff891b11266e40/0x7066c9c190b6827c lrc: 3/0,1 mode: --/PW res: [0x1a31f84:0x0:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 113350 timeout: 0 lvb_type: 0 [Thu Dec 12 03:33:58 2019][249418.777406] LustreError: 113350:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 3 previous similar messages [Thu Dec 12 03:34:04 2019][249423.850298] Pid: 66392, comm: ll_ost02_004 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:34:04 2019][249423.860821] Call Trace: [Thu Dec 12 03:34:04 2019][249423.863372] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:34:04 2019][249423.869762] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:34:04 2019][249423.876823] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:34:04 2019][249423.884625] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:34:04 2019][249423.891056] [] kthread+0xd1/0xe0 [Thu Dec 12 03:34:04 2019][249423.896057] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:34:04 2019][249423.902626] [] 0xffffffffffffffff [Thu Dec 12 03:34:04 2019][249423.907734] LustreError: dumping log to 
/tmp/lustre-log.1576150444.66392 [Thu Dec 12 03:34:06 2019][249426.031367] Lustre: fir-OST005c: Client 6fe05dcf-b9e2-99d7-33ce-acbd0a395824 (at 10.9.117.43@o2ib4) reconnecting [Thu Dec 12 03:34:06 2019][249426.041627] Lustre: Skipped 1104 previous similar messages [Thu Dec 12 03:34:39 2019][249459.044275] Lustre: fir-OST005c: deleting orphan objects from 0x0:27192644 to 0x0:27192705 [Thu Dec 12 03:34:45 2019][249464.963860] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105733 to 0x1a00000402:1105761 [Thu Dec 12 03:35:00 2019][249479.839421] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 03:35:00 2019][249479.839421] req@ffff88fb8d1ee050 x1649407019631840/t0(0) o4->022acf30-b33d-ab48-4fa4-ec70c96ae93e@10.9.114.2@o2ib4:425/0 lens 840/0 e 0 to 0 dl 1576150505 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 03:35:00 2019][249479.868169] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 2153 previous similar messages [Thu Dec 12 03:35:25 2019][249504.860836] Lustre: fir-OST005a: Connection restored to f5f4e1fb-09a1-cbb0-925c-f94d3727005b (at 10.9.101.44@o2ib4) [Thu Dec 12 03:35:25 2019][249504.871363] Lustre: Skipped 1080 previous similar messages [Thu Dec 12 03:35:26 2019][249506.141235] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506982 to 0x0:27507009 [Thu Dec 12 03:35:38 2019][249518.060183] Pid: 67606, comm: ll_ost03_011 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:35:38 2019][249518.070700] Call Trace: [Thu Dec 12 03:35:38 2019][249518.073252] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:35:38 2019][249518.079651] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:35:38 2019][249518.086712] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:35:38 2019][249518.094514] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:35:38 2019][249518.100945] [] kthread+0xd1/0xe0 [Thu Dec 12 03:35:38 2019][249518.105948] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:35:38 2019][249518.112522] [] 0xffffffffffffffff [Thu Dec 12 03:35:38 2019][249518.117649] LustreError: dumping log to /tmp/lustre-log.1576150538.67606 [Thu Dec 12 03:35:59 2019][249539.188283] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785582 to 0x1800000401:11785601 [Thu Dec 12 03:36:01 2019][249540.820447] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049157 to 0x1a00000401:3049185 [Thu Dec 12 03:36:19 2019][249558.976305] Lustre: fir-OST0056: Export ffff891b8f069c00 already connecting from 10.8.22.14@o2ib6 [Thu Dec 12 03:36:19 2019][249558.985275] Lustre: Skipped 55 previous similar messages [Thu Dec 12 03:37:28 2019][249627.983628] Lustre: fir-OST0054: deleting orphan objects from 0x0:27457989 to 0x0:27458049 [Thu Dec 12 03:37:37 2019][249637.649850] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082310 to 0x1a80000402:3082337 [Thu Dec 12 03:37:42 2019][249642.550332] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842359 to 0x1980000401:11842401 [Thu Dec 12 03:38:16 2019][249676.375086] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090019 to 0x1980000402:3090049 [Thu Dec 12 03:39:07 2019][249726.960325] LNet: Service thread pid 67839 was inactive for 1203.49s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 03:39:07 2019][249726.977429] LNet: Skipped 5 previous similar messages [Thu Dec 12 03:39:07 2019][249726.982578] Pid: 67839, comm: ll_ost03_054 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:39:07 2019][249726.993113] Call Trace: [Thu Dec 12 03:39:07 2019][249726.995663] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 03:39:07 2019][249727.002689] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 03:39:07 2019][249727.009977] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 03:39:07 2019][249727.016625] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 03:39:07 2019][249727.023028] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:39:07 2019][249727.030073] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:39:07 2019][249727.037890] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:39:07 2019][249727.044308] [] kthread+0xd1/0xe0 [Thu Dec 12 03:39:07 2019][249727.049321] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:39:07 2019][249727.055886] [] 0xffffffffffffffff [Thu Dec 12 03:39:07 2019][249727.061008] LustreError: dumping log to /tmp/lustre-log.1576150747.67839 [Thu Dec 12 03:39:07 2019][249727.069834] Pid: 112534, comm: ll_ost03_070 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:39:07 2019][249727.080475] Call Trace: [Thu Dec 12 03:39:07 2019][249727.083020] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 03:39:07 2019][249727.090035] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 03:39:07 2019][249727.097333] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 03:39:07 2019][249727.103979] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 03:39:07 2019][249727.110380] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:39:07 2019][249727.117403] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:39:07 2019][249727.125217] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:39:07 2019][249727.131635] [] kthread+0xd1/0xe0 [Thu Dec 12 03:39:07 2019][249727.136640] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:39:07 2019][249727.143197] [] 0xffffffffffffffff [Thu Dec 12 03:39:22 2019][249741.928311] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797679 to 0x1900000401:11797697 [Thu Dec 12 03:39:24 2019][249743.944442] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086157 to 0x1900000402:3086177 [Thu Dec 12 03:40:09 2019][249789.250788] Lustre: fir-OST005a: deleting orphan objects from 0x0:27562339 to 0x0:27562433 [Thu Dec 12 03:40:20 2019][249800.689767] Pid: 67850, comm: ll_ost02_054 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:40:20 2019][249800.700284] Call Trace: [Thu Dec 12 03:40:20 2019][249800.702860] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 03:40:20 2019][249800.709866] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 03:40:20 2019][249800.717057] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 03:40:20 2019][249800.723713] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 03:40:20 2019][249800.730469] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 03:40:20 2019][249800.737994] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 03:40:20 2019][249800.745102] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 03:40:20 2019][249800.751325] [] ofd_destroy+0x5d0/0x960 [ofd] [Thu Dec 12 03:40:20 2019][249800.757387] 
[] ofd_destroy_by_fid+0x1f4/0x4a0 [ofd] [Thu Dec 12 03:40:20 2019][249800.764053] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 03:40:20 2019][249800.770459] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:40:20 2019][249800.777506] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:40:21 2019][249800.785331] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:40:21 2019][249800.791747] [] kthread+0xd1/0xe0 [Thu Dec 12 03:40:21 2019][249800.796771] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:40:21 2019][249800.803333] [] 0xffffffffffffffff [Thu Dec 12 03:40:21 2019][249800.808444] LustreError: dumping log to /tmp/lustre-log.1576150821.67850 [Thu Dec 12 03:40:33 2019][249812.978007] Pid: 67926, comm: ll_ost02_066 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:40:33 2019][249812.988523] Call Trace: [Thu Dec 12 03:40:33 2019][249812.991076] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:40:33 2019][249812.997466] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:40:33 2019][249813.004533] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:40:33 2019][249813.012339] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:40:33 2019][249813.018768] [] kthread+0xd1/0xe0 [Thu Dec 12 03:40:33 2019][249813.023770] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:40:33 2019][249813.030345] [] 0xffffffffffffffff [Thu Dec 12 03:40:33 2019][249813.035457] LustreError: dumping log to /tmp/lustre-log.1576150833.67926 [Thu Dec 12 03:41:19 2019][249859.339646] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122372 to 0x1a80000401:1122433 [Thu Dec 12 03:41:37 2019][249877.023672] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126052 to 0x1980000400:1126081 [Thu Dec 12 03:41:51 2019][249890.803544] Pid: 67798, comm: ll_ost01_054 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:41:51 2019][249890.814064] Call Trace: [Thu Dec 12 03:41:51 2019][249890.816614] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:41:51 2019][249890.823015] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:41:51 2019][249890.830078] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:41:51 2019][249890.837879] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:41:51 2019][249890.844309] [] kthread+0xd1/0xe0 [Thu Dec 12 03:41:51 2019][249890.849312] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:41:51 2019][249890.855887] [] 0xffffffffffffffff [Thu Dec 12 03:41:51 2019][249890.860995] LustreError: dumping log to /tmp/lustre-log.1576150911.67798 [Thu Dec 12 03:42:53 2019][249952.876534] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3070892 to 0x1800000400:3070913 [Thu Dec 12 03:43:05 2019][249964.941770] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124645 to 0x1900000400:1124673 [Thu Dec 12 03:43:13 2019][249973.325887] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117700 to 0x1800000402:1117729 [Thu Dec 12 03:44:07 2019][250026.824362] Lustre: fir-OST005a: Client 8ea9d9cd-8086-16f1-7cea-3b482c5d9f4c (at 10.9.108.20@o2ib4) reconnecting [Thu Dec 12 03:44:07 2019][250026.834629] Lustre: Skipped 1127 previous similar messages [Thu Dec 12 03:44:46 2019][250065.919743] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105733 to 0x1a00000402:1105793 [Thu Dec 12 03:44:54 2019][250073.806836] Lustre: fir-OST005e: deleting orphan objects from 
0x1a80000400:11800781 to 0x1a80000400:11800801 [Thu Dec 12 03:45:01 2019][250080.871297] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/-150), not sending early reply [Thu Dec 12 03:45:01 2019][250080.871297] req@ffff8910cd0c1850 x1648858225853520/t0(0) o4->8e4fe161-7440-1bc3-60cf-ef16452a7501@10.9.105.43@o2ib4:271/0 lens 512/0 e 0 to 0 dl 1576151106 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 03:45:01 2019][250080.900372] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 2364 previous similar messages [Thu Dec 12 03:45:21 2019][250100.817003] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467716 to 0x0:27467745 [Thu Dec 12 03:45:26 2019][250105.856716] Lustre: fir-OST005a: Connection restored to f5f4e1fb-09a1-cbb0-925c-f94d3727005b (at 10.9.101.44@o2ib4) [Thu Dec 12 03:45:26 2019][250105.867249] Lustre: Skipped 1149 previous similar messages [Thu Dec 12 03:46:19 2019][250159.733920] Lustre: fir-OST0056: Export ffff891d99820800 already connecting from 10.8.23.26@o2ib6 [Thu Dec 12 03:46:19 2019][250159.742895] Lustre: Skipped 52 previous similar messages [Thu Dec 12 03:46:37 2019][250177.529194] Pid: 67659, comm: ll_ost02_024 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:46:37 2019][250177.539718] Call Trace: [Thu Dec 12 03:46:37 2019][250177.542276] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:46:37 2019][250177.548677] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:46:37 2019][250177.555747] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:46:37 2019][250177.563550] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:46:37 2019][250177.569977] [] kthread+0xd1/0xe0 [Thu Dec 12 03:46:37 2019][250177.574981] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:46:37 2019][250177.581556] [] 0xffffffffffffffff [Thu Dec 12 03:46:37 2019][250177.586668] LustreError: dumping log to /tmp/lustre-log.1576151197.67659 [Thu Dec 12 03:47:15 2019][250215.363202] Lustre: fir-OST005c: deleting orphan objects from 0x0:27192708 to 0x0:27192737 [Thu Dec 12 03:47:38 2019][250238.581657] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082349 to 0x1a80000402:3082369 [Thu Dec 12 03:48:02 2019][250262.364153] Lustre: fir-OST0058: deleting orphan objects from 0x0:27507013 to 0x0:27507105 [Thu Dec 12 03:48:16 2019][250275.835129] Pid: 112555, comm: ll_ost03_077 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:48:16 2019][250275.845734] Call Trace: [Thu Dec 12 03:48:16 2019][250275.848288] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:48:16 2019][250275.854687] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:48:16 2019][250275.861747] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:48:16 2019][250275.869552] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:48:16 2019][250275.875977] [] kthread+0xd1/0xe0 [Thu Dec 12 03:48:16 2019][250275.880982] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:48:16 2019][250275.887559] [] 0xffffffffffffffff [Thu Dec 12 03:48:16 2019][250275.892682] LustreError: dumping log to /tmp/lustre-log.1576151296.112555 [Thu Dec 12 03:48:18 2019][250277.802927] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090058 to 0x1980000402:3090081 [Thu Dec 12 03:48:35 2019][250295.675238] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785609 to 0x1800000401:11785633 [Thu Dec 12 03:48:37 2019][250296.923336] Lustre: 
fir-OST005c: deleting orphan objects from 0x1a00000401:3049197 to 0x1a00000401:3049217 [Thu Dec 12 03:49:01 2019][250320.892016] Pid: 113350, comm: ll_ost02_088 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:49:01 2019][250320.902620] Call Trace: [Thu Dec 12 03:49:01 2019][250320.905175] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 03:49:01 2019][250320.912218] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 03:49:01 2019][250320.919515] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 03:49:01 2019][250320.926164] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 03:49:01 2019][250320.932567] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:49:01 2019][250320.939605] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:49:01 2019][250320.947421] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:49:01 2019][250320.953839] [] kthread+0xd1/0xe0 [Thu Dec 12 03:49:01 2019][250320.958836] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:49:01 2019][250320.965413] [] 0xffffffffffffffff [Thu Dec 12 03:49:01 2019][250320.970531] LustreError: dumping log to /tmp/lustre-log.1576151341.113350 [Thu Dec 12 03:50:04 2019][250383.894535] Lustre: fir-OST0054: deleting orphan objects from 0x0:27458051 to 0x0:27458081 [Thu Dec 12 03:50:18 2019][250397.949266] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842407 to 0x1980000401:11842433 [Thu Dec 12 03:50:46 2019][250425.790104] LustreError: 112507:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576151145, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005e_UUID lock: ffff89135ff21440/0x7066c9c190b6bfae lrc: 3/0,1 mode: --/PW res: [0x1a80000402:0x2f0864:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112507 timeout: 0 lvb_type: 0 [Thu Dec 12 03:50:46 2019][250425.833993] LustreError: 112507:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 4 previous similar messages [Thu Dec 12 03:51:32 2019][250472.447025] LNet: Service thread pid 67864 was inactive for 1201.95s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 03:51:32 2019][250472.464135] LNet: Skipped 7 previous similar messages [Thu Dec 12 03:51:32 2019][250472.469283] Pid: 67864, comm: ll_ost02_056 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:51:32 2019][250472.479822] Call Trace: [Thu Dec 12 03:51:32 2019][250472.482379] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 03:51:32 2019][250472.489423] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 03:51:32 2019][250472.496717] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 03:51:32 2019][250472.503374] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 03:51:32 2019][250472.509777] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:51:32 2019][250472.516809] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:51:32 2019][250472.524622] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:51:32 2019][250472.531039] [] kthread+0xd1/0xe0 [Thu Dec 12 03:51:32 2019][250472.536069] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:51:32 2019][250472.542637] [] 0xffffffffffffffff [Thu Dec 12 03:51:32 2019][250472.547766] LustreError: dumping log to /tmp/lustre-log.1576151492.67864 [Thu Dec 12 03:51:38 2019][250478.307497] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126085 to 0x1980000400:1126113 [Thu Dec 12 03:51:53 2019][250492.927428] Pid: 67737, comm: ll_ost01_043 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:51:53 2019][250492.937948] Call Trace: [Thu Dec 12 03:51:53 2019][250492.940508] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:51:53 2019][250492.946897] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:51:53 2019][250492.953961] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:51:53 2019][250492.961761] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:51:53 2019][250492.968193] [] kthread+0xd1/0xe0 [Thu Dec 12 03:51:53 2019][250492.973194] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:51:53 2019][250492.979767] [] 0xffffffffffffffff [Thu Dec 12 03:51:53 2019][250492.984878] LustreError: dumping log to /tmp/lustre-log.1576151513.67737 [Thu Dec 12 03:51:58 2019][250498.495240] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797702 to 0x1900000401:11797729 [Thu Dec 12 03:52:00 2019][250499.855320] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086185 to 0x1900000402:3086209 [Thu Dec 12 03:52:25 2019][250525.696079] Pid: 67717, comm: ll_ost03_035 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:52:25 2019][250525.706597] Call Trace: [Thu Dec 12 03:52:25 2019][250525.709150] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 03:52:25 2019][250525.716189] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 03:52:25 2019][250525.723486] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 03:52:25 2019][250525.730149] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 03:52:25 2019][250525.736553] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:52:25 2019][250525.743595] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:52:25 2019][250525.751410] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:52:25 2019][250525.757826] [] kthread+0xd1/0xe0 [Thu Dec 12 03:52:25 2019][250525.762826] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:52:26 2019][250525.769403] [] 0xffffffffffffffff [Thu Dec 12 03:52:26 2019][250525.774511] 
LustreError: dumping log to /tmp/lustre-log.1576151545.67717 [Thu Dec 12 03:52:45 2019][250545.097706] Lustre: fir-OST005a: deleting orphan objects from 0x0:27562438 to 0x0:27562465 [Thu Dec 12 03:53:10 2019][250570.752967] Pid: 112498, comm: ll_ost02_069 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:53:10 2019][250570.763571] Call Trace: [Thu Dec 12 03:53:10 2019][250570.766122] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:53:10 2019][250570.772525] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:53:10 2019][250570.779587] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:53:10 2019][250570.787389] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:53:10 2019][250570.793816] [] kthread+0xd1/0xe0 [Thu Dec 12 03:53:11 2019][250570.798819] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:53:11 2019][250570.805396] [] 0xffffffffffffffff [Thu Dec 12 03:53:11 2019][250570.810529] LustreError: dumping log to /tmp/lustre-log.1576151590.112498 [Thu Dec 12 03:53:31 2019][250591.233377] Pid: 67665, comm: ll_ost02_026 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:53:31 2019][250591.243897] Call Trace: [Thu Dec 12 03:53:31 2019][250591.246455] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 03:53:31 2019][250591.253502] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 03:53:31 2019][250591.260784] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 03:53:31 2019][250591.267446] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 03:53:31 2019][250591.273837] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:53:31 2019][250591.280890] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:53:31 2019][250591.288692] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:53:31 2019][250591.295133] [] kthread+0xd1/0xe0 [Thu Dec 12 03:53:31 2019][250591.300134] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:53:31 2019][250591.306706] [] 0xffffffffffffffff [Thu Dec 12 03:53:31 2019][250591.311817] LustreError: dumping log to /tmp/lustre-log.1576151611.67665 [Thu Dec 12 03:53:55 2019][250615.482517] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122436 to 0x1a80000401:1122465 [Thu Dec 12 03:54:08 2019][250628.269594] Lustre: fir-OST0058: Client ce5ee768-37d0-d480-6e14-e3a25f5ac36c (at 10.9.117.30@o2ib4) reconnecting [Thu Dec 12 03:54:08 2019][250628.279869] Lustre: Skipped 1173 previous similar messages [Thu Dec 12 03:54:47 2019][250667.155551] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105797 to 0x1a00000402:1105825 [Thu Dec 12 03:55:02 2019][250682.691180] Lustre: 27223:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 03:55:02 2019][250682.691180] req@ffff8921db043050 x1652298167675392/t0(0) o4->dc9dbc94-e0be-4@10.9.113.14@o2ib4:117/0 lens 488/0 e 1 to 0 dl 1576151707 ref 2 fl New:/0/ffffffff rc 0/-1 [Thu Dec 12 03:55:02 2019][250682.718170] Lustre: 27223:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 2407 previous similar messages [Thu Dec 12 03:55:27 2019][250707.061657] Lustre: fir-OST005a: Connection restored to f5f4e1fb-09a1-cbb0-925c-f94d3727005b (at 10.9.101.44@o2ib4) [Thu Dec 12 03:55:27 2019][250707.072178] Lustre: Skipped 1178 previous similar messages [Thu Dec 12 03:55:29 2019][250709.547455] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3070921 to 0x1800000400:3070945 [Thu Dec 12 03:55:41 
2019][250721.236613] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124676 to 0x1900000400:1124705 [Thu Dec 12 03:55:49 2019][250728.860747] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117734 to 0x1800000402:1117761 [Thu Dec 12 03:56:29 2019][250769.093211] Lustre: fir-OST0056: Export ffff8902f7fafc00 already connecting from 10.8.22.24@o2ib6 [Thu Dec 12 03:56:29 2019][250769.102174] Lustre: Skipped 61 previous similar messages [Thu Dec 12 03:57:28 2019][250828.806053] Pid: 66109, comm: ll_ost02_002 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:57:29 2019][250828.816569] Call Trace: [Thu Dec 12 03:57:29 2019][250828.819127] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 03:57:29 2019][250828.826159] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 03:57:29 2019][250828.833448] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 03:57:29 2019][250828.840109] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 03:57:29 2019][250828.846518] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:57:29 2019][250828.853557] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:57:29 2019][250828.861371] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:57:29 2019][250828.867787] [] kthread+0xd1/0xe0 [Thu Dec 12 03:57:29 2019][250828.872789] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:57:29 2019][250828.879382] [] 0xffffffffffffffff [Thu Dec 12 03:57:29 2019][250828.884492] LustreError: dumping log to /tmp/lustre-log.1576151849.66109 [Thu Dec 12 03:57:30 2019][250830.789793] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800805 to 0x1a80000400:11800833 [Thu Dec 12 03:57:39 2019][250839.617532] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082371 to 0x1a80000402:3082401 [Thu Dec 12 03:57:45 2019][250845.190372] Pid: 67816, comm: ll_ost03_053 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:57:45 2019][250845.200886] Call Trace: [Thu Dec 12 03:57:45 2019][250845.203456] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 03:57:45 2019][250845.210454] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 03:57:45 2019][250845.217637] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 03:57:45 2019][250845.224288] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 03:57:45 2019][250845.231036] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 03:57:45 2019][250845.238561] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 03:57:45 2019][250845.245656] [] dqget+0x3fa/0x450 [Thu Dec 12 03:57:45 2019][250845.250659] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 03:57:45 2019][250845.256446] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 03:57:45 2019][250845.264060] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 03:57:45 2019][250845.270545] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 03:57:45 2019][250845.276677] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:57:45 2019][250845.283738] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:57:45 2019][250845.291543] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:57:45 2019][250845.297983] [] kthread+0xd1/0xe0 [Thu Dec 12 03:57:45 2019][250845.302982] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:57:45 2019][250845.309556] [] 0xffffffffffffffff [Thu Dec 12 03:57:45 2019][250845.314659] LustreError: dumping log to /tmp/lustre-log.1576151865.67816 [Thu Dec 12 03:57:57 
2019][250856.903873] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467748 to 0x0:27467777 [Thu Dec 12 03:58:18 2019][250878.318764] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090085 to 0x1980000402:3090113 [Thu Dec 12 03:58:38 2019][250898.439433] Pid: 67624, comm: ll_ost01_016 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:58:38 2019][250898.449956] Call Trace: [Thu Dec 12 03:58:38 2019][250898.452524] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 03:58:38 2019][250898.459523] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 03:58:38 2019][250898.466708] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 03:58:38 2019][250898.473371] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 03:58:38 2019][250898.480122] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 03:58:38 2019][250898.487646] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 03:58:38 2019][250898.494741] [] dqget+0x3fa/0x450 [Thu Dec 12 03:58:38 2019][250898.499745] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 03:58:38 2019][250898.505519] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 03:58:38 2019][250898.513143] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 03:58:38 2019][250898.519618] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 03:58:38 2019][250898.525760] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:58:38 2019][250898.532809] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:58:38 2019][250898.540640] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:58:38 2019][250898.547059] [] kthread+0xd1/0xe0 [Thu Dec 12 03:58:38 2019][250898.552072] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:58:38 2019][250898.558637] [] 0xffffffffffffffff [Thu Dec 12 03:58:38 2019][250898.563751] LustreError: dumping log to /tmp/lustre-log.1576151918.67624 [Thu Dec 12 03:59:15 2019][250935.304148] Pid: 67616, comm: ll_ost02_013 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 03:59:15 2019][250935.314666] Call Trace: [Thu Dec 12 03:59:15 2019][250935.317219] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 03:59:15 2019][250935.323620] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 03:59:15 2019][250935.330680] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 03:59:15 2019][250935.338481] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 03:59:15 2019][250935.344911] [] kthread+0xd1/0xe0 [Thu Dec 12 03:59:15 2019][250935.349915] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 03:59:15 2019][250935.356490] [] 0xffffffffffffffff [Thu Dec 12 03:59:15 2019][250935.361599] LustreError: dumping log to /tmp/lustre-log.1576151955.67616 [Thu Dec 12 03:59:51 2019][250971.263225] Lustre: fir-OST005c: deleting orphan objects from 0x0:27192708 to 0x0:27192769 [Thu Dec 12 04:00:38 2019][251018.571081] Lustre: fir-OST0058: deleting orphan objects from 0x0:27507110 to 0x0:27507137 [Thu Dec 12 04:00:49 2019][251029.513992] Pid: 67610, comm: ll_ost03_012 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:00:49 2019][251029.524510] Call Trace: [Thu Dec 12 04:00:49 2019][251029.527061] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 04:00:49 2019][251029.533478] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:00:49 2019][251029.540541] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:00:49 2019][251029.548358] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 
04:00:49 2019][251029.554789] [] kthread+0xd1/0xe0 [Thu Dec 12 04:00:49 2019][251029.559810] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:00:49 2019][251029.566383] [] 0xffffffffffffffff [Thu Dec 12 04:00:49 2019][251029.571497] LustreError: dumping log to /tmp/lustre-log.1576152049.67610 [Thu Dec 12 04:01:11 2019][251050.898116] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785637 to 0x1800000401:11785665 [Thu Dec 12 04:01:13 2019][251052.994241] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049223 to 0x1a00000401:3049249 [Thu Dec 12 04:01:39 2019][251079.279393] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126116 to 0x1980000400:1126145 [Thu Dec 12 04:01:55 2019][251095.051283] LNet: Service thread pid 67656 was inactive for 1202.85s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 04:01:55 2019][251095.064318] LustreError: dumping log to /tmp/lustre-log.1576152115.67656 [Thu Dec 12 04:02:40 2019][251139.909431] Lustre: fir-OST0054: deleting orphan objects from 0x0:27458083 to 0x0:27458113 [Thu Dec 12 04:02:54 2019][251153.836208] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842439 to 0x1980000401:11842465 [Thu Dec 12 04:04:08 2019][251228.308143] Lustre: fir-OST005c: Client ff8445d1-f99d-03b2-7c66-3abfa27fa6d1 (at 10.8.27.23@o2ib6) reconnecting [Thu Dec 12 04:04:08 2019][251228.318323] Lustre: Skipped 1166 previous similar messages [Thu Dec 12 04:04:34 2019][251254.758148] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797736 to 0x1900000401:11797761 [Thu Dec 12 04:04:36 2019][251255.910215] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086214 to 0x1900000402:3086241 [Thu Dec 12 04:04:48 2019][251267.943423] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105828 to 0x1a00000402:1105857 [Thu Dec 12 04:05:04 2019][251283.815029] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 04:05:04 2019][251283.815029] req@ffff88e3f7ce9850 x1649407019631840/t0(0) o4->022acf30-b33d-ab48-4fa4-ec70c96ae93e@10.9.114.2@o2ib4:718/0 lens 840/0 e 0 to 0 dl 1576152308 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 04:05:04 2019][251283.843752] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 2557 previous similar messages [Thu Dec 12 04:05:21 2019][251301.248618] Lustre: fir-OST005a: deleting orphan objects from 0x0:27562438 to 0x0:27562497 [Thu Dec 12 04:05:28 2019][251308.051385] Lustre: fir-OST005a: Connection restored to f5f4e1fb-09a1-cbb0-925c-f94d3727005b (at 10.9.101.44@o2ib4) [Thu Dec 12 04:05:28 2019][251308.061923] Lustre: Skipped 1190 previous similar messages [Thu Dec 12 04:05:43 2019][251323.407810] LNet: Service thread pid 67774 was inactive for 200.21s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 04:05:43 2019][251323.424831] LNet: Skipped 9 previous similar messages [Thu Dec 12 04:05:43 2019][251323.429982] Pid: 67774, comm: ll_ost03_043 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:05:43 2019][251323.440516] Call Trace: [Thu Dec 12 04:05:43 2019][251323.443067] [] ldlm_completion_ast+0x430/0x860 [ptlrpc] [Thu Dec 12 04:05:43 2019][251323.450107] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 04:05:43 2019][251323.457404] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 04:05:43 2019][251323.464052] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 04:05:43 2019][251323.470457] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:05:43 2019][251323.477496] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:05:43 2019][251323.485319] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:05:43 2019][251323.491734] [] kthread+0xd1/0xe0 [Thu Dec 12 04:05:43 2019][251323.496737] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:05:43 2019][251323.503327] [] 0xffffffffffffffff [Thu Dec 12 04:05:43 2019][251323.508447] LustreError: dumping log to /tmp/lustre-log.1576152343.67774 [Thu Dec 12 04:05:44 2019][251324.431822] Pid: 30840, comm: ll_ost02_098 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:05:44 2019][251324.442336] Call Trace: [Thu Dec 12 04:05:44 2019][251324.444890] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 04:05:44 2019][251324.451289] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:05:44 2019][251324.458349] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:05:44 2019][251324.466153] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:05:44 2019][251324.472580] [] kthread+0xd1/0xe0 [Thu Dec 12 04:05:44 2019][251324.477586] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:05:44 2019][251324.484174] [] 0xffffffffffffffff [Thu Dec 12 04:05:44 2019][251324.489280] LustreError: dumping log to /tmp/lustre-log.1576152344.30840 [Thu Dec 12 04:05:48 2019][251328.527913] Pid: 112507, comm: ll_ost03_066 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:05:48 2019][251328.538522] Call Trace: [Thu Dec 12 04:05:48 2019][251328.541071] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 04:05:48 2019][251328.548106] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 04:05:48 2019][251328.555402] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 04:05:48 2019][251328.562051] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 04:05:48 2019][251328.568453] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:05:48 2019][251328.575484] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:05:48 2019][251328.583314] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:05:48 2019][251328.589733] [] kthread+0xd1/0xe0 [Thu Dec 12 04:05:48 2019][251328.594734] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:05:48 2019][251328.601303] [] 0xffffffffffffffff [Thu Dec 12 04:05:48 2019][251328.606402] LustreError: dumping log to /tmp/lustre-log.1576152348.112507 [Thu Dec 12 04:06:31 2019][251371.217025] Lustre: fir-OST0056: Export ffff8902f7fafc00 already connecting from 10.8.22.24@o2ib6 [Thu Dec 12 04:06:31 2019][251371.225989] Lustre: Skipped 66 previous similar messages [Thu Dec 12 04:06:31 2019][251371.545464] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122470 to 0x1a80000401:1122497 [Thu 
Dec 12 04:07:23 2019][251423.194794] LustreError: 67774:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576152143, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0054_UUID lock: ffff891ba507bf00/0x7066c9c190b6f817 lrc: 3/0,1 mode: --/PW res: [0x1800000400:0x2edbc5:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67774 timeout: 0 lvb_type: 0 [Thu Dec 12 04:07:23 2019][251423.238515] LustreError: 67774:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 4 previous similar messages [Thu Dec 12 04:07:40 2019][251440.533389] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082408 to 0x1a80000402:3082433 [Thu Dec 12 04:08:05 2019][251464.954358] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3070952 to 0x1800000400:3070977 [Thu Dec 12 04:08:17 2019][251477.443561] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124710 to 0x1900000400:1124737 [Thu Dec 12 04:08:19 2019][251479.402596] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090117 to 0x1980000402:3090145 [Thu Dec 12 04:08:25 2019][251485.083690] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117765 to 0x1800000402:1117793 [Thu Dec 12 04:08:40 2019][251500.563303] Pid: 30854, comm: ll_ost02_099 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:08:40 2019][251500.573824] Call Trace: [Thu Dec 12 04:08:40 2019][251500.576375] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 04:08:40 2019][251500.583408] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 04:08:40 2019][251500.590706] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 04:08:40 2019][251500.597370] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 04:08:40 2019][251500.603773] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:08:40 2019][251500.610812] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:08:40 2019][251500.618627] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:08:40 2019][251500.625044] [] kthread+0xd1/0xe0 [Thu Dec 12 04:08:40 2019][251500.630043] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:08:40 2019][251500.636614] [] 0xffffffffffffffff [Thu Dec 12 04:08:40 2019][251500.641722] LustreError: dumping log to /tmp/lustre-log.1576152520.30854 [Thu Dec 12 04:10:06 2019][251586.140636] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800839 to 0x1a80000400:11800865 [Thu Dec 12 04:10:33 2019][251612.846821] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467748 to 0x0:27467809 [Thu Dec 12 04:11:40 2019][251680.690394] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126150 to 0x1980000400:1126177 [Thu Dec 12 04:11:53 2019][251693.079093] Pid: 112510, comm: ll_ost02_073 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:11:53 2019][251693.089692] Call Trace: [Thu Dec 12 04:11:53 2019][251693.092245] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 04:11:53 2019][251693.098648] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:11:53 2019][251693.105721] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:11:53 2019][251693.113527] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:11:53 2019][251693.119954] [] kthread+0xd1/0xe0 [Thu Dec 12 04:11:53 2019][251693.124958] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu 
Dec 12 04:11:53 2019][251693.131533] [] 0xffffffffffffffff [Thu Dec 12 04:11:53 2019][251693.136643] LustreError: dumping log to /tmp/lustre-log.1576152713.112510 [Thu Dec 12 04:11:57 2019][251697.175176] Pid: 67643, comm: ll_ost01_021 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:11:57 2019][251697.185697] Call Trace: [Thu Dec 12 04:11:57 2019][251697.188250] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 04:11:57 2019][251697.194649] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:11:57 2019][251697.201725] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:11:57 2019][251697.209530] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:11:57 2019][251697.215957] [] kthread+0xd1/0xe0 [Thu Dec 12 04:11:57 2019][251697.220960] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:11:57 2019][251697.227537] [] 0xffffffffffffffff [Thu Dec 12 04:11:57 2019][251697.232646] LustreError: dumping log to /tmp/lustre-log.1576152717.67643 [Thu Dec 12 04:12:27 2019][251727.713043] Lustre: fir-OST005c: deleting orphan objects from 0x0:27192771 to 0x0:27192801 [Thu Dec 12 04:12:30 2019][251729.943809] Pid: 67801, comm: ll_ost01_055 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:12:30 2019][251729.954330] Call Trace: [Thu Dec 12 04:12:30 2019][251729.956906] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 04:12:30 2019][251729.963903] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 04:12:30 2019][251729.971106] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 04:12:30 2019][251729.977754] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 04:12:30 2019][251729.984518] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 04:12:30 2019][251729.992045] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 04:12:30 2019][251729.999155] [] dqget+0x3fa/0x450 [Thu Dec 12 04:12:30 2019][251730.004161] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 04:12:30 2019][251730.009934] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 04:12:30 2019][251730.017559] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 04:12:30 2019][251730.024035] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 04:12:30 2019][251730.030179] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:12:30 2019][251730.037242] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:12:30 2019][251730.045057] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:12:30 2019][251730.051475] [] kthread+0xd1/0xe0 [Thu Dec 12 04:12:30 2019][251730.056488] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:12:30 2019][251730.063054] [] 0xffffffffffffffff [Thu Dec 12 04:12:30 2019][251730.068166] LustreError: dumping log to /tmp/lustre-log.1576152750.67801 [Thu Dec 12 04:12:42 2019][251742.232048] Pid: 31047, comm: ll_ost02_101 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:12:42 2019][251742.242567] Call Trace: [Thu Dec 12 04:12:42 2019][251742.245122] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 04:12:42 2019][251742.252163] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 04:12:42 2019][251742.259472] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 04:12:42 2019][251742.266125] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 04:12:42 2019][251742.272525] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:12:42 2019][251742.279565] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:12:42 
2019][251742.287382] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:12:42 2019][251742.293798] [] kthread+0xd1/0xe0 [Thu Dec 12 04:12:42 2019][251742.298797] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:12:42 2019][251742.305367] [] 0xffffffffffffffff [Thu Dec 12 04:12:42 2019][251742.310474] LustreError: dumping log to /tmp/lustre-log.1576152762.31047 [Thu Dec 12 04:12:42 2019][251742.317918] Pid: 66107, comm: ll_ost02_000 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:12:42 2019][251742.328448] Call Trace: [Thu Dec 12 04:12:42 2019][251742.330992] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 04:12:42 2019][251742.338016] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 04:12:42 2019][251742.345304] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 04:12:42 2019][251742.351951] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 04:12:42 2019][251742.358352] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:12:42 2019][251742.365386] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:12:42 2019][251742.373199] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:12:42 2019][251742.379618] [] kthread+0xd1/0xe0 [Thu Dec 12 04:12:42 2019][251742.384621] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:12:42 2019][251742.391201] [] 0xffffffffffffffff [Thu Dec 12 04:12:42 2019][251742.396310] LNet: Service thread pid 67844 was inactive for 1204.37s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 04:13:14 2019][251773.929968] Lustre: fir-OST0058: deleting orphan objects from 0x0:27507140 to 0x0:27507169 [Thu Dec 12 04:13:27 2019][251787.288941] LustreError: dumping log to /tmp/lustre-log.1576152807.67924 [Thu Dec 12 04:13:48 2019][251808.049072] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785679 to 0x1800000401:11785697 [Thu Dec 12 04:13:49 2019][251809.249153] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049259 to 0x1a00000401:3049281 [Thu Dec 12 04:14:08 2019][251828.393886] Lustre: fir-OST0054: Client 016cfe19-2250-799b-d8ad-887e11d25409 (at 10.8.30.32@o2ib6) reconnecting [Thu Dec 12 04:14:08 2019][251828.404063] Lustre: Skipped 1157 previous similar messages [Thu Dec 12 04:14:50 2019][251869.835282] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105859 to 0x1a00000402:1105889 [Thu Dec 12 04:15:04 2019][251884.440856] Lustre: 29439:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/-150), not sending early reply [Thu Dec 12 04:15:04 2019][251884.440856] req@ffff88e6a228c050 x1649533346266960/t0(0) o101->3d41db68-318d-91b5-b35b-0dc2e1801091@10.9.114.11@o2ib4:564/0 lens 328/0 e 0 to 0 dl 1576152909 ref 2 fl New:/0/ffffffff rc 0/-1 [Thu Dec 12 04:15:04 2019][251884.470099] Lustre: 29439:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 2434 previous similar messages [Thu Dec 12 04:15:17 2019][251896.924393] Lustre: fir-OST0054: deleting orphan objects from 0x0:27458083 to 0x0:27458145 [Thu Dec 12 04:15:29 2019][251908.876888] Lustre: fir-OST0054: Connection restored to 9a91b993-1399-1978-f4a8-fbbdfe7e9dbc (at 10.9.105.36@o2ib4) [Thu Dec 12 04:15:29 2019][251908.887413] Lustre: Skipped 1157 previous similar messages [Thu Dec 12 04:15:30 2019][251910.003072] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842474 to 0x1980000401:11842497 [Thu Dec 12 04:16:38 2019][251978.275300] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 
10.8.22.17@o2ib6 [Thu Dec 12 04:16:38 2019][251978.284262] Lustre: Skipped 70 previous similar messages [Thu Dec 12 04:17:10 2019][252010.469034] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797774 to 0x1900000401:11797793 [Thu Dec 12 04:17:12 2019][252011.925153] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086245 to 0x1900000402:3086273 [Thu Dec 12 04:17:41 2019][252040.906257] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082437 to 0x1a80000402:3082465 [Thu Dec 12 04:17:57 2019][252057.386556] Lustre: fir-OST005a: deleting orphan objects from 0x0:27562499 to 0x0:27562529 [Thu Dec 12 04:18:20 2019][252080.022519] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090150 to 0x1980000402:3090177 [Thu Dec 12 04:18:22 2019][252082.206789] LNet: Service thread pid 31053 was inactive for 1200.89s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 04:18:22 2019][252082.223897] LNet: Skipped 8 previous similar messages [Thu Dec 12 04:18:22 2019][252082.229046] Pid: 31053, comm: ll_ost02_102 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:18:22 2019][252082.239581] Call Trace: [Thu Dec 12 04:18:22 2019][252082.242130] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 04:18:22 2019][252082.248525] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:18:22 2019][252082.255599] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:18:22 2019][252082.263396] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:18:22 2019][252082.269824] [] kthread+0xd1/0xe0 [Thu Dec 12 04:18:22 2019][252082.274829] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:18:22 2019][252082.281403] [] 0xffffffffffffffff [Thu Dec 12 04:18:22 2019][252082.286513] LustreError: dumping log to /tmp/lustre-log.1576153102.31053 [Thu Dec 12 04:18:34 2019][252094.495042] Pid: 67920, comm: ll_ost02_064 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:18:34 2019][252094.505558] Call Trace: [Thu Dec 12 04:18:34 2019][252094.508127] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 04:18:34 2019][252094.515123] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 04:18:34 2019][252094.522342] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 04:18:34 2019][252094.529010] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 04:18:34 2019][252094.535770] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 04:18:34 2019][252094.543294] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 04:18:34 2019][252094.550403] [] dqget+0x3fa/0x450 [Thu Dec 12 04:18:34 2019][252094.555408] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 04:18:34 2019][252094.561189] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 04:18:34 2019][252094.568808] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 04:18:34 2019][252094.575282] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 04:18:34 2019][252094.581436] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:18:34 2019][252094.588480] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:18:34 2019][252094.596296] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:18:34 2019][252094.602713] [] kthread+0xd1/0xe0 [Thu Dec 12 04:18:34 2019][252094.607729] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:18:34 2019][252094.614292] [] 0xffffffffffffffff [Thu Dec 12 04:18:34 2019][252094.619408] 
LustreError: dumping log to /tmp/lustre-log.1576153114.67920 [Thu Dec 12 04:18:46 2019][252106.783291] Pid: 31124, comm: ll_ost02_105 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:18:46 2019][252106.793809] Call Trace: [Thu Dec 12 04:18:46 2019][252106.796384] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 04:18:46 2019][252106.803389] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 04:18:46 2019][252106.810589] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 04:18:46 2019][252106.817240] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 04:18:46 2019][252106.823989] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 04:18:47 2019][252106.831513] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 04:18:47 2019][252106.838610] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 04:18:47 2019][252106.844826] [] ofd_destroy+0x5d0/0x960 [ofd] [Thu Dec 12 04:18:47 2019][252106.850880] [] ofd_destroy_by_fid+0x1f4/0x4a0 [ofd] [Thu Dec 12 04:18:47 2019][252106.857532] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 04:18:47 2019][252106.863934] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:18:47 2019][252106.870998] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:18:47 2019][252106.878815] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:18:47 2019][252106.885231] [] kthread+0xd1/0xe0 [Thu Dec 12 04:18:47 2019][252106.890233] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:18:47 2019][252106.896800] [] 0xffffffffffffffff [Thu Dec 12 04:18:47 2019][252106.901900] LustreError: dumping log to /tmp/lustre-log.1576153127.31124 [Thu Dec 12 04:19:08 2019][252127.880420] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122501 to 0x1a80000401:1122529 [Thu Dec 12 04:20:19 2019][252199.053140] LustreError: 31187:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576152919, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0058_UUID lock: ffff890945339200/0x7066c9c190b76cd9 lrc: 3/0,1 mode: --/PW res: [0x1a3b9c2:0x0:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 31187 timeout: 0 lvb_type: 0 [Thu Dec 12 04:20:41 2019][252220.921297] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3070984 to 0x1800000400:3071009 [Thu Dec 12 04:20:53 2019][252233.618531] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124739 to 0x1900000400:1124769 [Thu Dec 12 04:21:01 2019][252240.858691] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117798 to 0x1800000402:1117825 [Thu Dec 12 04:21:42 2019][252281.863168] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126179 to 0x1980000400:1126209 [Thu Dec 12 04:22:11 2019][252311.587397] Pid: 31102, comm: ll_ost02_103 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:22:11 2019][252311.597913] Call Trace: [Thu Dec 12 04:22:11 2019][252311.600483] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 04:22:11 2019][252311.607480] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 04:22:11 2019][252311.614676] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 04:22:11 2019][252311.621312] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 04:22:11 2019][252311.628051] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 04:22:11 2019][252311.635576] [] 
osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 04:22:11 2019][252311.642671] [] tgt_client_data_update+0x303/0x5e0 [ptlrpc] [Thu Dec 12 04:22:11 2019][252311.649982] [] tgt_client_new+0x41b/0x610 [ptlrpc] [Thu Dec 12 04:22:11 2019][252311.656582] [] ofd_obd_connect+0x3a3/0x4c0 [ofd] [Thu Dec 12 04:22:11 2019][252311.662973] [] target_handle_connect+0xecb/0x2b10 [ptlrpc] [Thu Dec 12 04:22:11 2019][252311.670260] [] tgt_request_handle+0x50a/0x1580 [ptlrpc] [Thu Dec 12 04:22:11 2019][252311.677283] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:22:11 2019][252311.685109] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:22:11 2019][252311.691521] [] kthread+0xd1/0xe0 [Thu Dec 12 04:22:11 2019][252311.696536] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:22:11 2019][252311.703101] [] 0xffffffffffffffff [Thu Dec 12 04:22:11 2019][252311.708213] LustreError: dumping log to /tmp/lustre-log.1576153331.31102 [Thu Dec 12 04:22:32 2019][252332.067802] Pid: 113356, comm: ll_ost02_094 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:22:32 2019][252332.078409] Call Trace: [Thu Dec 12 04:22:32 2019][252332.080982] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 04:22:32 2019][252332.087976] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 04:22:32 2019][252332.095161] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 04:22:32 2019][252332.101821] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 04:22:32 2019][252332.108574] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 04:22:32 2019][252332.116098] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 04:22:32 2019][252332.123194] [] tgt_client_data_update+0x303/0x5e0 [ptlrpc] [Thu Dec 12 04:22:32 2019][252332.130503] [] tgt_client_new+0x41b/0x610 [ptlrpc] [Thu Dec 12 04:22:32 2019][252332.137104] [] ofd_obd_connect+0x3a3/0x4c0 [ofd] [Thu Dec 12 04:22:32 2019][252332.143496] [] target_handle_connect+0xecb/0x2b10 [ptlrpc] [Thu Dec 12 04:22:32 2019][252332.150766] [] tgt_request_handle+0x50a/0x1580 [ptlrpc] [Thu Dec 12 04:22:32 2019][252332.157801] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:22:32 2019][252332.165603] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:22:32 2019][252332.172047] [] kthread+0xd1/0xe0 [Thu Dec 12 04:22:32 2019][252332.177052] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:22:32 2019][252332.183629] [] 0xffffffffffffffff [Thu Dec 12 04:22:32 2019][252332.188729] LustreError: dumping log to /tmp/lustre-log.1576153352.113356 [Thu Dec 12 04:22:42 2019][252342.235601] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800880 to 0x1a80000400:11800897 [Thu Dec 12 04:24:09 2019][252428.973807] Lustre: fir-OST0054: Client 016cfe19-2250-799b-d8ad-887e11d25409 (at 10.8.30.32@o2ib6) reconnecting [Thu Dec 12 04:24:09 2019][252428.983982] Lustre: Skipped 1246 previous similar messages [Thu Dec 12 04:24:31 2019][252450.854153] Pid: 31229, comm: ll_ost02_109 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:24:31 2019][252450.864670] Call Trace: [Thu Dec 12 04:24:31 2019][252450.867221] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 04:24:31 2019][252450.873621] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:24:31 2019][252450.880697] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:24:31 2019][252450.888504] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:24:31 2019][252450.894928] [] kthread+0xd1/0xe0 
[Thu Dec 12 04:24:31 2019][252450.899934] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:24:31 2019][252450.906509] [] 0xffffffffffffffff [Thu Dec 12 04:24:31 2019][252450.911622] LustreError: dumping log to /tmp/lustre-log.1576153471.31229 [Thu Dec 12 04:24:35 2019][252454.950234] Pid: 112523, comm: ll_ost01_083 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:24:35 2019][252454.960837] Call Trace: [Thu Dec 12 04:24:35 2019][252454.963391] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 04:24:35 2019][252454.969781] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:24:35 2019][252454.976860] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:24:35 2019][252454.984661] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:24:35 2019][252454.991088] [] kthread+0xd1/0xe0 [Thu Dec 12 04:24:35 2019][252454.996109] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:24:35 2019][252455.002685] [] 0xffffffffffffffff [Thu Dec 12 04:24:35 2019][252455.007795] LustreError: dumping log to /tmp/lustre-log.1576153475.112523 [Thu Dec 12 04:24:50 2019][252470.263219] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105893 to 0x1a00000402:1105921 [Thu Dec 12 04:25:05 2019][252485.198845] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/-150), not sending early reply [Thu Dec 12 04:25:05 2019][252485.198845] req@ffff88f17fb1c050 x1649533345480736/t0(0) o3->3d41db68-318d-91b5-b35b-0dc2e1801091@10.9.114.11@o2ib4:410/0 lens 488/0 e 0 to 0 dl 1576153510 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 04:25:05 2019][252485.227918] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 2588 previous similar messages [Thu Dec 12 04:25:29 2019][252509.350433] Lustre: fir-OST005a: Connection restored to cc43915b-6aa0-7796-18f9-1827e6f9b899 (at 10.8.18.12@o2ib6) [Thu Dec 12 04:25:29 2019][252509.360884] Lustre: Skipped 1225 previous similar messages [Thu Dec 12 04:26:05 2019][252545.064043] Pid: 67806, comm: ll_ost03_052 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:26:05 2019][252545.074566] Call Trace: [Thu Dec 12 04:26:05 2019][252545.077125] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 04:26:05 2019][252545.083525] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:26:05 2019][252545.090590] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:26:05 2019][252545.098388] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:26:05 2019][252545.104817] [] kthread+0xd1/0xe0 [Thu Dec 12 04:26:05 2019][252545.109820] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:26:05 2019][252545.116396] [] 0xffffffffffffffff [Thu Dec 12 04:26:05 2019][252545.121504] LustreError: dumping log to /tmp/lustre-log.1576153565.67806 [Thu Dec 12 04:26:09 2019][252549.160129] Pid: 66106, comm: ll_ost01_002 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:26:09 2019][252549.170646] Call Trace: [Thu Dec 12 04:26:09 2019][252549.173224] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 04:26:09 2019][252549.180237] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 04:26:09 2019][252549.187436] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 04:26:09 2019][252549.194088] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 04:26:09 2019][252549.200852] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 04:26:09 2019][252549.208379] [] ldiskfs_acquire_dquot+0x53/0xb0 
[ldiskfs] [Thu Dec 12 04:26:09 2019][252549.215489] [] dqget+0x3fa/0x450 [Thu Dec 12 04:26:09 2019][252549.220496] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 04:26:09 2019][252549.226277] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 04:26:09 2019][252549.233901] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 04:26:09 2019][252549.240385] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 04:26:09 2019][252549.246528] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:26:09 2019][252549.253594] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:26:09 2019][252549.261408] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:26:09 2019][252549.267840] [] kthread+0xd1/0xe0 [Thu Dec 12 04:26:09 2019][252549.272860] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:26:09 2019][252549.279437] [] 0xffffffffffffffff [Thu Dec 12 04:26:09 2019][252549.284550] LustreError: dumping log to /tmp/lustre-log.1576153569.66106 [Thu Dec 12 04:26:25 2019][252564.992177] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049288 to 0x1a00000401:3049313 [Thu Dec 12 04:26:43 2019][252583.815269] Lustre: fir-OST0056: Export ffff88e42919e800 already connecting from 10.8.22.22@o2ib6 [Thu Dec 12 04:26:43 2019][252583.824232] Lustre: Skipped 52 previous similar messages [Thu Dec 12 04:27:42 2019][252642.493198] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082476 to 0x1a80000402:3082497 [Thu Dec 12 04:28:21 2019][252681.298446] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090185 to 0x1980000402:3090209 [Thu Dec 12 04:29:48 2019][252768.052197] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086284 to 0x1900000402:3086305 [Thu Dec 12 04:31:00 2019][252839.981857] LNet: Service thread pid 31123 was inactive for 1201.63s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 04:31:00 2019][252839.998986] LNet: Skipped 8 previous similar messages [Thu Dec 12 04:31:00 2019][252840.004146] Pid: 31123, comm: ll_ost02_104 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:31:00 2019][252840.014682] Call Trace: [Thu Dec 12 04:31:00 2019][252840.017237] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 04:31:00 2019][252840.023650] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:31:00 2019][252840.030729] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:31:00 2019][252840.038551] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:31:00 2019][252840.045016] [] kthread+0xd1/0xe0 [Thu Dec 12 04:31:00 2019][252840.050017] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:31:00 2019][252840.056592] [] 0xffffffffffffffff [Thu Dec 12 04:31:00 2019][252840.061701] LustreError: dumping log to /tmp/lustre-log.1576153860.31123 [Thu Dec 12 04:31:16 2019][252856.366183] Pid: 67923, comm: ll_ost01_074 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:31:16 2019][252856.376705] Call Trace: [Thu Dec 12 04:31:16 2019][252856.379283] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 04:31:16 2019][252856.386286] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 04:31:16 2019][252856.393472] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 04:31:16 2019][252856.400120] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 04:31:16 2019][252856.406869] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 04:31:16 2019][252856.414394] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 04:31:16 2019][252856.421489] [] dqget+0x3fa/0x450 [Thu Dec 12 04:31:16 2019][252856.426493] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 04:31:16 2019][252856.432287] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 04:31:16 2019][252856.439917] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 04:31:16 2019][252856.446428] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 04:31:16 2019][252856.452556] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:31:16 2019][252856.459615] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:31:16 2019][252856.467418] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:31:16 2019][252856.473847] [] kthread+0xd1/0xe0 [Thu Dec 12 04:31:16 2019][252856.478849] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:31:16 2019][252856.485425] [] 0xffffffffffffffff [Thu Dec 12 04:31:16 2019][252856.490527] LustreError: dumping log to /tmp/lustre-log.1576153876.67923 [Thu Dec 12 04:31:43 2019][252883.283076] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126212 to 0x1980000400:1126241 [Thu Dec 12 04:31:44 2019][252884.007411] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122531 to 0x1a80000401:1122561 [Thu Dec 12 04:33:17 2019][252976.984385] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071021 to 0x1800000400:3071041 [Thu Dec 12 04:33:29 2019][252989.049543] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124773 to 0x1900000400:1124801 [Thu Dec 12 04:33:37 2019][252996.873667] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117830 to 0x1800000402:1117857 [Thu Dec 12 04:34:09 2019][253029.118883] Lustre: fir-OST0054: Client c681c8c8-a3bd-4f09-2cf4-358a58ae71d2 (at 10.9.117.22@o2ib4) reconnecting [Thu Dec 12 04:34:09 2019][253029.129148] Lustre: Skipped 1290 previous 
similar messages [Thu Dec 12 04:34:37 2019][253057.074156] Pid: 67690, comm: ll_ost01_036 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:34:37 2019][253057.084673] Call Trace: [Thu Dec 12 04:34:37 2019][253057.087233] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 04:34:37 2019][253057.093633] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:34:37 2019][253057.100706] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:34:37 2019][253057.108524] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:34:37 2019][253057.114951] [] kthread+0xd1/0xe0 [Thu Dec 12 04:34:37 2019][253057.119957] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:34:37 2019][253057.126538] [] 0xffffffffffffffff [Thu Dec 12 04:34:37 2019][253057.131650] LustreError: dumping log to /tmp/lustre-log.1576154077.67690 [Thu Dec 12 04:34:51 2019][253071.051174] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105925 to 0x1a00000402:1105953 [Thu Dec 12 04:35:05 2019][253085.282734] Lustre: 27223:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/-150), not sending early reply [Thu Dec 12 04:35:05 2019][253085.282734] req@ffff891cbfca1850 x1649559496365360/t0(0) o4->fe16bc49-4bbe-dc30-a069-fee92bf3e984@10.9.104.23@o2ib4:255/0 lens 488/0 e 0 to 0 dl 1576154110 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 04:35:05 2019][253085.311801] Lustre: 27223:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 2776 previous similar messages [Thu Dec 12 04:35:22 2019][253102.131056] Pid: 31187, comm: ll_ost02_106 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:35:22 2019][253102.141578] Call Trace: [Thu Dec 12 04:35:22 2019][253102.144133] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 04:35:22 2019][253102.151173] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 04:35:22 2019][253102.158467] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 04:35:22 2019][253102.165118] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 04:35:22 2019][253102.171519] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:35:22 2019][253102.178560] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:35:22 2019][253102.186375] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:35:22 2019][253102.192790] [] kthread+0xd1/0xe0 [Thu Dec 12 04:35:22 2019][253102.197790] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:35:22 2019][253102.204368] [] 0xffffffffffffffff [Thu Dec 12 04:35:22 2019][253102.209484] LustreError: dumping log to /tmp/lustre-log.1576154122.31187 [Thu Dec 12 04:35:30 2019][253110.571598] Lustre: fir-OST005a: Connection restored to cc43915b-6aa0-7796-18f9-1827e6f9b899 (at 10.8.18.12@o2ib6) [Thu Dec 12 04:35:30 2019][253110.582033] Lustre: Skipped 1317 previous similar messages [Thu Dec 12 04:36:07 2019][253147.187951] Pid: 112553, comm: ll_ost03_076 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:36:07 2019][253147.198557] Call Trace: [Thu Dec 12 04:36:07 2019][253147.201119] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 04:36:07 2019][253147.207516] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:36:07 2019][253147.214581] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:36:07 2019][253147.222381] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:36:07 2019][253147.228810] [] kthread+0xd1/0xe0 [Thu Dec 12 04:36:07 2019][253147.233812] [] ret_from_fork_nospec_begin+0xe/0x21 
[Thu Dec 12 04:36:07 2019][253147.240389] [] 0xffffffffffffffff [Thu Dec 12 04:36:07 2019][253147.245497] LustreError: dumping log to /tmp/lustre-log.1576154167.112553 [Thu Dec 12 04:37:07 2019][253207.103912] Lustre: fir-OST0056: Export ffff88f2fa374c00 already connecting from 10.9.113.13@o2ib4 [Thu Dec 12 04:37:07 2019][253207.112958] Lustre: Skipped 52 previous similar messages [Thu Dec 12 04:37:08 2019][253208.629166] Pid: 30965, comm: ll_ost02_100 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:37:08 2019][253208.639692] Call Trace: [Thu Dec 12 04:37:08 2019][253208.642250] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 04:37:08 2019][253208.648649] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:37:08 2019][253208.655709] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:37:08 2019][253208.663511] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:37:08 2019][253208.669940] [] kthread+0xd1/0xe0 [Thu Dec 12 04:37:08 2019][253208.674943] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:37:08 2019][253208.681518] [] 0xffffffffffffffff [Thu Dec 12 04:37:08 2019][253208.686629] LustreError: dumping log to /tmp/lustre-log.1576154228.30965 [Thu Dec 12 04:37:29 2019][253229.109573] Pid: 31223, comm: ll_ost02_108 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:37:29 2019][253229.120088] Call Trace: [Thu Dec 12 04:37:29 2019][253229.122641] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 04:37:29 2019][253229.129680] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 04:37:29 2019][253229.136972] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 04:37:29 2019][253229.143625] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 04:37:29 2019][253229.150027] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:37:29 2019][253229.157070] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:37:29 2019][253229.164909] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:37:29 2019][253229.171327] [] kthread+0xd1/0xe0 [Thu Dec 12 04:37:29 2019][253229.176343] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:37:29 2019][253229.182923] [] 0xffffffffffffffff [Thu Dec 12 04:37:29 2019][253229.188044] LustreError: dumping log to /tmp/lustre-log.1576154249.31223 [Thu Dec 12 04:37:43 2019][253243.121158] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082505 to 0x1a80000402:3082529 [Thu Dec 12 04:38:22 2019][253282.670429] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090212 to 0x1980000402:3090241 [Thu Dec 12 04:39:01 2019][253320.967211] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049319 to 0x1a00000401:3049345 [Thu Dec 12 04:39:07 2019][253327.415533] Pid: 67886, comm: ll_ost03_063 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:39:07 2019][253327.426055] Call Trace: [Thu Dec 12 04:39:07 2019][253327.428617] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 04:39:07 2019][253327.435655] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 04:39:07 2019][253327.442951] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 04:39:07 2019][253327.449625] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 04:39:07 2019][253327.456029] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:39:07 2019][253327.463068] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:39:07 2019][253327.470885] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu 
Dec 12 04:39:07 2019][253327.477299] [] kthread+0xd1/0xe0 [Thu Dec 12 04:39:07 2019][253327.482301] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:39:07 2019][253327.488870] [] 0xffffffffffffffff [Thu Dec 12 04:39:07 2019][253327.493979] LustreError: dumping log to /tmp/lustre-log.1576154347.67886 [Thu Dec 12 04:39:11 2019][253331.511606] Pid: 67851, comm: ll_ost03_057 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:39:11 2019][253331.522125] Call Trace: [Thu Dec 12 04:39:11 2019][253331.524677] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 04:39:11 2019][253331.531721] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 04:39:11 2019][253331.539026] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 04:39:11 2019][253331.545682] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 04:39:11 2019][253331.552085] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:39:11 2019][253331.559123] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:39:11 2019][253331.566938] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:39:11 2019][253331.573353] [] kthread+0xd1/0xe0 [Thu Dec 12 04:39:11 2019][253331.578357] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:39:11 2019][253331.584924] [] 0xffffffffffffffff [Thu Dec 12 04:39:11 2019][253331.590032] LustreError: dumping log to /tmp/lustre-log.1576154351.67851 [Thu Dec 12 04:41:44 2019][253484.480326] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126243 to 0x1980000400:1126273 [Thu Dec 12 04:41:55 2019][253495.354881] LNet: Service thread pid 112556 was inactive for 1200.75s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 04:41:55 2019][253495.372075] LNet: Skipped 8 previous similar messages [Thu Dec 12 04:41:55 2019][253495.377224] Pid: 112556, comm: ll_ost01_091 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:41:55 2019][253495.387850] Call Trace: [Thu Dec 12 04:41:55 2019][253495.390425] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 04:41:55 2019][253495.397420] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 04:41:55 2019][253495.404606] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 04:41:55 2019][253495.411253] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 04:41:55 2019][253495.418002] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 04:41:55 2019][253495.425527] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 04:41:55 2019][253495.432632] [] tgt_client_data_update+0x303/0x5e0 [ptlrpc] [Thu Dec 12 04:41:55 2019][253495.439941] [] tgt_client_new+0x41b/0x610 [ptlrpc] [Thu Dec 12 04:41:55 2019][253495.446551] [] ofd_obd_connect+0x3a3/0x4c0 [ofd] [Thu Dec 12 04:41:55 2019][253495.452941] [] target_handle_connect+0xecb/0x2b10 [ptlrpc] [Thu Dec 12 04:41:55 2019][253495.460237] [] tgt_request_handle+0x50a/0x1580 [ptlrpc] [Thu Dec 12 04:41:55 2019][253495.467284] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:41:55 2019][253495.475101] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:41:55 2019][253495.481517] [] kthread+0xd1/0xe0 [Thu Dec 12 04:41:55 2019][253495.486532] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:41:55 2019][253495.493094] [] 0xffffffffffffffff [Thu Dec 12 04:41:55 2019][253495.498208] LustreError: dumping log to /tmp/lustre-log.1576154515.112556 [Thu Dec 12 04:42:24 2019][253524.163207] Lustre: fir-OST0058: 
deleting orphan objects from 0x1900000402:3086309 to 0x1900000402:3086337 [Thu Dec 12 04:42:38 2019][253538.363714] Pid: 67761, comm: ll_ost01_047 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:42:38 2019][253538.374236] Call Trace: [Thu Dec 12 04:42:38 2019][253538.376815] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 04:42:38 2019][253538.383812] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 04:42:38 2019][253538.391011] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 04:42:38 2019][253538.397661] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 04:42:38 2019][253538.404409] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 04:42:38 2019][253538.411935] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 04:42:38 2019][253538.419040] [] dqget+0x3fa/0x450 [Thu Dec 12 04:42:38 2019][253538.424041] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 04:42:38 2019][253538.429823] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 04:42:38 2019][253538.437449] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 04:42:38 2019][253538.443934] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 04:42:38 2019][253538.450075] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:42:38 2019][253538.457137] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:42:38 2019][253538.464955] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:42:38 2019][253538.471371] [] kthread+0xd1/0xe0 [Thu Dec 12 04:42:38 2019][253538.476386] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:42:38 2019][253538.482952] [] 0xffffffffffffffff [Thu Dec 12 04:42:38 2019][253538.488063] LustreError: dumping log to /tmp/lustre-log.1576154558.67761 [Thu Dec 12 04:43:09 2019][253569.084314] Pid: 31222, comm: ll_ost02_107 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:43:09 2019][253569.094833] Call Trace: [Thu Dec 12 04:43:09 2019][253569.097395] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 04:43:09 2019][253569.104432] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 04:43:09 2019][253569.111728] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 04:43:09 2019][253569.118377] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 04:43:09 2019][253569.124780] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:43:09 2019][253569.131820] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:43:09 2019][253569.139634] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:43:09 2019][253569.146050] [] kthread+0xd1/0xe0 [Thu Dec 12 04:43:09 2019][253569.151052] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:43:09 2019][253569.157619] [] 0xffffffffffffffff [Thu Dec 12 04:43:09 2019][253569.162727] LustreError: dumping log to /tmp/lustre-log.1576154589.31222 [Thu Dec 12 04:44:06 2019][253626.429452] Pid: 31584, comm: ll_ost03_081 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:44:06 2019][253626.439970] Call Trace: [Thu Dec 12 04:44:06 2019][253626.442521] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 04:44:06 2019][253626.449552] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 04:44:06 2019][253626.456859] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 04:44:06 2019][253626.463507] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 04:44:06 2019][253626.469908] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:44:06 2019][253626.476950] [] 
ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:44:06 2019][253626.484763] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:44:06 2019][253626.491180] [] kthread+0xd1/0xe0 [Thu Dec 12 04:44:06 2019][253626.496180] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:44:06 2019][253626.502749] [] 0xffffffffffffffff [Thu Dec 12 04:44:06 2019][253626.507858] LustreError: dumping log to /tmp/lustre-log.1576154646.31584 [Thu Dec 12 04:44:09 2019][253629.377852] Lustre: fir-OST0054: Client a31c4d05-c2c1-d128-e70d-4b9b8b78ea7d (at 10.8.8.22@o2ib6) reconnecting [Thu Dec 12 04:44:09 2019][253629.387943] Lustre: Skipped 1351 previous similar messages [Thu Dec 12 04:44:20 2019][253640.534429] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122531 to 0x1a80000401:1122593 [Thu Dec 12 04:44:52 2019][253672.655064] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105956 to 0x1a00000402:1105985 [Thu Dec 12 04:45:06 2019][253685.854640] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 04:45:06 2019][253685.854640] req@ffff88f0e8215050 x1649533154028048/t0(0) o4->63828333-ab85-9660-a339-05c4e4362ad0@10.9.102.13@o2ib4:100/0 lens 840/0 e 0 to 0 dl 1576154710 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 04:45:06 2019][253685.883448] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 2924 previous similar messages [Thu Dec 12 04:45:30 2019][253710.854945] Lustre: fir-OST0054: Connection restored to (at 10.9.107.70@o2ib4) [Thu Dec 12 04:45:30 2019][253710.862364] Lustre: Skipped 1354 previous similar messages [Thu Dec 12 04:45:53 2019][253733.239358] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071051 to 0x1800000400:3071073 [Thu Dec 12 04:46:05 2019][253745.224524] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124805 to 0x1900000400:1124833 [Thu Dec 12 04:46:09 2019][253749.311879] Pid: 67490, comm: ll_ost03_005 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:46:09 2019][253749.322398] Call Trace: [Thu Dec 12 04:46:09 2019][253749.324951] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 04:46:09 2019][253749.331350] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:46:09 2019][253749.338414] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:46:09 2019][253749.346230] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:46:09 2019][253749.352659] [] kthread+0xd1/0xe0 [Thu Dec 12 04:46:09 2019][253749.357663] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:46:09 2019][253749.364238] [] 0xffffffffffffffff [Thu Dec 12 04:46:09 2019][253749.369348] LustreError: dumping log to /tmp/lustre-log.1576154769.67490 [Thu Dec 12 04:46:13 2019][253753.440679] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117859 to 0x1800000402:1117889 [Thu Dec 12 04:46:54 2019][253794.368771] LNet: Service thread pid 66252 was inactive for 1201.94s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
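The stack dumps above fall into three recurring families: ll_ost threads in ofd_create_hdl, destroys blocked in ldlm_completion_ast via ofd_destroy_by_fid (matching the "lock timed out ... type: EXT" messages), and quota/connect paths blocked in jbd2's wait_transaction_locked. A minimal triage sketch follows; it is not part of the capture, assumes the console output has been saved to a placeholder file named console.log, and relies only on frame names and messages that appear in the log itself.

```python
#!/usr/bin/env python3
# Bucket each watchdog stack dump ("Pid: NNN, comm: ll_ost...") by the first
# distinctive frame in its trace, then list the binary debug dumps written by
# "LustreError: dumping log to /tmp/lustre-log.<epoch>.<pid>".
# "console.log" is a placeholder path for a saved copy of this capture.
import re
from collections import Counter
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

SIGNATURES = [
    "wait_transaction_locked",  # jbd2 transaction start blocked (quota / tgt_client_new paths)
    "ldlm_completion_ast",      # ofd_destroy_hdl waiting on an EXT lock
    "ofd_create_hdl",           # OST object-create handler
]

text = open("console.log").read()

# One chunk per dump: everything from one "Pid: ..., comm:" header to the next.
counts = Counter()
for chunk in re.split(r"Pid: \d+, comm: ", text)[1:]:
    counts[next((s for s in SIGNATURES if s in chunk), "other")] += 1
for sig, n in counts.most_common():
    print(f"{n:4d}  {sig}")

# The epoch in each dump filename matches the bracketed wall-clock stamp;
# this capture's timestamps are US/Pacific (the kernel banner says PST),
# e.g. 1576153471 <-> Thu Dec 12 04:24:31 2019.
tz = ZoneInfo("America/Los_Angeles")
for path, epoch, pid in re.findall(r"dumping log to (/tmp/lustre-log\.(\d+)\.(\d+))", text):
    print(datetime.fromtimestamp(int(epoch), tz), "pid", pid, path)
```

The /tmp/lustre-log.<epoch>.<pid> files named in these messages are binary Lustre debug logs left on the OSS; they can usually be converted to text with `lctl debug_file <dump> <outfile>`.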
[Thu Dec 12 04:46:54 2019][253794.381803] LNet: Skipped 1 previous similar message [Thu Dec 12 04:46:54 2019][253794.386866] LustreError: dumping log to /tmp/lustre-log.1576154814.66252 [Thu Dec 12 04:47:13 2019][253813.151776] Lustre: fir-OST0056: Export ffff88e42919e800 already connecting from 10.8.22.22@o2ib6 [Thu Dec 12 04:47:13 2019][253813.160739] Lustre: Skipped 45 previous similar messages [Thu Dec 12 04:47:15 2019][253814.849177] Pid: 67792, comm: ll_ost01_052 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:47:15 2019][253814.859697] Call Trace: [Thu Dec 12 04:47:15 2019][253814.862247] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 04:47:15 2019][253814.868640] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:47:15 2019][253814.875700] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:47:15 2019][253814.883502] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:47:15 2019][253814.889930] [] kthread+0xd1/0xe0 [Thu Dec 12 04:47:15 2019][253814.894935] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:47:15 2019][253814.901514] [] 0xffffffffffffffff [Thu Dec 12 04:47:15 2019][253814.906620] LustreError: dumping log to /tmp/lustre-log.1576154835.67792 [Thu Dec 12 04:47:23 2019][253823.169346] LustreError: 112570:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576154543, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005e_UUID lock: ffff8921e7397bc0/0x7066c9c190b7be78 lrc: 3/0,1 mode: --/PW res: [0x1a80000402:0x2f08e2:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112570 timeout: 0 lvb_type: 0 [Thu Dec 12 04:47:23 2019][253823.213239] LustreError: 112570:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 5 previous similar messages [Thu Dec 12 04:47:44 2019][253844.461075] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082533 to 0x1a80000402:3082561 [Thu Dec 12 04:48:23 2019][253883.274330] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090249 to 0x1980000402:3090273 [Thu Dec 12 04:51:37 2019][254076.886196] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049352 to 0x1a00000401:3049377 [Thu Dec 12 04:51:46 2019][254085.994919] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126275 to 0x1980000400:1126305 [Thu Dec 12 04:54:09 2019][254229.549668] Lustre: fir-OST005a: Client 84451726-da5e-16d9-ee63-43bfe8a9f835 (at 10.8.27.27@o2ib6) reconnecting [Thu Dec 12 04:54:09 2019][254229.559856] Lustre: Skipped 1342 previous similar messages [Thu Dec 12 04:54:53 2019][254273.346973] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105988 to 0x1a00000402:1106017 [Thu Dec 12 04:55:00 2019][254279.954169] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086345 to 0x1900000402:3086369 [Thu Dec 12 04:55:07 2019][254287.146557] Lustre: 31346:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/-746), not sending early reply [Thu Dec 12 04:55:07 2019][254287.146557] req@ffff8905584ca050 x1651830967511744/t0(0) o101->12e2c9b6-7b56-574c-526a-a98d62b67a85@10.9.103.31@o2ib4:702/0 lens 328/0 e 0 to 0 dl 1576155312 ref 2 fl New:/0/ffffffff rc 0/-1 [Thu Dec 12 04:55:07 2019][254287.175802] Lustre: 31346:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 2938 previous similar messages [Thu Dec 12 04:55:31 
2019][254310.911146] Lustre: fir-OST0056: Connection restored to 35ba350a-bccc-3fd9-39f0-a94eca80785d (at 10.9.107.33@o2ib4) [Thu Dec 12 04:55:31 2019][254310.921678] Lustre: Skipped 1367 previous similar messages [Thu Dec 12 04:56:56 2019][254396.757393] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122598 to 0x1a80000401:1122625 [Thu Dec 12 04:57:26 2019][254426.144873] Lustre: fir-OST0056: Export ffff891b8f069c00 already connecting from 10.8.22.14@o2ib6 [Thu Dec 12 04:57:26 2019][254426.153831] Lustre: Skipped 46 previous similar messages [Thu Dec 12 04:57:45 2019][254445.425164] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082576 to 0x1a80000402:3082593 [Thu Dec 12 04:58:24 2019][254484.534247] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090289 to 0x1980000402:3090305 [Thu Dec 12 04:58:29 2019][254489.254320] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071085 to 0x1800000400:3071105 [Thu Dec 12 04:58:41 2019][254501.327465] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124836 to 0x1900000400:1124865 [Thu Dec 12 04:58:47 2019][254507.086903] LNet: Service thread pid 31585 was inactive for 1203.21s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 04:58:47 2019][254507.104023] LNet: Skipped 5 previous similar messages [Thu Dec 12 04:58:47 2019][254507.109198] Pid: 31585, comm: ll_ost03_082 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:58:47 2019][254507.119729] Call Trace: [Thu Dec 12 04:58:47 2019][254507.122308] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 04:58:47 2019][254507.128704] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:58:47 2019][254507.135776] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:58:47 2019][254507.143597] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:58:47 2019][254507.150026] [] kthread+0xd1/0xe0 [Thu Dec 12 04:58:47 2019][254507.155040] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:58:47 2019][254507.161604] [] 0xffffffffffffffff [Thu Dec 12 04:58:47 2019][254507.166742] LustreError: dumping log to /tmp/lustre-log.1576155527.31585 [Thu Dec 12 04:58:49 2019][254508.943599] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117891 to 0x1800000402:1117921 [Thu Dec 12 04:59:48 2019][254568.528133] Pid: 67650, comm: ll_ost01_024 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 04:59:48 2019][254568.538655] Call Trace: [Thu Dec 12 04:59:48 2019][254568.541207] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 04:59:48 2019][254568.547607] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 04:59:48 2019][254568.554675] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 04:59:48 2019][254568.562481] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 04:59:48 2019][254568.568923] [] kthread+0xd1/0xe0 [Thu Dec 12 04:59:48 2019][254568.573930] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 04:59:48 2019][254568.580505] [] 0xffffffffffffffff [Thu Dec 12 04:59:48 2019][254568.585623] LustreError: dumping log to /tmp/lustre-log.1576155588.67650 [Thu Dec 12 05:00:45 2019][254625.807252] LustreError: 32063:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576155345, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005a_UUID lock: 
ffff8921e7392d00/0x7066c9c190b7e266 lrc: 3/0,1 mode: --/PW res: [0x1980000402:0x2f2766:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 32063 timeout: 0 lvb_type: 0 [Thu Dec 12 05:01:48 2019][254688.070865] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126307 to 0x1980000400:1126337 [Thu Dec 12 05:02:24 2019][254724.179194] Pid: 112570, comm: ll_ost03_080 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:02:24 2019][254724.189795] Call Trace: [Thu Dec 12 05:02:24 2019][254724.192349] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 05:02:24 2019][254724.199390] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 05:02:24 2019][254724.206703] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 05:02:24 2019][254724.213351] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 05:02:24 2019][254724.219754] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:02:24 2019][254724.226801] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:02:24 2019][254724.234601] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:02:24 2019][254724.241030] [] kthread+0xd1/0xe0 [Thu Dec 12 05:02:24 2019][254724.246034] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:02:24 2019][254724.252594] [] 0xffffffffffffffff [Thu Dec 12 05:02:24 2019][254724.257701] LustreError: dumping log to /tmp/lustre-log.1576155744.112570 [Thu Dec 12 05:04:03 2019][254823.335168] LustreError: 31675:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576155543, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005a_UUID lock: ffff8910f9018000/0x7066c9c190b7ecbc lrc: 3/0,1 mode: --/PW res: [0x1980000402:0x2f276e:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 31675 timeout: 0 lvb_type: 0 [Thu Dec 12 05:04:09 2019][254829.642746] Lustre: fir-OST0058: Client be4565a9-8448-ebff-ec7a-065a9a83593c (at 10.8.18.19@o2ib6) reconnecting [Thu Dec 12 05:04:09 2019][254829.652928] Lustre: Skipped 1327 previous similar messages [Thu Dec 12 05:04:13 2019][254832.909252] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049392 to 0x1a00000401:3049409 [Thu Dec 12 05:04:54 2019][254874.310856] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1106023 to 0x1a00000402:1106049 [Thu Dec 12 05:05:07 2019][254887.438428] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/-150), not sending early reply [Thu Dec 12 05:05:07 2019][254887.438428] req@ffff88ebc1f82050 x1649561036324336/t0(0) o4->c1504d4c-7504-c251-de3c-6f26c7b8e7d5@10.9.102.26@o2ib4:547/0 lens 488/0 e 0 to 0 dl 1576155912 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 05:05:07 2019][254887.467507] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 2840 previous similar messages [Thu Dec 12 05:05:31 2019][254911.448127] Lustre: fir-OST005a: Connection restored to 97102c2b-e0e2-553a-c933-88dc912145da (at 10.9.115.11@o2ib4) [Thu Dec 12 05:05:31 2019][254911.458650] Lustre: Skipped 1325 previous similar messages [Thu Dec 12 05:07:28 2019][255028.269015] Lustre: fir-OST0056: Export ffff891b8f069c00 already connecting from 10.8.22.14@o2ib6 [Thu Dec 12 05:07:28 2019][255028.277986] Lustre: Skipped 57 previous similar messages [Thu Dec 12 05:07:36 
2019][255036.513165] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086374 to 0x1900000402:3086401 [Thu Dec 12 05:07:46 2019][255046.397080] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082601 to 0x1a80000402:3082625 [Thu Dec 12 05:08:26 2019][255086.578210] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090311 to 0x1980000402:3090337 [Thu Dec 12 05:08:49 2019][255109.210803] LNet: Service thread pid 31710 was inactive for 1203.31s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 05:08:49 2019][255109.227932] LNet: Skipped 2 previous similar messages [Thu Dec 12 05:08:49 2019][255109.233077] Pid: 31710, comm: ll_ost03_084 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:08:49 2019][255109.243625] Call Trace: [Thu Dec 12 05:08:49 2019][255109.246177] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 05:08:49 2019][255109.252579] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:08:49 2019][255109.259674] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:08:49 2019][255109.267495] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:08:49 2019][255109.273924] [] kthread+0xd1/0xe0 [Thu Dec 12 05:08:49 2019][255109.278927] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:08:49 2019][255109.285503] [] 0xffffffffffffffff [Thu Dec 12 05:08:49 2019][255109.290613] LustreError: dumping log to /tmp/lustre-log.1576156129.31710 [Thu Dec 12 05:09:33 2019][255152.980336] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122628 to 0x1a80000401:1122657 [Thu Dec 12 05:11:05 2019][255244.925272] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071109 to 0x1800000400:3071137 [Thu Dec 12 05:11:17 2019][255256.982391] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124870 to 0x1900000400:1124897 [Thu Dec 12 05:11:25 2019][255265.014509] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117928 to 0x1800000402:1117953 [Thu Dec 12 05:11:49 2019][255289.031011] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126345 to 0x1980000400:1126369 [Thu Dec 12 05:12:26 2019][255326.303078] Pid: 67646, comm: ll_ost01_023 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:12:26 2019][255326.313599] Call Trace: [Thu Dec 12 05:12:26 2019][255326.316150] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 05:12:26 2019][255326.322552] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:12:26 2019][255326.329616] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:12:26 2019][255326.337432] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:12:26 2019][255326.343862] [] kthread+0xd1/0xe0 [Thu Dec 12 05:12:26 2019][255326.348864] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:12:26 2019][255326.355439] [] 0xffffffffffffffff [Thu Dec 12 05:12:26 2019][255326.360549] LustreError: dumping log to /tmp/lustre-log.1576156346.67646 [Thu Dec 12 05:13:28 2019][255388.256292] Pid: 32390, comm: ll_ost03_092 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:13:28 2019][255388.266814] Call Trace: [Thu Dec 12 05:13:28 2019][255388.269384] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 05:13:28 2019][255388.276380] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 05:13:28 2019][255388.283581] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 
05:13:28 2019][255388.290230] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 05:13:28 2019][255388.296979] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 05:13:28 2019][255388.304505] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 05:13:28 2019][255388.311600] [] dqget+0x3fa/0x450 [Thu Dec 12 05:13:28 2019][255388.316603] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 05:13:28 2019][255388.322383] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 05:13:28 2019][255388.330002] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 05:13:28 2019][255388.336486] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 05:13:28 2019][255388.342626] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:13:28 2019][255388.349678] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:13:28 2019][255388.357490] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:13:28 2019][255388.363909] [] kthread+0xd1/0xe0 [Thu Dec 12 05:13:28 2019][255388.368923] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:13:28 2019][255388.375487] [] 0xffffffffffffffff [Thu Dec 12 05:13:28 2019][255388.380601] LustreError: dumping log to /tmp/lustre-log.1576156408.32390 [Thu Dec 12 05:14:09 2019][255429.859079] Lustre: fir-OST005c: Client 9ff24344-feb6-8c0e-cb07-a92244c00aa4 (at 10.9.101.21@o2ib4) reconnecting [Thu Dec 12 05:14:09 2019][255429.869362] Lustre: Skipped 1453 previous similar messages [Thu Dec 12 05:14:55 2019][255475.154679] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1106051 to 0x1a00000402:1106081 [Thu Dec 12 05:15:08 2019][255487.942260] Lustre: 29548:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/-150), not sending early reply [Thu Dec 12 05:15:08 2019][255487.942260] req@ffff88f1bfb8f050 x1652161380591696/t0(0) o19->ae1d0080-04fa-5436-e145-ffdf0db9990d@10.0.10.3@o2ib7:393/0 lens 336/0 e 0 to 0 dl 1576156513 ref 2 fl New:/0/ffffffff rc 0/-1 [Thu Dec 12 05:15:08 2019][255487.971246] Lustre: 29548:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 3088 previous similar messages [Thu Dec 12 05:15:31 2019][255511.680306] Lustre: fir-OST0058: Connection restored to c2d505e5-30ab-23ac-7017-17a83f00b35d (at 10.9.102.49@o2ib4) [Thu Dec 12 05:15:31 2019][255511.690832] Lustre: Skipped 1442 previous similar messages [Thu Dec 12 05:15:47 2019][255527.011028] Pid: 32063, comm: ll_ost03_087 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:15:47 2019][255527.021549] Call Trace: [Thu Dec 12 05:15:47 2019][255527.024102] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 05:15:47 2019][255527.031141] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 05:15:47 2019][255527.038449] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 05:15:47 2019][255527.045112] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 05:15:47 2019][255527.051514] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:15:47 2019][255527.058548] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:15:47 2019][255527.066361] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:15:47 2019][255527.072779] [] kthread+0xd1/0xe0 [Thu Dec 12 05:15:47 2019][255527.077779] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:15:47 2019][255527.084348] [] 0xffffffffffffffff [Thu Dec 12 05:15:47 2019][255527.089457] LustreError: dumping log to /tmp/lustre-log.1576156547.32063 [Thu Dec 12 05:16:49 2019][255588.964029] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049415 
to 0x1a00000401:3049441 [Thu Dec 12 05:17:36 2019][255636.261890] Lustre: fir-OST0056: Export ffff8902f7fafc00 already connecting from 10.8.22.24@o2ib6 [Thu Dec 12 05:17:36 2019][255636.270853] Lustre: Skipped 44 previous similar messages [Thu Dec 12 05:17:47 2019][255647.000876] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082631 to 0x1a80000402:3082657 [Thu Dec 12 05:18:28 2019][255688.286025] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090343 to 0x1980000402:3090369 [Thu Dec 12 05:18:30 2019][255690.854256] Pid: 67747, comm: ll_ost01_046 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:18:30 2019][255690.864772] Call Trace: [Thu Dec 12 05:18:30 2019][255690.867348] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 05:18:30 2019][255690.874346] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 05:18:30 2019][255690.881531] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 05:18:30 2019][255690.888180] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 05:18:30 2019][255690.894929] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 05:18:31 2019][255690.902469] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 05:18:31 2019][255690.909574] [] dqget+0x3fa/0x450 [Thu Dec 12 05:18:31 2019][255690.914578] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 05:18:31 2019][255690.920351] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 05:18:31 2019][255690.927977] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 05:18:31 2019][255690.934450] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 05:18:31 2019][255690.940594] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:18:31 2019][255690.947643] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:18:31 2019][255690.955457] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:18:31 2019][255690.961873] [] kthread+0xd1/0xe0 [Thu Dec 12 05:18:31 2019][255690.966901] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:18:31 2019][255690.973462] [] 0xffffffffffffffff [Thu Dec 12 05:18:31 2019][255690.978573] LustreError: dumping log to /tmp/lustre-log.1576156711.67747 [Thu Dec 12 05:19:03 2019][255723.622908] LNet: Service thread pid 31675 was inactive for 1200.26s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 05:19:03 2019][255723.640018] LNet: Skipped 4 previous similar messages [Thu Dec 12 05:19:03 2019][255723.645166] Pid: 31675, comm: ll_ost03_083 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:19:03 2019][255723.655704] Call Trace: [Thu Dec 12 05:19:03 2019][255723.658262] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 05:19:03 2019][255723.665296] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 05:19:03 2019][255723.672592] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 05:19:03 2019][255723.679241] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 05:19:03 2019][255723.685644] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:19:03 2019][255723.692683] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:19:03 2019][255723.700498] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:19:03 2019][255723.706914] [] kthread+0xd1/0xe0 [Thu Dec 12 05:19:03 2019][255723.711930] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:19:03 2019][255723.718493] [] 0xffffffffffffffff [Thu Dec 12 05:19:03 2019][255723.723617] LustreError: dumping log to /tmp/lustre-log.1576156743.31675 [Thu Dec 12 05:19:04 2019][255723.879912] LustreError: 31953:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576156443, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0054_UUID lock: ffff8910f901d580/0x7066c9c190b83b91 lrc: 3/0,1 mode: --/PW res: [0x1800000400:0x2edc82:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 31953 timeout: 0 lvb_type: 0 [Thu Dec 12 05:19:04 2019][255723.879914] LustreError: 32246:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576156443, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005e_UUID lock: ffff89045a9b7500/0x7066c9c190b83b9f lrc: 3/0,1 mode: --/PW res: [0x1a80000402:0x2f0966:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 32246 timeout: 0 lvb_type: 0 [Thu Dec 12 05:19:04 2019][255723.879917] LustreError: 32246:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 1 previous similar message [Thu Dec 12 05:19:04 2019][255723.978244] LustreError: 31953:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 1 previous similar message [Thu Dec 12 05:19:11 2019][255731.815065] Pid: 31785, comm: ll_ost03_085 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:19:11 2019][255731.825581] Call Trace: [Thu Dec 12 05:19:11 2019][255731.828143] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 05:19:11 2019][255731.835173] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 05:19:11 2019][255731.842463] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 05:19:11 2019][255731.849109] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 05:19:11 2019][255731.855513] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:19:11 2019][255731.862550] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:19:11 2019][255731.870366] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:19:11 2019][255731.876783] [] kthread+0xd1/0xe0 [Thu Dec 12 05:19:11 2019][255731.881783] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 
05:19:12 2019][255731.888352] [] 0xffffffffffffffff [Thu Dec 12 05:19:12 2019][255731.893458] LustreError: dumping log to /tmp/lustre-log.1576156751.31785 [Thu Dec 12 05:20:13 2019][255792.880018] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086406 to 0x1900000402:3086433 [Thu Dec 12 05:21:27 2019][255866.985733] Pid: 67734, comm: ll_ost03_038 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:21:27 2019][255866.996266] Call Trace: [Thu Dec 12 05:21:27 2019][255866.998817] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 05:21:27 2019][255867.005251] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:21:27 2019][255867.012353] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:21:27 2019][255867.020177] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:21:27 2019][255867.026621] [] kthread+0xd1/0xe0 [Thu Dec 12 05:21:27 2019][255867.031625] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:21:27 2019][255867.038211] [] 0xffffffffffffffff [Thu Dec 12 05:21:27 2019][255867.043335] LustreError: dumping log to /tmp/lustre-log.1576156887.67734 [Thu Dec 12 05:21:50 2019][255890.398680] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126371 to 0x1980000400:1126401 [Thu Dec 12 05:22:09 2019][255909.043280] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122663 to 0x1a80000401:1122689 [Thu Dec 12 05:23:41 2019][256000.956173] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071146 to 0x1800000400:3071169 [Thu Dec 12 05:23:53 2019][256012.981329] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124901 to 0x1900000400:1124929 [Thu Dec 12 05:24:01 2019][256021.109512] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117956 to 0x1800000402:1117985 [Thu Dec 12 05:24:10 2019][256030.725939] Lustre: fir-OST005c: Client 9ff24344-feb6-8c0e-cb07-a92244c00aa4 (at 10.9.101.21@o2ib4) reconnecting [Thu Dec 12 05:24:10 2019][256030.736199] Lustre: Skipped 1446 previous similar messages [Thu Dec 12 05:24:56 2019][256076.118574] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1106051 to 0x1a00000402:1106113 [Thu Dec 12 05:25:04 2019][256084.078018] Pid: 112548, comm: ll_ost01_090 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:25:04 2019][256084.088619] Call Trace: [Thu Dec 12 05:25:04 2019][256084.091174] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 05:25:04 2019][256084.097574] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:25:04 2019][256084.104637] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:25:04 2019][256084.112436] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:25:04 2019][256084.118865] [] kthread+0xd1/0xe0 [Thu Dec 12 05:25:04 2019][256084.123867] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:25:04 2019][256084.130445] [] 0xffffffffffffffff [Thu Dec 12 05:25:04 2019][256084.135570] LustreError: dumping log to /tmp/lustre-log.1576157104.112548 [Thu Dec 12 05:25:08 2019][256088.408112] Lustre: 27223:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/-127), not sending early reply [Thu Dec 12 05:25:08 2019][256088.408112] req@ffff891c225b5050 x1649050252742160/t0(0) o4->935b75df-613a-c7ad-95b7-8cbfb8326a67@10.9.101.28@o2ib4:238/0 lens 8632/0 e 0 to 0 dl 1576157113 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 05:25:08 2019][256088.437336] Lustre: 27223:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 3071 
previous similar messages [Thu Dec 12 05:25:32 2019][256111.956162] Lustre: fir-OST0058: Connection restored to c2d505e5-30ab-23ac-7017-17a83f00b35d (at 10.9.102.49@o2ib4) [Thu Dec 12 05:25:32 2019][256111.966686] Lustre: Skipped 1444 previous similar messages [Thu Dec 12 05:27:38 2019][256238.385631] Lustre: fir-OST0056: Export ffff8902f7fafc00 already connecting from 10.8.22.24@o2ib6 [Thu Dec 12 05:27:38 2019][256238.394595] Lustre: Skipped 44 previous similar messages [Thu Dec 12 05:27:48 2019][256248.364716] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082661 to 0x1a80000402:3082689 [Thu Dec 12 05:28:29 2019][256289.801883] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090383 to 0x1980000402:3090401 [Thu Dec 12 05:29:07 2019][256327.794829] LNet: Service thread pid 112564 was inactive for 200.17s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 05:29:07 2019][256327.811934] LNet: Skipped 3 previous similar messages [Thu Dec 12 05:29:07 2019][256327.817083] Pid: 112564, comm: ll_ost01_094 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:29:07 2019][256327.827705] Call Trace: [Thu Dec 12 05:29:07 2019][256327.830272] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 05:29:07 2019][256327.837264] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 05:29:07 2019][256327.844446] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 05:29:07 2019][256327.851096] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 05:29:07 2019][256327.857846] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 05:29:07 2019][256327.865387] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 05:29:07 2019][256327.872484] [] dqget+0x3fa/0x450 [Thu Dec 12 05:29:07 2019][256327.877488] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 05:29:07 2019][256327.883259] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 05:29:07 2019][256327.890886] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 05:29:07 2019][256327.897361] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 05:29:08 2019][256327.903503] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:29:08 2019][256327.910550] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:29:08 2019][256327.918366] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:29:08 2019][256327.924785] [] kthread+0xd1/0xe0 [Thu Dec 12 05:29:08 2019][256327.929784] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:29:08 2019][256327.936370] [] 0xffffffffffffffff [Thu Dec 12 05:29:08 2019][256327.941469] LustreError: dumping log to /tmp/lustre-log.1576157348.112564 [Thu Dec 12 05:29:18 2019][256338.547041] Pid: 67706, comm: ll_ost01_038 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:29:18 2019][256338.557555] Call Trace: [Thu Dec 12 05:29:18 2019][256338.560116] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 05:29:18 2019][256338.567111] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 05:29:18 2019][256338.574295] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 05:29:18 2019][256338.580947] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 05:29:18 2019][256338.587695] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 05:29:18 2019][256338.595236] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 05:29:18 2019][256338.602333] [] dqget+0x3fa/0x450 [Thu Dec 12 05:29:18 2019][256338.607336] 
[] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 05:29:18 2019][256338.613123] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 05:29:18 2019][256338.620734] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 05:29:18 2019][256338.627224] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 05:29:18 2019][256338.633353] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:29:18 2019][256338.640407] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:29:18 2019][256338.648209] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:29:18 2019][256338.654638] [] kthread+0xd1/0xe0 [Thu Dec 12 05:29:18 2019][256338.659657] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:29:18 2019][256338.666241] [] 0xffffffffffffffff [Thu Dec 12 05:29:18 2019][256338.671334] LustreError: dumping log to /tmp/lustre-log.1576157358.67706 [Thu Dec 12 05:29:25 2019][256345.187961] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049453 to 0x1a00000401:3049473 [Thu Dec 12 05:31:51 2019][256491.006782] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126408 to 0x1980000400:1126433 [Thu Dec 12 05:32:48 2019][256548.063021] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086449 to 0x1900000402:3086465 [Thu Dec 12 05:33:42 2019][256602.880081] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071175 to 0x1800000400:3071201 [Thu Dec 12 05:34:00 2019][256620.664623] Pid: 67635, comm: ll_ost03_017 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:34:00 2019][256620.675138] Call Trace: [Thu Dec 12 05:34:00 2019][256620.677698] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 05:34:00 2019][256620.684096] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:34:00 2019][256620.691164] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:34:00 2019][256620.698969] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:34:00 2019][256620.705399] [] kthread+0xd1/0xe0 [Thu Dec 12 05:34:00 2019][256620.710402] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:34:00 2019][256620.716978] [] 0xffffffffffffffff [Thu Dec 12 05:34:00 2019][256620.722102] LustreError: dumping log to /tmp/lustre-log.1576157640.67635 [Thu Dec 12 05:34:04 2019][256624.760708] Pid: 32190, comm: ll_ost03_088 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:34:04 2019][256624.771247] Call Trace: [Thu Dec 12 05:34:04 2019][256624.773805] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 05:34:04 2019][256624.780846] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 05:34:04 2019][256624.788156] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 05:34:04 2019][256624.794807] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 05:34:04 2019][256624.801210] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:34:04 2019][256624.808249] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:34:04 2019][256624.816063] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:34:04 2019][256624.822481] [] kthread+0xd1/0xe0 [Thu Dec 12 05:34:04 2019][256624.827482] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:34:04 2019][256624.834059] [] 0xffffffffffffffff [Thu Dec 12 05:34:04 2019][256624.839166] LustreError: dumping log to /tmp/lustre-log.1576157644.32190 [Thu Dec 12 05:34:04 2019][256624.846566] Pid: 32246, comm: ll_ost03_090 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:34:04 2019][256624.857098] Call Trace: 
[Thu Dec 12 05:34:04 2019][256624.859651] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 05:34:04 2019][256624.866672] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 05:34:04 2019][256624.873960] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 05:34:04 2019][256624.880609] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 05:34:04 2019][256624.887011] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:34:04 2019][256624.894049] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:34:04 2019][256624.901869] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:34:04 2019][256624.908283] [] kthread+0xd1/0xe0 [Thu Dec 12 05:34:04 2019][256624.913298] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:34:05 2019][256624.919853] [] 0xffffffffffffffff [Thu Dec 12 05:34:05 2019][256624.924958] LNet: Service thread pid 31953 was inactive for 1201.02s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 05:34:11 2019][256630.937376] Lustre: fir-OST005c: Client 55158f7b-59ce-2f71-9169-48a899507185 (at 10.9.117.3@o2ib4) reconnecting [Thu Dec 12 05:34:11 2019][256630.947554] Lustre: Skipped 1437 previous similar messages [Thu Dec 12 05:34:45 2019][256665.218237] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122693 to 0x1a80000401:1122721 [Thu Dec 12 05:34:57 2019][256677.306476] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1106117 to 0x1a00000402:1106145 [Thu Dec 12 05:35:06 2019][256686.201926] Pid: 67658, comm: ll_ost01_027 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:35:06 2019][256686.212444] Call Trace: [Thu Dec 12 05:35:06 2019][256686.214996] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 05:35:06 2019][256686.221395] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:35:06 2019][256686.228460] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:35:06 2019][256686.236260] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:35:06 2019][256686.242689] [] kthread+0xd1/0xe0 [Thu Dec 12 05:35:06 2019][256686.247693] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:35:06 2019][256686.254270] [] 0xffffffffffffffff [Thu Dec 12 05:35:06 2019][256686.259376] LustreError: dumping log to /tmp/lustre-log.1576157706.67658 [Thu Dec 12 05:35:09 2019][256688.925990] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 05:35:09 2019][256688.925990] req@ffff88fa20a6a050 x1649516933178880/t0(0) o10->c8791ada-b652-f6ba-7581-332c648ad12e@10.9.103.49@o2ib4:83/0 lens 440/0 e 0 to 0 dl 1576157713 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 05:35:09 2019][256688.954795] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 3096 previous similar messages [Thu Dec 12 05:35:33 2019][256713.288237] Lustre: fir-OST0058: Connection restored to c2d505e5-30ab-23ac-7017-17a83f00b35d (at 10.9.102.49@o2ib4) [Thu Dec 12 05:35:33 2019][256713.298767] Lustre: Skipped 1456 previous similar messages [Thu Dec 12 05:36:29 2019][256768.972297] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124932 to 0x1900000400:1124961 [Thu Dec 12 05:36:37 2019][256777.332433] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117988 to 0x1800000402:1118017 [Thu Dec 12 05:37:40 2019][256840.509671] Lustre: fir-OST0056: Export ffff8902f7fafc00 already connecting from 10.8.22.24@o2ib6 [Thu Dec 12 05:37:40 2019][256840.518660] Lustre: Skipped 44 
previous similar messages [Thu Dec 12 05:37:49 2019][256849.424672] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082700 to 0x1a80000402:3082721 [Thu Dec 12 05:38:30 2019][256890.197793] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090413 to 0x1980000402:3090433 [Thu Dec 12 05:41:52 2019][257092.038358] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126436 to 0x1980000400:1126465 [Thu Dec 12 05:42:01 2019][257101.161968] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049489 to 0x1a00000401:3049505 [Thu Dec 12 05:43:43 2019][257203.499957] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071208 to 0x1800000400:3071233 [Thu Dec 12 05:44:02 2019][257222.788550] LNet: Service thread pid 32513 was inactive for 1200.81s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 05:44:02 2019][257222.805655] LNet: Skipped 5 previous similar messages [Thu Dec 12 05:44:02 2019][257222.810802] Pid: 32513, comm: ll_ost03_094 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:44:02 2019][257222.821338] Call Trace: [Thu Dec 12 05:44:02 2019][257222.823888] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 05:44:02 2019][257222.830281] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:44:02 2019][257222.837342] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:44:02 2019][257222.845144] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:44:02 2019][257222.851574] [] kthread+0xd1/0xe0 [Thu Dec 12 05:44:02 2019][257222.856576] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:44:02 2019][257222.863153] [] 0xffffffffffffffff [Thu Dec 12 05:44:02 2019][257222.868261] LustreError: dumping log to /tmp/lustre-log.1576158242.32513 [Thu Dec 12 05:44:11 2019][257231.013228] Lustre: fir-OST0058: Client a5aa8a03-126c-c3c2-1aed-484f379dc83d (at 10.9.105.72@o2ib4) reconnecting [Thu Dec 12 05:44:11 2019][257231.023489] Lustre: Skipped 1454 previous similar messages [Thu Dec 12 05:44:58 2019][257278.742355] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1106149 to 0x1a00000402:1106177 [Thu Dec 12 05:45:09 2019][257289.029868] Lustre: 27223:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 05:45:09 2019][257289.029868] req@ffff891f457ea050 x1649516933178880/t0(0) o10->c8791ada-b652-f6ba-7581-332c648ad12e@10.9.103.49@o2ib4:684/0 lens 440/0 e 0 to 0 dl 1576158314 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 05:45:09 2019][257289.058772] Lustre: 27223:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 3145 previous similar messages [Thu Dec 12 05:45:24 2019][257304.093938] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086471 to 0x1900000402:3086497 [Thu Dec 12 05:45:26 2019][257306.295267] LustreError: 66071:0:(ldlm_lockd.c:256:expired_lock_main()) ### lock callback timer expired after 750s: evicting client at 10.9.101.54@o2ib4 ns: filter-fir-OST005c_UUID lock: ffff88fb25939200/0x7066c9c190b7797b lrc: 3/0,0 mode: PR/PR res: [0x1a00000400:0xb1d139:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->134217727) flags: 0x60000400000020 nid: 10.9.101.54@o2ib4 remote: 0xc717f25cc11d22e8 expref: 46 pid: 67851 timeout: 256700 lvb_type: 1 [Thu Dec 12 05:45:34 2019][257314.516914] Lustre: fir-OST0058: Connection restored to c2d505e5-30ab-23ac-7017-17a83f00b35d (at 10.9.102.49@o2ib4) [Thu Dec 12 
05:45:34 2019][257314.527434] Lustre: Skipped 1474 previous similar messages [Thu Dec 12 05:47:21 2019][257421.441152] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122726 to 0x1a80000401:1122753 [Thu Dec 12 05:47:42 2019][257442.633734] Lustre: fir-OST0056: Export ffff8902f7fafc00 already connecting from 10.8.22.24@o2ib6 [Thu Dec 12 05:47:42 2019][257442.642702] Lustre: Skipped 55 previous similar messages [Thu Dec 12 05:47:44 2019][257443.976915] Pid: 112580, comm: ll_ost01_095 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:47:44 2019][257443.987515] Call Trace: [Thu Dec 12 05:47:44 2019][257443.990068] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 05:47:44 2019][257443.996471] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:47:44 2019][257444.003541] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:47:44 2019][257444.011339] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:47:44 2019][257444.017769] [] kthread+0xd1/0xe0 [Thu Dec 12 05:47:44 2019][257444.022772] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:47:44 2019][257444.029345] [] 0xffffffffffffffff [Thu Dec 12 05:47:44 2019][257444.034456] LustreError: dumping log to /tmp/lustre-log.1576158464.112580 [Thu Dec 12 05:47:50 2019][257450.340533] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082733 to 0x1a80000402:3082753 [Thu Dec 12 05:48:31 2019][257491.217646] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090441 to 0x1980000402:3090465 [Thu Dec 12 05:49:05 2019][257525.515222] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124970 to 0x1900000400:1124993 [Thu Dec 12 05:49:13 2019][257533.107354] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1118022 to 0x1800000402:1118049 [Thu Dec 12 05:53:27 2019][257786.933250] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126469 to 0x1980000400:1126497 [Thu Dec 12 05:53:44 2019][257804.823804] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071242 to 0x1800000400:3071265 [Thu Dec 12 05:54:11 2019][257831.264131] Lustre: fir-OST0058: Client 4380447f-ca0d-2b3f-165d-69d1c6bd4d88 (at 10.9.108.8@o2ib4) reconnecting [Thu Dec 12 05:54:11 2019][257831.274336] Lustre: Skipped 1436 previous similar messages [Thu Dec 12 05:54:37 2019][257857.144848] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049514 to 0x1a00000401:3049537 [Thu Dec 12 05:54:59 2019][257879.250192] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1106149 to 0x1a00000402:1106209 [Thu Dec 12 05:55:09 2019][257889.131709] Lustre: 29548:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/-150), not sending early reply [Thu Dec 12 05:55:09 2019][257889.131709] req@ffff88ecd07af050 x1649447764852256/t0(0) o17->34b263e7-c235-6737-be01-1bc0ec67d622@10.9.117.33@o2ib4:529/0 lens 456/0 e 0 to 0 dl 1576158914 ref 2 fl New:/0/ffffffff rc 0/-1 [Thu Dec 12 05:55:09 2019][257889.160861] Lustre: 29548:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 3017 previous similar messages [Thu Dec 12 05:55:34 2019][257914.913479] Lustre: fir-OST0056: Connection restored to fb9a2d5e-e9b3-4fb9-b988-9954fcfb0920 (at 10.8.0.66@o2ib6) [Thu Dec 12 05:55:34 2019][257914.923838] Lustre: Skipped 1418 previous similar messages [Thu Dec 12 05:55:55 2019][257935.506626] LNet: Service thread pid 67674 was inactive for 1201.22s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 05:55:55 2019][257935.523741] LNet: Skipped 1 previous similar message [Thu Dec 12 05:55:55 2019][257935.528805] Pid: 67674, comm: ll_ost01_031 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:55:55 2019][257935.539336] Call Trace: [Thu Dec 12 05:55:55 2019][257935.541907] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 05:55:55 2019][257935.548903] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 05:55:55 2019][257935.556086] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 05:55:55 2019][257935.562735] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 05:55:55 2019][257935.569484] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 05:55:55 2019][257935.577009] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 05:55:55 2019][257935.584105] [] dqget+0x3fa/0x450 [Thu Dec 12 05:55:55 2019][257935.589108] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 05:55:55 2019][257935.594880] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 05:55:55 2019][257935.602522] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 05:55:55 2019][257935.609000] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 05:55:55 2019][257935.615142] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:55:55 2019][257935.622198] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:55:55 2019][257935.630023] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:55:55 2019][257935.636441] [] kthread+0xd1/0xe0 [Thu Dec 12 05:55:55 2019][257935.641438] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:55:55 2019][257935.648016] [] 0xffffffffffffffff [Thu Dec 12 05:55:55 2019][257935.653132] LustreError: dumping log to /tmp/lustre-log.1576158955.67674 [Thu Dec 12 05:56:40 2019][257980.563536] Pid: 32762, comm: ll_ost03_098 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:56:40 2019][257980.574053] Call Trace: [Thu Dec 12 05:56:40 2019][257980.576604] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 05:56:40 2019][257980.582998] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:56:40 2019][257980.590063] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:56:40 2019][257980.597862] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:56:40 2019][257980.604292] [] kthread+0xd1/0xe0 [Thu Dec 12 05:56:40 2019][257980.609292] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:56:40 2019][257980.615871] [] 0xffffffffffffffff [Thu Dec 12 05:56:40 2019][257980.620977] LustreError: dumping log to /tmp/lustre-log.1576159000.32762 [Thu Dec 12 05:57:09 2019][258009.236102] Pid: 32728, comm: ll_ost01_096 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:57:09 2019][258009.246623] Call Trace: [Thu Dec 12 05:57:09 2019][258009.249195] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 05:57:09 2019][258009.256192] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 05:57:09 2019][258009.263377] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 05:57:09 2019][258009.270023] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 05:57:09 2019][258009.276771] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 05:57:09 2019][258009.284314] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 05:57:09 2019][258009.291397] [] dqget+0x3fa/0x450 [Thu Dec 12 05:57:09 2019][258009.296424] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 05:57:09 2019][258009.302205] [] 
osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 05:57:09 2019][258009.309828] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 05:57:09 2019][258009.316303] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 05:57:09 2019][258009.322446] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:57:09 2019][258009.329495] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:57:09 2019][258009.337310] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:57:09 2019][258009.343727] [] kthread+0xd1/0xe0 [Thu Dec 12 05:57:09 2019][258009.348742] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:57:09 2019][258009.355306] [] 0xffffffffffffffff [Thu Dec 12 05:57:09 2019][258009.360418] LustreError: dumping log to /tmp/lustre-log.1576159029.32728 [Thu Dec 12 05:57:46 2019][258046.100835] Pid: 32889, comm: ll_ost01_099 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:57:46 2019][258046.111360] Call Trace: [Thu Dec 12 05:57:46 2019][258046.113914] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 05:57:46 2019][258046.120303] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:57:46 2019][258046.127370] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:57:46 2019][258046.135176] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:57:46 2019][258046.141605] [] kthread+0xd1/0xe0 [Thu Dec 12 05:57:46 2019][258046.146607] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:57:46 2019][258046.153168] [] 0xffffffffffffffff [Thu Dec 12 05:57:46 2019][258046.158289] LustreError: dumping log to /tmp/lustre-log.1576159066.32889 [Thu Dec 12 05:57:51 2019][258051.264401] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082757 to 0x1a80000402:3082785 [Thu Dec 12 05:58:00 2019][258060.317872] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086503 to 0x1900000402:3086529 [Thu Dec 12 05:58:01 2019][258061.771983] Lustre: fir-OST0056: Export ffff8912f1e21000 already connecting from 10.8.25.17@o2ib6 [Thu Dec 12 05:58:01 2019][258061.780943] Lustre: Skipped 64 previous similar messages [Thu Dec 12 05:58:32 2019][258092.445632] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090473 to 0x1980000402:3090497 [Thu Dec 12 05:59:57 2019][258177.175943] Pid: 32512, comm: ll_ost03_093 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 05:59:57 2019][258177.186459] Call Trace: [Thu Dec 12 05:59:57 2019][258177.189029] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 05:59:57 2019][258177.196035] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 05:59:57 2019][258177.203236] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 05:59:57 2019][258177.209884] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 05:59:57 2019][258177.216632] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 05:59:57 2019][258177.224158] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 05:59:57 2019][258177.231253] [] dqget+0x3fa/0x450 [Thu Dec 12 05:59:57 2019][258177.236257] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 05:59:57 2019][258177.242030] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 05:59:57 2019][258177.249655] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 05:59:57 2019][258177.256140] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 05:59:57 2019][258177.262291] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 05:59:57 2019][258177.269355] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 05:59:57 
2019][258177.277171] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 05:59:57 2019][258177.283587] [] kthread+0xd1/0xe0 [Thu Dec 12 05:59:57 2019][258177.288601] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 05:59:57 2019][258177.295166] [] 0xffffffffffffffff [Thu Dec 12 05:59:57 2019][258177.300279] LustreError: dumping log to /tmp/lustre-log.1576159197.32512 [Thu Dec 12 05:59:57 2019][258177.664094] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122756 to 0x1a80000401:1122785 [Thu Dec 12 06:01:15 2019][258255.000980] Pid: 32917, comm: ll_ost01_100 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:01:15 2019][258255.011502] Call Trace: [Thu Dec 12 06:01:15 2019][258255.014074] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 06:01:15 2019][258255.021069] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 06:01:15 2019][258255.028270] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 06:01:15 2019][258255.034918] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 06:01:15 2019][258255.041667] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 06:01:15 2019][258255.049194] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 06:01:15 2019][258255.056298] [] tgt_client_data_update+0x303/0x5e0 [ptlrpc] [Thu Dec 12 06:01:15 2019][258255.063605] [] tgt_client_new+0x41b/0x610 [ptlrpc] [Thu Dec 12 06:01:15 2019][258255.070217] [] ofd_obd_connect+0x3a3/0x4c0 [ofd] [Thu Dec 12 06:01:15 2019][258255.076606] [] target_handle_connect+0xecb/0x2b10 [ptlrpc] [Thu Dec 12 06:01:15 2019][258255.083901] [] tgt_request_handle+0x50a/0x1580 [ptlrpc] [Thu Dec 12 06:01:15 2019][258255.090934] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:01:15 2019][258255.098763] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:01:15 2019][258255.105182] [] kthread+0xd1/0xe0 [Thu Dec 12 06:01:15 2019][258255.110195] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:01:15 2019][258255.116776] [] 0xffffffffffffffff [Thu Dec 12 06:01:15 2019][258255.121892] LustreError: dumping log to /tmp/lustre-log.1576159275.32917 [Thu Dec 12 06:01:41 2019][258281.762177] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124997 to 0x1900000400:1125025 [Thu Dec 12 06:01:49 2019][258289.642312] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1118054 to 0x1800000402:1118081 [Thu Dec 12 06:01:54 2019][258294.266426] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126500 to 0x1980000400:1126529 [Thu Dec 12 06:03:17 2019][258377.883423] Pid: 112541, comm: ll_ost01_088 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:03:17 2019][258377.894031] Call Trace: [Thu Dec 12 06:03:17 2019][258377.896603] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 06:03:17 2019][258377.903599] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 06:03:17 2019][258377.910787] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 06:03:17 2019][258377.917442] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 06:03:17 2019][258377.924190] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 06:03:17 2019][258377.931714] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 06:03:17 2019][258377.938810] [] dqget+0x3fa/0x450 [Thu Dec 12 06:03:17 2019][258377.943813] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 06:03:18 2019][258377.949587] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 06:03:18 2019][258377.957219] [] 
lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 06:03:18 2019][258377.963696] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 06:03:18 2019][258377.969836] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 06:03:18 2019][258377.976887] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:03:18 2019][258377.984702] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:03:18 2019][258377.991117] [] kthread+0xd1/0xe0 [Thu Dec 12 06:03:18 2019][258377.996132] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:03:18 2019][258378.002696] [] 0xffffffffffffffff [Thu Dec 12 06:03:18 2019][258378.007809] LustreError: dumping log to /tmp/lustre-log.1576159398.112541 [Thu Dec 12 06:03:46 2019][258406.083759] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071269 to 0x1800000400:3071297 [Thu Dec 12 06:04:11 2019][258431.390313] Lustre: fir-OST0054: Client 1f1945b8-54db-69d1-6e1e-25c0ee92d4cb (at 10.9.103.47@o2ib4) reconnecting [Thu Dec 12 06:04:11 2019][258431.400577] Lustre: Skipped 1521 previous similar messages [Thu Dec 12 06:05:00 2019][258480.222125] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1106217 to 0x1a00000402:1106241 [Thu Dec 12 06:05:09 2019][258489.213630] Lustre: 27223:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 06:05:09 2019][258489.213630] req@ffff8916cc359050 x1651840015315792/t0(0) o4->20841216-9d8b-7794-9459-ced18b617ae2@10.9.114.3@o2ib4:374/0 lens 488/0 e 0 to 0 dl 1576159514 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 06:05:09 2019][258489.242352] Lustre: 27223:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 3262 previous similar messages [Thu Dec 12 06:05:35 2019][258515.944870] Lustre: fir-OST0054: Connection restored to cded0104-b7e2-3351-ef3d-a03eb9e0010a (at 10.9.108.66@o2ib4) [Thu Dec 12 06:05:35 2019][258515.955404] Lustre: Skipped 1497 previous similar messages [Thu Dec 12 06:07:13 2019][258613.223861] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049539 to 0x1a00000401:3049569 [Thu Dec 12 06:07:52 2019][258652.284429] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082790 to 0x1a80000402:3082817 [Thu Dec 12 06:08:08 2019][258668.598882] Lustre: fir-OST0056: Export ffff8912408a3000 already connecting from 10.9.112.4@o2ib4 [Thu Dec 12 06:08:08 2019][258668.607877] Lustre: Skipped 59 previous similar messages [Thu Dec 12 06:08:33 2019][258693.689478] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090502 to 0x1980000402:3090529 [Thu Dec 12 06:09:18 2019][258738.338551] LNet: Service thread pid 32514 was inactive for 1202.30s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 06:09:18 2019][258738.355659] LNet: Skipped 6 previous similar messages [Thu Dec 12 06:09:18 2019][258738.360800] Pid: 32514, comm: ll_ost03_095 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:09:18 2019][258738.371340] Call Trace: [Thu Dec 12 06:09:18 2019][258738.373895] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 06:09:18 2019][258738.380277] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 06:09:18 2019][258738.387341] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:09:18 2019][258738.395142] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:09:18 2019][258738.401570] [] kthread+0xd1/0xe0 [Thu Dec 12 06:09:18 2019][258738.406573] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:09:18 2019][258738.413141] [] 0xffffffffffffffff [Thu Dec 12 06:09:18 2019][258738.418251] LustreError: dumping log to /tmp/lustre-log.1576159758.32514 [Thu Dec 12 06:09:30 2019][258750.626794] Pid: 33085, comm: ll_ost03_100 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:09:30 2019][258750.637311] Call Trace: [Thu Dec 12 06:09:30 2019][258750.639861] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 06:09:30 2019][258750.646859] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 06:09:30 2019][258750.654041] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 06:09:30 2019][258750.660684] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 06:09:30 2019][258750.667425] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 06:09:30 2019][258750.674951] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 06:09:30 2019][258750.682045] [] tgt_client_data_update+0x303/0x5e0 [ptlrpc] [Thu Dec 12 06:09:30 2019][258750.689338] [] tgt_client_new+0x41b/0x610 [ptlrpc] [Thu Dec 12 06:09:30 2019][258750.695936] [] ofd_obd_connect+0x3a3/0x4c0 [ofd] [Thu Dec 12 06:09:30 2019][258750.702328] [] target_handle_connect+0xecb/0x2b10 [ptlrpc] [Thu Dec 12 06:09:30 2019][258750.709613] [] tgt_request_handle+0x50a/0x1580 [ptlrpc] [Thu Dec 12 06:09:30 2019][258750.716654] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:09:30 2019][258750.724468] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:09:30 2019][258750.730886] [] kthread+0xd1/0xe0 [Thu Dec 12 06:09:30 2019][258750.735891] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:09:30 2019][258750.742446] [] 0xffffffffffffffff [Thu Dec 12 06:09:30 2019][258750.747551] LustreError: dumping log to /tmp/lustre-log.1576159770.33085 [Thu Dec 12 06:10:23 2019][258803.875861] Pid: 67636, comm: ll_ost01_019 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:10:23 2019][258803.886379] Call Trace: [Thu Dec 12 06:10:23 2019][258803.888940] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 06:10:23 2019][258803.895338] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 06:10:23 2019][258803.902402] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:10:23 2019][258803.910202] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:10:23 2019][258803.916631] [] kthread+0xd1/0xe0 [Thu Dec 12 06:10:23 2019][258803.921633] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:10:23 2019][258803.928208] [] 0xffffffffffffffff [Thu Dec 12 06:10:23 2019][258803.933317] LustreError: dumping log to /tmp/lustre-log.1576159823.67636 [Thu Dec 12 06:10:36 2019][258816.427865] Lustre: fir-OST0058: deleting orphan objects from 
0x1900000402:3086531 to 0x1900000402:3086561 [Thu Dec 12 06:11:55 2019][258895.674050] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126500 to 0x1980000400:1126561 [Thu Dec 12 06:12:33 2019][258933.935132] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122787 to 0x1a80000401:1122817 [Thu Dec 12 06:13:47 2019][259007.055663] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071300 to 0x1800000400:3071329 [Thu Dec 12 06:13:48 2019][259008.679904] Pid: 33111, comm: ll_ost01_103 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:13:48 2019][259008.690420] Call Trace: [Thu Dec 12 06:13:48 2019][259008.692988] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 06:13:48 2019][259008.699985] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 06:13:48 2019][259008.707173] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 06:13:48 2019][259008.713817] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 06:13:48 2019][259008.720568] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 06:13:48 2019][259008.728083] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 06:13:48 2019][259008.735178] [] dqget+0x3fa/0x450 [Thu Dec 12 06:13:48 2019][259008.740183] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 06:13:48 2019][259008.745969] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 06:13:48 2019][259008.753581] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 06:13:48 2019][259008.760070] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 06:13:48 2019][259008.766200] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 06:13:48 2019][259008.773259] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:13:48 2019][259008.781066] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:13:48 2019][259008.787507] [] kthread+0xd1/0xe0 [Thu Dec 12 06:13:48 2019][259008.792503] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:13:48 2019][259008.799079] [] 0xffffffffffffffff [Thu Dec 12 06:13:48 2019][259008.804180] LustreError: dumping log to /tmp/lustre-log.1576160028.33111 [Thu Dec 12 06:14:11 2019][259031.699781] Lustre: fir-OST0058: Client ec6f0728-3f9f-b5fd-43eb-c07bc3da43b2 (at 10.9.117.12@o2ib4) reconnecting [Thu Dec 12 06:14:11 2019][259031.710050] Lustre: Skipped 1496 previous similar messages [Thu Dec 12 06:14:17 2019][259037.329181] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1125030 to 0x1900000400:1125057 [Thu Dec 12 06:14:25 2019][259045.817318] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1118084 to 0x1800000402:1118113 [Thu Dec 12 06:15:01 2019][259081.874075] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1106243 to 0x1a00000402:1106273 [Thu Dec 12 06:15:09 2019][259089.621505] Lustre: 29439:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/-150), not sending early reply [Thu Dec 12 06:15:09 2019][259089.621505] req@ffff88eb1c5d1850 x1649447765080400/t0(0) o17->34b263e7-c235-6737-be01-1bc0ec67d622@10.9.117.33@o2ib4:219/0 lens 456/0 e 0 to 0 dl 1576160114 ref 2 fl New:/0/ffffffff rc 0/-1 [Thu Dec 12 06:15:09 2019][259089.650655] Lustre: 29439:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 3267 previous similar messages [Thu Dec 12 06:15:36 2019][259116.335315] Lustre: fir-OST005e: Connection restored to 4e97c29c-283b-4253-402d-db9d46beedd7 (at 10.9.101.39@o2ib4) [Thu Dec 12 06:15:36 2019][259116.345834] Lustre: Skipped 1509 previous similar messages [Thu Dec 12 06:15:59 
2019][259139.754503] Pid: 32958, comm: ll_ost01_101 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:15:59 2019][259139.765022] Call Trace: [Thu Dec 12 06:15:59 2019][259139.767610] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 06:15:59 2019][259139.774607] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 06:15:59 2019][259139.781790] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 06:15:59 2019][259139.788453] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 06:15:59 2019][259139.795205] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 06:15:59 2019][259139.802730] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 06:15:59 2019][259139.809825] [] dqget+0x3fa/0x450 [Thu Dec 12 06:15:59 2019][259139.814830] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 06:15:59 2019][259139.820609] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 06:15:59 2019][259139.828227] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 06:15:59 2019][259139.834702] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 06:15:59 2019][259139.840843] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 06:15:59 2019][259139.847894] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:15:59 2019][259139.855708] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:15:59 2019][259139.862138] [] kthread+0xd1/0xe0 [Thu Dec 12 06:15:59 2019][259139.867158] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:15:59 2019][259139.873719] [] 0xffffffffffffffff [Thu Dec 12 06:15:59 2019][259139.878833] LustreError: dumping log to /tmp/lustre-log.1576160159.32958 [Thu Dec 12 06:17:53 2019][259253.216257] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082823 to 0x1a80000402:3082849 [Thu Dec 12 06:18:19 2019][259279.028246] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 06:18:19 2019][259279.037210] Lustre: Skipped 47 previous similar messages [Thu Dec 12 06:18:35 2019][259295.125385] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090531 to 0x1980000402:3090561 [Thu Dec 12 06:19:20 2019][259340.462451] LNet: Service thread pid 32897 was inactive for 1202.40s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 06:19:20 2019][259340.479552] LNet: Skipped 4 previous similar messages [Thu Dec 12 06:19:20 2019][259340.484694] Pid: 32897, comm: ll_ost03_099 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:19:20 2019][259340.495231] Call Trace: [Thu Dec 12 06:19:20 2019][259340.497780] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 06:19:20 2019][259340.504172] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 06:19:20 2019][259340.511235] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:19:20 2019][259340.519038] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:19:20 2019][259340.525465] [] kthread+0xd1/0xe0 [Thu Dec 12 06:19:20 2019][259340.530470] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:19:20 2019][259340.537044] [] 0xffffffffffffffff [Thu Dec 12 06:19:20 2019][259340.542155] LustreError: dumping log to /tmp/lustre-log.1576160360.32897 [Thu Dec 12 06:19:49 2019][259369.142825] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049578 to 0x1a00000401:3049601 [Thu Dec 12 06:21:56 2019][259496.581907] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126500 to 0x1980000400:1126593 [Thu Dec 12 06:22:57 2019][259557.554758] Pid: 32877, comm: ll_ost01_098 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:22:57 2019][259557.565280] Call Trace: [Thu Dec 12 06:22:57 2019][259557.567832] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 06:22:57 2019][259557.574234] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 06:22:57 2019][259557.581312] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:22:57 2019][259557.589114] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:22:57 2019][259557.595541] [] kthread+0xd1/0xe0 [Thu Dec 12 06:22:57 2019][259557.600544] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:22:57 2019][259557.607121] [] 0xffffffffffffffff [Thu Dec 12 06:22:57 2019][259557.612231] LustreError: dumping log to /tmp/lustre-log.1576160577.32877 [Thu Dec 12 06:23:12 2019][259572.106814] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086572 to 0x1900000402:3086593 [Thu Dec 12 06:23:49 2019][259608.987557] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071333 to 0x1800000400:3071361 [Thu Dec 12 06:24:12 2019][259632.698181] Lustre: fir-OST0058: Client ec6f0728-3f9f-b5fd-43eb-c07bc3da43b2 (at 10.9.117.12@o2ib4) reconnecting [Thu Dec 12 06:24:12 2019][259632.708437] Lustre: Skipped 1492 previous similar messages [Thu Dec 12 06:25:02 2019][259682.629933] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1106276 to 0x1a00000402:1106305 [Thu Dec 12 06:25:09 2019][259689.667411] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 06:25:09 2019][259689.667411] req@ffff891091dac850 x1651792518523520/t0(0) o4->f01080a0-cc7c-da9c-568d-51eacd84f956@10.9.114.8@o2ib4:64/0 lens 7808/0 e 0 to 0 dl 1576160714 ref 2 fl New:H/2/ffffffff rc 0/-1 [Thu Dec 12 06:25:09 2019][259689.696243] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 3263 previous similar messages [Thu Dec 12 06:25:10 2019][259690.318084] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122819 to 0x1a80000401:1122849 [Thu Dec 12 06:25:38 2019][259718.012255] Lustre: fir-OST0058: Connection restored to c2d505e5-30ab-23ac-7017-17a83f00b35d (at 10.9.102.49@o2ib4) 
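
The watchdog dumps above keep repeating two stack signatures: service threads parked in ofd_create_hdl, and threads blocked in the jbd2 journal (wait_transaction_locked / start_this_handle) underneath the quota path (ldiskfs_acquire_dquot -> dqget -> dquot_get_dqblk -> ofd_quotactl). A minimal parsing sketch for tallying these events from a saved copy of this console capture follows; the file name "console.log" and the summarize() helper are illustrative assumptions, and the regular expressions only match the message formats visible in this log, nothing more.

#!/usr/bin/env python3
# Minimal sketch (assumptions: the console capture is saved as "console.log";
# the regexes follow the message formats visible in this log).
import re
from collections import Counter

WATCHDOG = re.compile(r"Service thread pid (\d+) was inactive for ([\d.]+)s")
TRACE_HDR = re.compile(r"Pid: (\d+), comm: (\S+)")
FRAME = re.compile(r"\[\] (\S+)\+0x")  # frames are printed here as "[] symbol+0x..."

def summarize(path="console.log"):
    # Collapse the arbitrary line wrapping so messages split across lines still match.
    text = re.sub(r"\s+", " ", open(path).read())
    hung = WATCHDOG.findall(text)              # (pid, seconds) pairs
    comms = dict(TRACE_HDR.findall(text))      # pid -> service thread name
    frames = Counter(FRAME.findall(text))      # how often each symbol appears in traces
    longest = max((float(s) for _, s in hung), default=0.0)
    print(f"{len(hung)} watchdog events, longest stall {longest:.0f}s, "
          f"{len(comms)} distinct threads dumped")
    for sym, n in frames.most_common(10):
        print(f"{n:6d}  {sym}")

if __name__ == "__main__":
    summarize()

Run against the capture, the symbol counts make it immediately visible whether the jbd2/quota path or the ofd_create_hdl path dominates the stalled threads.
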
[Thu Dec 12 06:25:38 2019][259718.022775] Lustre: Skipped 1509 previous similar messages [Thu Dec 12 06:26:53 2019][259793.264111] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1125063 to 0x1900000400:1125089 [Thu Dec 12 06:27:01 2019][259801.168248] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1118117 to 0x1800000402:1118145 [Thu Dec 12 06:27:54 2019][259854.180093] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082852 to 0x1a80000402:3082881 [Thu Dec 12 06:28:16 2019][259876.025064] Pid: 32985, comm: ll_ost01_102 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:28:16 2019][259876.035587] Call Trace: [Thu Dec 12 06:28:16 2019][259876.038165] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 06:28:16 2019][259876.045159] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 06:28:16 2019][259876.052347] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 06:28:16 2019][259876.058992] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 06:28:16 2019][259876.065741] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 06:28:16 2019][259876.073268] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 06:28:16 2019][259876.080363] [] dqget+0x3fa/0x450 [Thu Dec 12 06:28:16 2019][259876.085367] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 06:28:16 2019][259876.091152] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 06:28:16 2019][259876.098766] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 06:28:16 2019][259876.105254] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 06:28:16 2019][259876.111385] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 06:28:16 2019][259876.118444] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:28:16 2019][259876.126248] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:28:16 2019][259876.132676] [] kthread+0xd1/0xe0 [Thu Dec 12 06:28:16 2019][259876.137680] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:28:16 2019][259876.144266] [] 0xffffffffffffffff [Thu Dec 12 06:28:16 2019][259876.149364] LustreError: dumping log to /tmp/lustre-log.1576160896.32985 [Thu Dec 12 06:28:30 2019][259890.883604] Lustre: fir-OST0056: Export ffff88f4f5726000 already connecting from 10.8.7.5@o2ib6 [Thu Dec 12 06:28:30 2019][259890.892397] Lustre: Skipped 59 previous similar messages [Thu Dec 12 06:28:36 2019][259896.361272] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090568 to 0x1980000402:3090593 [Thu Dec 12 06:31:57 2019][260097.361839] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126598 to 0x1980000400:1126625 [Thu Dec 12 06:31:58 2019][260098.237469] LNet: Service thread pid 33368 was inactive for 1203.15s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 06:31:58 2019][260098.254576] LNet: Skipped 2 previous similar messages [Thu Dec 12 06:31:58 2019][260098.259727] Pid: 33368, comm: ll_ost03_103 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:31:58 2019][260098.270265] Call Trace: [Thu Dec 12 06:31:58 2019][260098.272821] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 06:31:58 2019][260098.279224] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 06:31:58 2019][260098.286285] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:31:58 2019][260098.294096] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:31:58 2019][260098.300526] [] kthread+0xd1/0xe0 [Thu Dec 12 06:31:58 2019][260098.305543] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:31:58 2019][260098.312120] [] 0xffffffffffffffff [Thu Dec 12 06:31:58 2019][260098.317240] LustreError: dumping log to /tmp/lustre-log.1576161118.33368 [Thu Dec 12 06:32:25 2019][260125.325839] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049604 to 0x1a00000401:3049633 [Thu Dec 12 06:33:49 2019][260209.191454] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071364 to 0x1800000400:3071393 [Thu Dec 12 06:34:12 2019][260232.885006] Lustre: fir-OST005c: Client d22f0531-865c-6a0a-5b19-ab2316a51d3c (at 10.9.106.13@o2ib4) reconnecting [Thu Dec 12 06:34:12 2019][260232.895271] Lustre: Skipped 1476 previous similar messages [Thu Dec 12 06:35:03 2019][260283.601864] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1106307 to 0x1a00000402:1106337 [Thu Dec 12 06:35:09 2019][260289.749270] Lustre: 29439:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/-150), not sending early reply [Thu Dec 12 06:35:09 2019][260289.749270] req@ffff88e80a35b050 x1649447765303904/t0(0) o17->34b263e7-c235-6737-be01-1bc0ec67d622@10.9.117.33@o2ib4:664/0 lens 456/0 e 0 to 0 dl 1576161314 ref 2 fl New:/0/ffffffff rc 0/-1 [Thu Dec 12 06:35:09 2019][260289.778430] Lustre: 29439:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 3304 previous similar messages [Thu Dec 12 06:35:35 2019][260315.329769] Pid: 33363, comm: ll_ost01_106 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:35:35 2019][260315.340284] Call Trace: [Thu Dec 12 06:35:35 2019][260315.342839] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 06:35:35 2019][260315.349237] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 06:35:35 2019][260315.356301] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:35:35 2019][260315.364101] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:35:35 2019][260315.370531] [] kthread+0xd1/0xe0 [Thu Dec 12 06:35:35 2019][260315.375534] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:35:35 2019][260315.382111] [] 0xffffffffffffffff [Thu Dec 12 06:35:35 2019][260315.387221] LustreError: dumping log to /tmp/lustre-log.1576161335.33363 [Thu Dec 12 06:35:39 2019][260319.146381] Lustre: fir-OST0054: Connection restored to ef2d362e-15f9-19b6-7dd8-07207d8adffe (at 10.9.103.50@o2ib4) [Thu Dec 12 06:35:39 2019][260319.156922] Lustre: Skipped 1509 previous similar messages [Thu Dec 12 06:35:48 2019][260328.009799] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086600 to 0x1900000402:3086625 [Thu Dec 12 06:37:46 2019][260446.157100] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122852 to 0x1a80000401:1122881 [Thu Dec 12 06:37:55 2019][260455.460321] 
Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082884 to 0x1a80000402:3082913 [Thu Dec 12 06:38:36 2019][260496.124239] Lustre: fir-OST0056: Export ffff8912f1e21000 already connecting from 10.8.25.17@o2ib6 [Thu Dec 12 06:38:36 2019][260496.133213] Lustre: Skipped 67 previous similar messages [Thu Dec 12 06:38:37 2019][260497.501195] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090596 to 0x1980000402:3090625 [Thu Dec 12 06:39:29 2019][260549.823153] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1125091 to 0x1900000400:1125121 [Thu Dec 12 06:39:37 2019][260557.151283] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1118148 to 0x1800000402:1118177 [Thu Dec 12 06:41:58 2019][260698.525861] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126630 to 0x1980000400:1126657 [Thu Dec 12 06:43:51 2019][260811.091393] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071401 to 0x1800000400:3071425 [Thu Dec 12 06:44:13 2019][260833.153805] Lustre: fir-OST0054: Client 553c79c8-d0b4-823c-4974-01da71803ed2 (at 10.9.116.9@o2ib4) reconnecting [Thu Dec 12 06:44:13 2019][260833.163988] Lustre: Skipped 1501 previous similar messages [Thu Dec 12 06:44:36 2019][260856.012480] LNet: Service thread pid 32280 was inactive for 1203.89s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 06:44:36 2019][260856.029593] LNet: Skipped 1 previous similar message [Thu Dec 12 06:44:36 2019][260856.034655] Pid: 32280, comm: ll_ost03_091 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:44:36 2019][260856.045195] Call Trace: [Thu Dec 12 06:44:36 2019][260856.047747] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 06:44:36 2019][260856.054141] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 06:44:36 2019][260856.061204] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:44:36 2019][260856.069021] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:44:36 2019][260856.075450] [] kthread+0xd1/0xe0 [Thu Dec 12 06:44:36 2019][260856.080455] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:44:36 2019][260856.087032] [] 0xffffffffffffffff [Thu Dec 12 06:44:36 2019][260856.092140] LustreError: dumping log to /tmp/lustre-log.1576161876.32280 [Thu Dec 12 06:45:01 2019][260881.084793] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049642 to 0x1a00000401:3049665 [Thu Dec 12 06:45:04 2019][260884.893786] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1106341 to 0x1a00000402:1106369 [Thu Dec 12 06:45:10 2019][260890.131167] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 06:45:10 2019][260890.131167] req@ffff8900d10f3050 x1649045828695760/t0(0) o4->341fa20c-48d3-b0f6-d4f7-bb2f25ae43ef@10.9.105.12@o2ib4:510/0 lens 11800/0 e 0 to 0 dl 1576161915 ref 2 fl New:H/2/ffffffff rc 0/-1 [Thu Dec 12 06:45:10 2019][260890.160256] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 3119 previous similar messages [Thu Dec 12 06:45:40 2019][260920.088029] Lustre: fir-OST0054: Connection restored to cded0104-b7e2-3351-ef3d-a03eb9e0010a (at 10.9.108.66@o2ib4) [Thu Dec 12 06:45:40 2019][260920.098547] Lustre: Skipped 1468 previous similar messages [Thu Dec 12 06:47:56 2019][261056.163928] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082922 to 0x1a80000402:3082945 
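
The other message that dominates this window is the per-OST orphan cleanup ("deleting orphan objects from <seq>:<first> to <seq>:<last>") that follows each reconnect cycle. A companion sketch for counting those cleanups per target and object sequence is below; as before, the capture file name and the tally() helper are assumptions, the regex is tied to the fir-OST message format shown in this log, and the span arithmetic only restates the ID range printed in each message.

#!/usr/bin/env python3
# Minimal sketch (assumptions: capture saved as "console.log"; regex tied to
# the fir-OST orphan-cleanup messages visible in this log).
import re
from collections import defaultdict

ORPHAN = re.compile(
    r"(fir-OST[0-9a-f]{4}): deleting orphan objects from "
    r"(0x[0-9a-f]+):(\d+) to 0x[0-9a-f]+:(\d+)")

def tally(path="console.log"):
    text = re.sub(r"\s+", " ", open(path).read())   # undo the line wrapping
    spans = defaultdict(list)                       # (ost, seq) -> [(first, last), ...]
    for ost, seq, first, last in ORPHAN.findall(text):
        spans[(ost, seq)].append((int(first), int(last)))
    for (ost, seq), ranges in sorted(spans.items()):
        ids = sum(last - first + 1 for first, last in ranges)
        print(f"{ost} {seq}: {len(ranges)} cleanups covering {ids} object IDs")

if __name__ == "__main__":
    tally()

Seeing the same sequences reappear for the same targets every few minutes (for example fir-OST005a on 0x1980000400 in the surrounding messages) lines up with the repeated client reconnects logged alongside them.
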
[Thu Dec 12 06:48:13 2019][261073.104777] Pid: 33603, comm: ll_ost01_108 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:48:13 2019][261073.115296] Call Trace: [Thu Dec 12 06:48:13 2019][261073.117848] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 06:48:13 2019][261073.124247] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 06:48:13 2019][261073.131312] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:48:13 2019][261073.139113] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:48:13 2019][261073.145555] [] kthread+0xd1/0xe0 [Thu Dec 12 06:48:13 2019][261073.150561] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:48:13 2019][261073.157137] [] 0xffffffffffffffff [Thu Dec 12 06:48:13 2019][261073.162255] LustreError: dumping log to /tmp/lustre-log.1576162093.33603 [Thu Dec 12 06:48:24 2019][261084.224805] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086630 to 0x1900000402:3086657 [Thu Dec 12 06:48:38 2019][261098.577113] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090637 to 0x1980000402:3090657 [Thu Dec 12 06:48:41 2019][261101.576577] Lustre: fir-OST0056: Export ffff8912f1e21000 already connecting from 10.8.25.17@o2ib6 [Thu Dec 12 06:48:41 2019][261101.585538] Lustre: Skipped 66 previous similar messages [Thu Dec 12 06:53:52 2019][261412.447311] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071433 to 0x1800000400:3071457 [Thu Dec 12 06:54:13 2019][261433.423105] Lustre: fir-OST0054: Client 6c8c6de8-60d9-5ab1-5f74-dc1e64ab5212 (at 10.8.30.9@o2ib6) reconnecting [Thu Dec 12 06:54:13 2019][261433.433210] Lustre: Skipped 1547 previous similar messages [Thu Dec 12 06:54:29 2019][261449.944241] Pid: 32729, comm: ll_ost01_097 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:54:29 2019][261449.954786] Call Trace: [Thu Dec 12 06:54:29 2019][261449.957371] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 06:54:29 2019][261449.964375] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 06:54:29 2019][261449.971584] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 06:54:29 2019][261449.978267] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 06:54:29 2019][261449.985016] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 06:54:29 2019][261449.992584] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 06:54:29 2019][261449.999681] [] dqget+0x3fa/0x450 [Thu Dec 12 06:54:29 2019][261450.004684] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 06:54:29 2019][261450.010465] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 06:54:30 2019][261450.018082] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 06:54:30 2019][261450.024557] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 06:54:30 2019][261450.030697] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 06:54:30 2019][261450.037764] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:54:30 2019][261450.045565] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:54:30 2019][261450.051978] [] kthread+0xd1/0xe0 [Thu Dec 12 06:54:30 2019][261450.056980] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:54:30 2019][261450.063556] [] 0xffffffffffffffff [Thu Dec 12 06:54:30 2019][261450.068656] LustreError: dumping log to /tmp/lustre-log.1576162470.32729 [Thu Dec 12 06:54:38 2019][261458.136416] LNet: Service thread pid 32245 was inactive for 1203.99s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 06:54:38 2019][261458.153524] LNet: Skipped 2 previous similar messages [Thu Dec 12 06:54:38 2019][261458.158670] Pid: 32245, comm: ll_ost03_089 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:54:38 2019][261458.169222] Call Trace: [Thu Dec 12 06:54:38 2019][261458.171773] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 06:54:38 2019][261458.178176] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 06:54:38 2019][261458.185227] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:54:38 2019][261458.193029] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:54:38 2019][261458.199460] [] kthread+0xd1/0xe0 [Thu Dec 12 06:54:38 2019][261458.204461] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:54:38 2019][261458.211038] [] 0xffffffffffffffff [Thu Dec 12 06:54:38 2019][261458.216146] LustreError: dumping log to /tmp/lustre-log.1576162478.32245 [Thu Dec 12 06:54:42 2019][261462.232495] Pid: 33852, comm: ll_ost01_110 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:54:42 2019][261462.243016] Call Trace: [Thu Dec 12 06:54:42 2019][261462.245574] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 06:54:42 2019][261462.252570] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 06:54:42 2019][261462.259753] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 06:54:42 2019][261462.266402] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 06:54:42 2019][261462.273153] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 06:54:42 2019][261462.280667] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 06:54:42 2019][261462.287764] [] dqget+0x3fa/0x450 [Thu Dec 12 06:54:42 2019][261462.292766] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 06:54:42 2019][261462.298552] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 06:54:42 2019][261462.306166] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 06:54:42 2019][261462.312671] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 06:54:42 2019][261462.318803] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 06:54:42 2019][261462.325847] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:54:42 2019][261462.333650] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:54:42 2019][261462.340078] [] kthread+0xd1/0xe0 [Thu Dec 12 06:54:42 2019][261462.345081] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:54:42 2019][261462.351656] [] 0xffffffffffffffff [Thu Dec 12 06:54:42 2019][261462.356748] LustreError: dumping log to /tmp/lustre-log.1576162482.33852 [Thu Dec 12 06:55:10 2019][261490.467054] Lustre: 31346:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/-150), not sending early reply [Thu Dec 12 06:55:10 2019][261490.467054] req@ffff890a40f29050 x1649447765532320/t0(0) o17->34b263e7-c235-6737-be01-1bc0ec67d622@10.9.117.33@o2ib4:355/0 lens 456/0 e 0 to 0 dl 1576162515 ref 2 fl New:/0/ffffffff rc 0/-1 [Thu Dec 12 06:55:10 2019][261490.496211] Lustre: 31346:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 3431 previous similar messages [Thu Dec 12 06:55:40 2019][261520.582993] Lustre: fir-OST005a: Connection restored to 8af5a0e2-950b-b38c-1f8d-81e50d22f3d1 (at 10.9.102.4@o2ib4) [Thu Dec 12 06:55:40 2019][261520.593438] Lustre: Skipped 1548 previous similar messages [Thu Dec 12 06:57:13 2019][261613.787487] Pid: 112560, comm: ll_ost01_092 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 
PST 2019 [Thu Dec 12 06:57:13 2019][261613.798092] Call Trace: [Thu Dec 12 06:57:13 2019][261613.800666] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 06:57:13 2019][261613.807658] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 06:57:13 2019][261613.814844] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 06:57:13 2019][261613.821509] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 06:57:13 2019][261613.828258] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 06:57:13 2019][261613.835775] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 06:57:13 2019][261613.842868] [] dqget+0x3fa/0x450 [Thu Dec 12 06:57:13 2019][261613.847872] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 06:57:13 2019][261613.853647] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 06:57:13 2019][261613.861265] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 06:57:13 2019][261613.867740] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 06:57:13 2019][261613.873879] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 06:57:13 2019][261613.880930] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:57:13 2019][261613.888758] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:57:13 2019][261613.895170] [] kthread+0xd1/0xe0 [Thu Dec 12 06:57:13 2019][261613.900170] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:57:13 2019][261613.906747] [] 0xffffffffffffffff [Thu Dec 12 06:57:13 2019][261613.911847] LustreError: dumping log to /tmp/lustre-log.1576162633.112560 [Thu Dec 12 06:57:21 2019][261621.979635] Pid: 33700, comm: ll_ost01_109 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 06:57:21 2019][261621.990149] Call Trace: [Thu Dec 12 06:57:21 2019][261621.992704] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 06:57:21 2019][261621.999699] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 06:57:22 2019][261622.006884] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 06:57:22 2019][261622.013547] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 06:57:22 2019][261622.020288] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 06:57:22 2019][261622.027806] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 06:57:22 2019][261622.034901] [] dqget+0x3fa/0x450 [Thu Dec 12 06:57:22 2019][261622.039904] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 06:57:22 2019][261622.045691] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 06:57:22 2019][261622.053303] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 06:57:22 2019][261622.059792] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 06:57:22 2019][261622.065921] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 06:57:22 2019][261622.072976] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 06:57:22 2019][261622.080793] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 06:57:22 2019][261622.087221] [] kthread+0xd1/0xe0 [Thu Dec 12 06:57:22 2019][261622.092226] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 06:57:22 2019][261622.098792] [] 0xffffffffffffffff [Thu Dec 12 06:57:22 2019][261622.103884] LustreError: dumping log to /tmp/lustre-log.1576162642.33700 [Thu Dec 12 06:57:37 2019][261637.219719] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049675 to 0x1a00000401:3049697 [Thu Dec 12 06:57:57 2019][261657.343817] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082953 to 0x1a80000402:3082977 [Thu Dec 12 06:58:40 2019][261700.581002] Lustre: fir-OST005a: 
deleting orphan objects from 0x1980000402:3090663 to 0x1980000402:3090689 [Thu Dec 12 06:58:46 2019][261706.915850] Lustre: fir-OST0056: Export ffff890661bf0c00 already connecting from 10.9.113.12@o2ib4 [Thu Dec 12 06:58:46 2019][261706.924895] Lustre: Skipped 18 previous similar messages [Thu Dec 12 07:00:50 2019][261830.879767] Pid: 33217, comm: ll_ost01_105 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 07:00:50 2019][261830.890288] Call Trace: [Thu Dec 12 07:00:50 2019][261830.892842] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 07:00:50 2019][261830.899240] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 07:00:50 2019][261830.906303] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 07:00:50 2019][261830.914103] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 07:00:50 2019][261830.920531] [] kthread+0xd1/0xe0 [Thu Dec 12 07:00:50 2019][261830.925534] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 07:00:50 2019][261830.932112] [] 0xffffffffffffffff [Thu Dec 12 07:00:50 2019][261830.937231] LustreError: dumping log to /tmp/lustre-log.1576162850.33217 [Thu Dec 12 07:01:00 2019][261840.143697] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086664 to 0x1900000402:3086689 [Thu Dec 12 07:03:53 2019][262013.107106] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071461 to 0x1800000400:3071489 [Thu Dec 12 07:04:13 2019][262033.981719] Lustre: fir-OST0058: Client f0c93108-cf16-4103-60a6-55732a55eba6 (at 10.8.30.11@o2ib6) reconnecting [Thu Dec 12 07:04:13 2019][262033.991901] Lustre: Skipped 1555 previous similar messages [Thu Dec 12 07:04:40 2019][262060.260308] LNet: Service thread pid 33123 was inactive for 1204.09s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 07:04:40 2019][262060.277415] LNet: Skipped 4 previous similar messages [Thu Dec 12 07:04:40 2019][262060.282565] Pid: 33123, comm: ll_ost03_102 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 07:04:40 2019][262060.293101] Call Trace: [Thu Dec 12 07:04:40 2019][262060.295660] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 07:04:40 2019][262060.302060] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 07:04:40 2019][262060.309135] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 07:04:40 2019][262060.316942] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 07:04:40 2019][262060.323370] [] kthread+0xd1/0xe0 [Thu Dec 12 07:04:40 2019][262060.328373] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 07:04:40 2019][262060.334949] [] 0xffffffffffffffff [Thu Dec 12 07:04:40 2019][262060.340058] LustreError: dumping log to /tmp/lustre-log.1576163080.33123 [Thu Dec 12 07:05:10 2019][262090.592897] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 07:05:10 2019][262090.592897] req@ffff88fb95758850 x1648391582529008/t0(0) o10->bb0489d8-99d9-bd6e-c7e4-6c2155fd6f79@10.8.23.36@o2ib6:200/0 lens 440/0 e 0 to 0 dl 1576163115 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 07:05:10 2019][262090.621704] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 3487 previous similar messages [Thu Dec 12 07:05:40 2019][262120.784650] Lustre: fir-OST0054: Connection restored to 41d92224-621a-6fa8-8ee5-70a524b04ee8 (at 10.9.117.6@o2ib4) [Thu Dec 12 07:05:40 2019][262120.795082] Lustre: Skipped 1554 previous similar messages [Thu Dec 12 07:06:39 2019][262179.046609] Pid: 67823, comm: ll_ost01_058 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 07:06:39 2019][262179.057128] Call Trace: [Thu Dec 12 07:06:39 2019][262179.059707] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 07:06:39 2019][262179.066704] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 07:06:39 2019][262179.073890] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 07:06:39 2019][262179.080537] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 07:06:39 2019][262179.087286] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 07:06:39 2019][262179.094812] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 07:06:39 2019][262179.101915] [] dqget+0x3fa/0x450 [Thu Dec 12 07:06:39 2019][262179.106933] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 07:06:39 2019][262179.112721] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 07:06:39 2019][262179.120335] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 07:06:39 2019][262179.126823] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 07:06:39 2019][262179.132952] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 07:06:39 2019][262179.140014] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 07:06:39 2019][262179.147818] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 07:06:39 2019][262179.154255] [] kthread+0xd1/0xe0 [Thu Dec 12 07:06:39 2019][262179.159257] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 07:06:39 2019][262179.165832] [] 0xffffffffffffffff [Thu Dec 12 07:06:39 2019][262179.170950] LustreError: dumping log to /tmp/lustre-log.1576163199.67823 [Thu Dec 12 07:07:58 2019][262258.315609] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082989 to 0x1a80000402:3083009 [Thu 
Dec 12 07:08:41 2019][262301.904801] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090696 to 0x1980000402:3090721 [Thu Dec 12 07:09:30 2019][262350.719622] Lustre: fir-OST0056: Export ffff8912408a3000 already connecting from 10.9.112.4@o2ib4 [Thu Dec 12 07:09:30 2019][262350.728586] Lustre: Skipped 11 previous similar messages [Thu Dec 12 07:10:07 2019][262387.946751] Pid: 33791, comm: ll_ost03_105 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 07:10:07 2019][262387.957284] Call Trace: [Thu Dec 12 07:10:07 2019][262387.959862] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 07:10:07 2019][262387.966868] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 07:10:07 2019][262387.974054] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 07:10:07 2019][262387.980707] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 07:10:07 2019][262387.987457] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 07:10:07 2019][262387.994984] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 07:10:07 2019][262388.002085] [] dqget+0x3fa/0x450 [Thu Dec 12 07:10:07 2019][262388.007106] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 07:10:07 2019][262388.012889] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 07:10:07 2019][262388.020533] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 07:10:07 2019][262388.027015] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 07:10:08 2019][262388.033157] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 07:10:08 2019][262388.040205] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 07:10:08 2019][262388.048030] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 07:10:08 2019][262388.054445] [] kthread+0xd1/0xe0 [Thu Dec 12 07:10:08 2019][262388.059446] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 07:10:08 2019][262388.066023] [] 0xffffffffffffffff [Thu Dec 12 07:10:08 2019][262388.071122] LustreError: dumping log to /tmp/lustre-log.1576163408.33791 [Thu Dec 12 07:10:13 2019][262393.490606] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049708 to 0x1a00000401:3049729 [Thu Dec 12 07:10:20 2019][262400.234964] Pid: 33161, comm: ll_ost01_104 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 07:10:20 2019][262400.245477] Call Trace: [Thu Dec 12 07:10:20 2019][262400.248044] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 07:10:20 2019][262400.255035] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 07:10:20 2019][262400.262233] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 07:10:20 2019][262400.268876] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 07:10:20 2019][262400.275617] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 07:10:20 2019][262400.283133] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 07:10:20 2019][262400.290230] [] dqget+0x3fa/0x450 [Thu Dec 12 07:10:20 2019][262400.295232] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 07:10:20 2019][262400.301004] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 07:10:20 2019][262400.308630] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 07:10:20 2019][262400.315105] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 07:10:20 2019][262400.321248] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 07:10:20 2019][262400.328289] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 07:10:20 2019][262400.336103] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 07:10:20 
2019][262400.342518] [] kthread+0xd1/0xe0 [Thu Dec 12 07:10:20 2019][262400.347538] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 07:10:20 2019][262400.354088] [] 0xffffffffffffffff [Thu Dec 12 07:10:20 2019][262400.359195] LustreError: dumping log to /tmp/lustre-log.1576163420.33161 [Thu Dec 12 07:13:36 2019][262596.350622] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086697 to 0x1900000402:3086721 [Thu Dec 12 07:13:54 2019][262614.846989] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071495 to 0x1800000400:3071521 [Thu Dec 12 07:14:15 2019][262635.174559] Lustre: fir-OST005a: Client 4ae1953c-c5de-651a-c222-99cb1d82d019 (at 10.8.7.6@o2ib6) reconnecting [Thu Dec 12 07:14:15 2019][262635.184565] Lustre: Skipped 1599 previous similar messages [Thu Dec 12 07:14:42 2019][262662.384148] LNet: Service thread pid 34019 was inactive for 1204.19s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 07:14:42 2019][262662.401253] LNet: Skipped 3 previous similar messages [Thu Dec 12 07:14:42 2019][262662.406404] Pid: 34019, comm: ll_ost03_107 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 07:14:42 2019][262662.416940] Call Trace: [Thu Dec 12 07:14:42 2019][262662.419498] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 07:14:42 2019][262662.425897] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 07:14:42 2019][262662.432961] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 07:14:42 2019][262662.440762] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 07:14:42 2019][262662.447191] [] kthread+0xd1/0xe0 [Thu Dec 12 07:14:42 2019][262662.452195] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 07:14:42 2019][262662.458770] [] 0xffffffffffffffff [Thu Dec 12 07:14:42 2019][262662.463877] LustreError: dumping log to /tmp/lustre-log.1576163682.34019 [Thu Dec 12 07:15:10 2019][262690.816709] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 07:15:10 2019][262690.816709] req@ffff88fd4d87b850 x1649047542081856/t0(0) o4->eeedb4d1-a88f-91d4-b517-9794013a9735@10.9.101.23@o2ib4:45/0 lens 8632/0 e 0 to 0 dl 1576163715 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 07:15:10 2019][262690.845522] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 3600 previous similar messages [Thu Dec 12 07:15:40 2019][262720.974042] Lustre: fir-OST0058: Connection restored to 0c302cf4-1147-d945-dfa2-e9bc796b3175 (at 10.9.101.32@o2ib4) [Thu Dec 12 07:15:40 2019][262720.984568] Lustre: Skipped 1631 previous similar messages [Thu Dec 12 07:17:59 2019][262859.687585] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3083011 to 0x1a80000402:3083041 [Thu Dec 12 07:18:42 2019][262902.636753] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090728 to 0x1980000402:3090753 [Thu Dec 12 07:19:42 2019][262962.699475] Lustre: fir-OST0056: Export ffff8922dee96000 already connecting from 10.9.116.8@o2ib4 [Thu Dec 12 07:19:42 2019][262962.708441] Lustre: Skipped 10 previous similar messages [Thu Dec 12 07:22:49 2019][263149.665690] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049733 to 0x1a00000401:3049761 [Thu Dec 12 07:23:55 2019][263215.322973] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071525 to 0x1800000400:3071553 [Thu Dec 12 07:24:16 2019][263236.718264] Lustre: fir-OST005e: Client 
af0b82f1-b12f-53c0-f83a-3232c3516fc9 (at 10.9.117.18@o2ib4) reconnecting [Thu Dec 12 07:24:16 2019][263236.728524] Lustre: Skipped 1588 previous similar messages [Thu Dec 12 07:24:40 2019][263260.412075] Pid: 33538, comm: ll_ost03_104 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 07:24:40 2019][263260.422598] Call Trace: [Thu Dec 12 07:24:40 2019][263260.425149] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 07:24:40 2019][263260.431552] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 07:24:40 2019][263260.438611] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 07:24:40 2019][263260.446412] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 07:24:40 2019][263260.452844] [] kthread+0xd1/0xe0 [Thu Dec 12 07:24:40 2019][263260.457846] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 07:24:40 2019][263260.464420] [] 0xffffffffffffffff [Thu Dec 12 07:24:40 2019][263260.469530] LustreError: dumping log to /tmp/lustre-log.1576164280.33538 [Thu Dec 12 07:25:11 2019][263291.084707] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 07:25:11 2019][263291.084707] req@ffff88ffd6c21850 x1649047542536512/t0(0) o4->eeedb4d1-a88f-91d4-b517-9794013a9735@10.9.101.23@o2ib4:646/0 lens 488/0 e 0 to 0 dl 1576164316 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 07:25:11 2019][263291.113510] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 3783 previous similar messages [Thu Dec 12 07:25:41 2019][263321.242984] Lustre: fir-OST0058: Connection restored to 0c302cf4-1147-d945-dfa2-e9bc796b3175 (at 10.9.101.32@o2ib4) [Thu Dec 12 07:25:41 2019][263321.253504] Lustre: Skipped 1638 previous similar messages [Thu Dec 12 07:26:12 2019][263352.373715] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086729 to 0x1900000402:3086753 [Thu Dec 12 07:28:00 2019][263460.315550] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3083046 to 0x1a80000402:3083073 [Thu Dec 12 07:28:43 2019][263503.608742] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090755 to 0x1980000402:3090785 [Thu Dec 12 07:29:49 2019][263569.480838] Lustre: fir-OST0056: Export ffff890661bf0c00 already connecting from 10.9.113.12@o2ib4 [Thu Dec 12 07:29:49 2019][263569.489891] Lustre: Skipped 11 previous similar messages [Thu Dec 12 07:33:56 2019][263816.334925] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071556 to 0x1800000400:3071585 [Thu Dec 12 07:34:17 2019][263837.823815] Lustre: fir-OST0054: Client 6c8c6de8-60d9-5ab1-5f74-dc1e64ab5212 (at 10.8.30.9@o2ib6) reconnecting [Thu Dec 12 07:34:17 2019][263837.833909] Lustre: Skipped 1651 previous similar messages [Thu Dec 12 07:35:11 2019][263891.110578] Lustre: 27223:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 07:35:11 2019][263891.110578] req@ffff8914fc6b7850 x1649315416807712/t0(0) o4->25827f45-931d-eb26-8907-81e567064f86@10.9.106.11@o2ib4:491/0 lens 488/0 e 1 to 0 dl 1576164916 ref 2 fl New:/0/ffffffff rc 0/-1 [Thu Dec 12 07:35:11 2019][263891.139386] Lustre: 27223:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 3666 previous similar messages [Thu Dec 12 07:35:25 2019][263905.168644] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049769 to 0x1a00000401:3049793 [Thu Dec 12 07:35:41 2019][263921.465131] Lustre: fir-OST005c: Connection restored to 592dade7-9f13-e25c-8636-229aef409ea9 (at 
10.9.108.38@o2ib4) [Thu Dec 12 07:35:41 2019][263921.475654] Lustre: Skipped 1598 previous similar messages [Thu Dec 12 07:37:18 2019][264018.187097] LNet: Service thread pid 33911 was inactive for 1200.94s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 07:37:18 2019][264018.204204] LNet: Skipped 1 previous similar message [Thu Dec 12 07:37:18 2019][264018.209269] Pid: 33911, comm: ll_ost03_106 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 07:37:18 2019][264018.219804] Call Trace: [Thu Dec 12 07:37:18 2019][264018.222354] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 07:37:18 2019][264018.228754] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 07:37:18 2019][264018.235831] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 07:37:18 2019][264018.243637] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 07:37:18 2019][264018.250067] [] kthread+0xd1/0xe0 [Thu Dec 12 07:37:18 2019][264018.255066] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 07:37:18 2019][264018.261642] [] 0xffffffffffffffff [Thu Dec 12 07:37:18 2019][264018.266754] LustreError: dumping log to /tmp/lustre-log.1576165038.33911 [Thu Dec 12 07:39:06 2019][264126.221225] LNet: Service thread pid 34617 was inactive for 200.36s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 07:39:06 2019][264126.238268] Pid: 34617, comm: ll_ost03_111 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 07:39:06 2019][264126.248785] Call Trace: [Thu Dec 12 07:39:06 2019][264126.251360] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 07:39:06 2019][264126.258376] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 07:39:06 2019][264126.265544] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 07:39:06 2019][264126.272211] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 07:39:06 2019][264126.278951] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 07:39:06 2019][264126.286491] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 07:39:06 2019][264126.293573] [] dqget+0x3fa/0x450 [Thu Dec 12 07:39:06 2019][264126.298588] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 07:39:06 2019][264126.304371] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 07:39:06 2019][264126.311997] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 07:39:06 2019][264126.318488] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 07:39:06 2019][264126.324630] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 07:39:06 2019][264126.331681] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 07:39:06 2019][264126.339496] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 07:39:06 2019][264126.345912] [] kthread+0xd1/0xe0 [Thu Dec 12 07:39:06 2019][264126.350926] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 07:39:06 2019][264126.357490] [] 0xffffffffffffffff [Thu Dec 12 07:39:06 2019][264126.362604] LustreError: dumping log to /tmp/lustre-log.1576165146.34617 [Thu Dec 12 07:39:16 2019][264136.461417] Pid: 34429, comm: ll_ost03_110 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 07:39:16 2019][264136.471940] Call Trace: [Thu Dec 12 07:39:16 2019][264136.474507] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 07:39:16 2019][264136.481514] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 07:39:16 
2019][264136.488699] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 07:39:16 2019][264136.495351] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 07:39:16 2019][264136.502103] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 07:39:16 2019][264136.509629] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 07:39:16 2019][264136.516725] [] dqget+0x3fa/0x450 [Thu Dec 12 07:39:16 2019][264136.521728] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 07:39:16 2019][264136.527514] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 07:39:16 2019][264136.535127] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 07:39:16 2019][264136.541618] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 07:39:16 2019][264136.547746] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 07:39:16 2019][264136.554806] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 07:39:16 2019][264136.562610] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 07:39:16 2019][264136.569039] [] kthread+0xd1/0xe0 [Thu Dec 12 07:39:16 2019][264136.574042] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 07:39:16 2019][264136.580632] [] 0xffffffffffffffff [Thu Dec 12 07:39:16 2019][264136.585737] LustreError: dumping log to /tmp/lustre-log.1576165156.34429 [Thu Dec 12 07:42:49 2019][264349.590511] SysRq : Trigger a crash [Thu Dec 12 07:42:49 2019][264349.594172] BUG: unable to handle kernel NULL pointer dereference at (null) [Thu Dec 12 07:42:49 2019][264349.602142] IP: [] sysrq_handle_crash+0x16/0x20 [Thu Dec 12 07:42:49 2019][264349.608357] PGD 3344b32067 PUD 31426f2067 PMD 0 [Thu Dec 12 07:42:49 2019][264349.613141] Oops: 0002 [#1] SMP [Thu Dec 12 07:42:49 2019][264349.616515] Modules linked in: osp(OE) ofd(OE) lfsck(OE) ost(OE) mgc(OE) osd_ldiskfs(OE) lquota(OE) raid456 async_raid6_recov async_memcpy async_pq raid6_pq libcrc32c async_xor xor async_tx ldiskfs(OE) lmv(OE) osc(OE) lov(OE) fid(OE) fld(OE) ko2iblnd(OE) ptlrpc(OE) obdclass(OE) lnet(OE) libcfs(OE) rpcsec_gss_krb5 auth_rpcgss nfsv4 dns_resolver nfs lockd grace fscache rdma_ucm(OE) ib_ucm(OE) rdma_cm(OE) iw_cm(OE) ib_ipoib(OE) ib_cm(OE) ib_umad(OE) mlx4_en(OE) mlx4_ib(OE) mlx4_core(OE) dell_rbu sunrpc vfat fat dcdbas amd64_edac_mod edac_mce_amd kvm_amd kvm irqbypass crc32_pclmul ghash_clmulni_intel aesni_intel lrw gf128mul glue_helper ablk_helper cryptd pcspkr dm_service_time ses enclosure dm_multipath dm_mod ipmi_si ipmi_devintf sg ccp ipmi_msghandler acpi_power_meter k10temp i2c_piix4 ip_tables ext4 mbcache jbd2 sd_mod crc_t10dif crct10dif_generic mlx5_ib(OE) ib_uverbs(OE) ib_core(OE) i2c_algo_bit drm_kms_helper mlx5_core(OE) syscopyarea sysfillrect sysimgblt mlxfw(OE) fb_sys_fops ahci devlink ttm libahci mpt3sas(OE) mlx_compat(OE) crct10dif_pclmul tg3 raid_class crct10dif_common drm libata crc32c_intel ptp megaraid_sas scsi_transport_sas drm_panel_orientation_quirks pps_core [last unloaded: mdc] [Thu Dec 12 07:42:49 2019][264349.725354] CPU: 21 PID: 35083 Comm: bash Kdump: loaded Tainted: G OE ------------ 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 [Thu Dec 12 07:42:49 2019][264349.737684] Hardware name: Dell Inc. 
PowerEdge R6415/07YXFK, BIOS 1.10.6 08/15/2019 [Thu Dec 12 07:42:49 2019][264349.745424] task: ffff892252d72080 ti: ffff890e9b504000 task.ti: ffff890e9b504000 [Thu Dec 12 07:42:49 2019][264349.752991] RIP: 0010:[] [] sysrq_handle_crash+0x16/0x20 [Thu Dec 12 07:42:49 2019][264349.761623] RSP: 0018:ffff890e9b507e58 EFLAGS: 00010246 [Thu Dec 12 07:42:49 2019][264349.767020] RAX: ffffffffa7e64430 RBX: ffffffffa86e4f80 RCX: 0000000000000000 [Thu Dec 12 07:42:49 2019][264349.774240] RDX: 0000000000000000 RSI: ffff8902ff753898 RDI: 0000000000000063 [Thu Dec 12 07:42:49 2019][264349.781461] RBP: ffff890e9b507e58 R08: ffffffffa89e38bc R09: ffffffffa8a0e7d7 [Thu Dec 12 07:42:49 2019][264349.788681] R10: 00000000000032da R11: 00000000000032d9 R12: 0000000000000063 [Thu Dec 12 07:42:49 2019][264349.795900] R13: 0000000000000000 R14: 0000000000000007 R15: 0000000000000000 [Thu Dec 12 07:42:49 2019][264349.803120] FS: 00007fe662f8d740(0000) GS:ffff8902ff740000(0000) knlGS:0000000000000000 [Thu Dec 12 07:42:49 2019][264349.811291] CS: 0010 DS: 0000 ES: 0000 CR0: 0000000080050033 [Thu Dec 12 07:42:49 2019][264349.817124] CR2: 0000000000000000 CR3: 0000003d0c2a0000 CR4: 00000000003407e0 [Thu Dec 12 07:42:49 2019][264349.824344] Call Trace: [Thu Dec 12 07:42:49 2019][264349.826888] [] __handle_sysrq+0x10d/0x170 [Thu Dec 12 07:42:49 2019][264349.832630] [] write_sysrq_trigger+0x28/0x40 [Thu Dec 12 07:42:49 2019][264349.838640] [] proc_reg_write+0x40/0x80 [Thu Dec 12 07:42:49 2019][264349.844210] [] vfs_write+0xc0/0x1f0 [Thu Dec 12 07:42:49 2019][264349.849436] [] SyS_write+0x7f/0xf0 [Thu Dec 12 07:42:49 2019][264349.854578] [] system_call_fastpath+0x22/0x27 [Thu Dec 12 07:42:49 2019][264349.860667] Code: eb 9b 45 01 f4 45 39 65 34 75 e5 4c 89 ef e8 e2 f7 ff ff eb db 66 66 66 66 90 55 48 89 e5 c7 05 91 31 7e 00 01 00 00 00 0f ae f8 04 25 00 00 00 00 01 5d c3 66 66 66 66 90 55 31 c0 c7 05 0e [Thu Dec 12 07:42:49 2019][264349.881276] RIP [] sysrq_handle_crash+0x16/0x20 [Thu Dec 12 07:42:49 2019][264349.887568] RSP [Thu Dec 12 07:42:49 2019][264349.891147] CR2: 0000000000000000 [Thu Dec 12 07:42:50 2019][ 0.000000] Initializing cgroup subsys cpuset [Thu Dec 12 07:42:50 2019][ 0.000000] Initializing cgroup subsys cpu [Thu Dec 12 07:42:50 2019][ 0.000000] Initializing cgroup subsys cpuacct [Thu Dec 12 07:42:50 2019][ 0.000000] Linux version 3.10.0-957.27.2.el7_lustre.pl2.x86_64 (sthiell@oak-rbh01) (gcc version 4.8.5 20150623 (Red Hat 4.8.5-39) (GCC) ) #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 07:42:50 2019][ 0.000000] Command line: BOOT_IMAGE=/boot/vmlinuz-3.10.0-957.27.2.el7_lustre.pl2.x86_64 ro nomodeset console=ttyS0,115200 LANG=en_US.UTF-8 irqpoll nr_cpus=1 reset_devices cgroup_disable=memory mce=off numa=off udev.children-max=2 panic=10 rootflags=nofail acpi_no_memhotplug transparent_hugepage=never nokaslr rd.driver.blacklist=mlx5_core,mpt3sas disable_cpu_apicid=0 elfcorehdr=900536K [Thu Dec 12 07:42:50 2019][ 0.000000] e820: BIOS-provided physical RAM map: [Thu Dec 12 07:42:50 2019][ 0.000000] BIOS-e820: [mem 0x0000000000000000-0x0000000000000fff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] BIOS-e820: [mem 0x0000000000001000-0x000000000008efff] usable [Thu Dec 12 07:42:50 2019][ 0.000000] BIOS-e820: [mem 0x000000000008f000-0x000000000008ffff] ACPI NVS [Thu Dec 12 07:42:50 2019][ 0.000000] BIOS-e820: [mem 0x000000002c000000-0x0000000036f6dfff] usable [Thu Dec 12 07:42:50 2019][ 0.000000] BIOS-e820: [mem 0x000000004f774000-0x000000005777cfff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] BIOS-e820: 
[mem 0x000000006cacf000-0x000000006efcefff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] BIOS-e820: [mem 0x000000006efcf000-0x000000006fdfefff] ACPI NVS [Thu Dec 12 07:42:50 2019][ 0.000000] BIOS-e820: [mem 0x000000006fdff000-0x000000006fffefff] ACPI data [Thu Dec 12 07:42:50 2019][ 0.000000] BIOS-e820: [mem 0x0000000070000000-0x000000008fffffff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] BIOS-e820: [mem 0x00000000fec10000-0x00000000fec10fff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] BIOS-e820: [mem 0x00000000fed80000-0x00000000fed80fff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] BIOS-e820: [mem 0x000000107f380000-0x000000107fffffff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] BIOS-e820: [mem 0x000000207ff80000-0x000000207fffffff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] BIOS-e820: [mem 0x000000307ff80000-0x000000307fffffff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] BIOS-e820: [mem 0x000000407ff80000-0x000000407fffffff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] NX (Execute Disable) protection: active [Thu Dec 12 07:42:50 2019][ 0.000000] extended physical RAM map: [Thu Dec 12 07:42:50 2019][ 0.000000] reserve setup_data: [mem 0x0000000000000000-0x0000000000000fff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] reserve setup_data: [mem 0x0000000000001000-0x000000000008efff] usable [Thu Dec 12 07:42:50 2019][ 0.000000] reserve setup_data: [mem 0x000000000008f000-0x000000000008ffff] ACPI NVS [Thu Dec 12 07:42:50 2019][ 0.000000] reserve setup_data: [mem 0x000000002c000000-0x000000002c00006f] usable [Thu Dec 12 07:42:50 2019][ 0.000000] reserve setup_data: [mem 0x000000002c000070-0x0000000036f6dfff] usable [Thu Dec 12 07:42:50 2019][ 0.000000] reserve setup_data: [mem 0x000000004f774000-0x000000005777cfff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] reserve setup_data: [mem 0x000000006cacf000-0x000000006efcefff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] reserve setup_data: [mem 0x000000006efcf000-0x000000006fdfefff] ACPI NVS [Thu Dec 12 07:42:50 2019][ 0.000000] reserve setup_data: [mem 0x000000006fdff000-0x000000006fffefff] ACPI data [Thu Dec 12 07:42:50 2019][ 0.000000] reserve setup_data: [mem 0x0000000070000000-0x000000008fffffff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] reserve setup_data: [mem 0x00000000fec10000-0x00000000fec10fff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] reserve setup_data: [mem 0x00000000fed80000-0x00000000fed80fff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] reserve setup_data: [mem 0x000000107f380000-0x000000107fffffff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] reserve setup_data: [mem 0x000000207ff80000-0x000000207fffffff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] reserve setup_data: [mem 0x000000307ff80000-0x000000307fffffff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] reserve setup_data: [mem 0x000000407ff80000-0x000000407fffffff] reserved [Thu Dec 12 07:42:50 2019][ 0.000000] efi: EFI v2.50 by Dell Inc. 
[Thu Dec 12 07:42:50 2019][ 0.000000] efi: ACPI=0x6fffe000 ACPI 2.0=0x6fffe014 SMBIOS=0x6eab5000 SMBIOS 3.0=0x6eab3000 [Thu Dec 12 07:42:50 2019][ 0.000000] efi: mem00: type=6, attr=0x800000000000000f, range=[0x000000006cacf000-0x000000006cbcf000) (1MB) [Thu Dec 12 07:42:50 2019][ 0.000000] efi: mem01: type=5, attr=0x800000000000000f, range=[0x000000006cbcf000-0x000000006cdcf000) (2MB) [Thu Dec 12 07:42:50 2019][ 0.000000] efi: mem02: type=11, attr=0x800000000000000f, range=[0x0000000080000000-0x0000000090000000) (256MB) [Thu Dec 12 07:42:50 2019][ 0.000000] efi: mem03: type=11, attr=0x800000000000000f, range=[0x00000000fec10000-0x00000000fec11000) (0MB) [Thu Dec 12 07:42:50 2019][ 0.000000] efi: mem04: type=11, attr=0x800000000000000f, range=[0x00000000fed80000-0x00000000fed81000) (0MB) [Thu Dec 12 07:42:50 2019][ 0.000000] SMBIOS 3.2.0 present. [Thu Dec 12 07:42:50 2019][ 0.000000] DMI: Dell Inc. PowerEdge R6415/07YXFK, BIOS 1.10.6 08/15/2019 [Thu Dec 12 07:42:50 2019][ 0.000000] e820: last_pfn = 0x36f6e max_arch_pfn = 0x400000000 [Thu Dec 12 07:42:50 2019][ 0.000000] PAT configuration [0-7]: WB WC UC- UC WB WP UC- UC [Thu Dec 12 07:42:50 2019][ 0.000000] Using GB pages for direct mapping [Thu Dec 12 07:42:50 2019][ 0.000000] RAMDISK: [mem 0x34c17000-0x357fffff] [Thu Dec 12 07:42:50 2019][ 0.000000] Early table checksum verification disabled [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: RSDP 000000006fffe014 00024 (v02 DELL ) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: XSDT 000000006fffd0e8 000AC (v01 DELL PE_SC3 00000002 DELL 00000001) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: FACP 000000006fff0000 00114 (v06 DELL PE_SC3 00000002 DELL 00000001) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: DSDT 000000006ffdc000 1038C (v02 DELL PE_SC3 00000002 DELL 00000001) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: FACS 000000006fdd3000 00040 [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: SSDT 000000006fffc000 000D2 (v02 DELL PE_SC3 00000002 MSFT 04000000) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: BERT 000000006fffb000 00030 (v01 DELL BERT 00000001 DELL 00000001) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: HEST 000000006fffa000 006DC (v01 DELL HEST 00000001 DELL 00000001) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: SSDT 000000006fff9000 00294 (v01 DELL PE_SC3 00000001 AMD 00000001) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: SRAT 000000006fff8000 00420 (v03 DELL PE_SC3 00000001 AMD 00000001) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: MSCT 000000006fff7000 0004E (v01 DELL PE_SC3 00000000 AMD 00000001) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: SLIT 000000006fff6000 0003C (v01 DELL PE_SC3 00000001 AMD 00000001) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: CRAT 000000006fff3000 02DC0 (v01 DELL PE_SC3 00000001 AMD 00000001) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: EINJ 000000006fff2000 00150 (v01 DELL PE_SC3 00000001 AMD 00000001) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: SLIC 000000006fff1000 00024 (v01 DELL PE_SC3 00000002 DELL 00000001) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: HPET 000000006ffef000 00038 (v01 DELL PE_SC3 00000002 DELL 00000001) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: APIC 000000006ffee000 004B2 (v03 DELL PE_SC3 00000002 DELL 00000001) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: MCFG 000000006ffed000 0003C (v01 DELL PE_SC3 00000002 DELL 00000001) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: SSDT 000000006ffdb000 00629 (v02 DELL xhc_port 00000001 INTL 20170119) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: IVRS 000000006ffda000 00210 (v02 DELL PE_SC3 00000001 AMD 00000000) [Thu 
Dec 12 07:42:50 2019][ 0.000000] ACPI: SSDT 000000006ffd8000 01658 (v01 AMD CPMCMN 00000001 INTL 20170119) [Thu Dec 12 07:42:50 2019][ 0.000000] NUMA turned off [Thu Dec 12 07:42:50 2019][ 0.000000] Faking a node at [mem 0x0000000000000000-0x0000000036f6dfff] [Thu Dec 12 07:42:50 2019][ 0.000000] NODE_DATA(0) allocated [mem 0x36f47000-0x36f6dfff] [Thu Dec 12 07:42:50 2019][ 0.000000] Zone ranges: [Thu Dec 12 07:42:50 2019][ 0.000000] DMA [mem 0x00001000-0x00ffffff] [Thu Dec 12 07:42:50 2019][ 0.000000] DMA32 [mem 0x01000000-0xffffffff] [Thu Dec 12 07:42:50 2019][ 0.000000] Normal empty [Thu Dec 12 07:42:50 2019][ 0.000000] Movable zone start for each node [Thu Dec 12 07:42:50 2019][ 0.000000] Early memory node ranges [Thu Dec 12 07:42:50 2019][ 0.000000] node 0: [mem 0x00001000-0x0008efff] [Thu Dec 12 07:42:50 2019][ 0.000000] node 0: [mem 0x2c000000-0x36f6dfff] [Thu Dec 12 07:42:50 2019][ 0.000000] Initmem setup node 0 [mem 0x00001000-0x36f6dfff] [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: PM-Timer IO Port: 0x408 [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x00] lapic_id[0x00] enabled) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: Disabling requested cpu. Processor 0/0x0 ignored. [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x01] lapic_id[0x10] enabled) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 1/0x10 ignored. [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x02] lapic_id[0x20] enabled) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 2/0x20 ignored. [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x03] lapic_id[0x30] enabled) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 3/0x30 ignored. [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x04] lapic_id[0x08] enabled) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 4/0x8 ignored. [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x05] lapic_id[0x18] enabled) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 5/0x18 ignored. [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x06] lapic_id[0x28] enabled) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 6/0x28 ignored. [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x07] lapic_id[0x38] enabled) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 7/0x38 ignored. [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x08] lapic_id[0x02] enabled) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 8/0x2 ignored. [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x09] lapic_id[0x12] enabled) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 9/0x12 ignored. [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0a] lapic_id[0x22] enabled) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. 
Keeping one slot for boot cpu. Processor 10/0x22 ignored. [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0b] lapic_id[0x32] enabled) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 11/0x32 ignored. [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0c] lapic_id[0x0a] enabled) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 12/0xa ignored. [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0d] lapic_id[0x1a] enabled) [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 13/0x1a ignored. [Thu Dec 12 07:42:50 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0e] lapic_id[0x2a] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 14/0x2a ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0f] lapic_id[0x3a] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 15/0x3a ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x10] lapic_id[0x04] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 16/0x4 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x11] lapic_id[0x14] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 17/0x14 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x12] lapic_id[0x24] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 18/0x24 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x13] lapic_id[0x34] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 19/0x34 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x14] lapic_id[0x0c] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 almost reached. Keeping one slot for boot cpu. Processor 20/0xc ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x15] lapic_id[0x1c] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x16] lapic_id[0x2c] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 22/0x2c ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x17] lapic_id[0x3c] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 23/0x3c ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x18] lapic_id[0x01] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 24/0x1 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x19] lapic_id[0x11] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 25/0x11 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1a] lapic_id[0x21] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 26/0x21 ignored. 
[Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1b] lapic_id[0x31] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 27/0x31 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1c] lapic_id[0x09] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 28/0x9 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1d] lapic_id[0x19] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 29/0x19 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1e] lapic_id[0x29] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 30/0x29 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1f] lapic_id[0x39] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 31/0x39 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x20] lapic_id[0x03] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 32/0x3 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x21] lapic_id[0x13] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 33/0x13 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x22] lapic_id[0x23] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 34/0x23 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x23] lapic_id[0x33] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 35/0x33 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x24] lapic_id[0x0b] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 36/0xb ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x25] lapic_id[0x1b] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 37/0x1b ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x26] lapic_id[0x2b] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 38/0x2b ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x27] lapic_id[0x3b] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 39/0x3b ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x28] lapic_id[0x05] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 40/0x5 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x29] lapic_id[0x15] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 41/0x15 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2a] lapic_id[0x25] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 42/0x25 ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2b] lapic_id[0x35] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 43/0x35 ignored. 
[Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2c] lapic_id[0x0d] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 44/0xd ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2d] lapic_id[0x1d] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 45/0x1d ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2e] lapic_id[0x2d] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 46/0x2d ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2f] lapic_id[0x3d] enabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: NR_CPUS/possible_cpus limit of 1 reached. Processor 47/0x3d ignored. [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x30] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x31] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x32] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x33] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x34] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x35] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x36] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x37] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x38] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x39] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3a] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3b] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3c] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3d] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3e] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3f] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x40] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x41] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x42] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x43] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x44] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x45] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x46] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x47] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x48] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x49] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4a] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4b] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4c] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4d] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: 
LAPIC (acpi_id[0x4e] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4f] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x50] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x51] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x52] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x53] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x54] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x55] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x56] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x57] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x58] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x59] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5a] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5b] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5c] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5d] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5e] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5f] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x60] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x61] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x62] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x63] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x64] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x65] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x66] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x67] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x68] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x69] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6a] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6b] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6c] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6d] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6e] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6f] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x70] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x71] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x72] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x73] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x74] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x75] lapic_id[0x00] 
disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x76] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x77] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x78] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x79] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7a] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7b] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7c] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7d] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7e] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7f] lapic_id[0x00] disabled) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: LAPIC_NMI (acpi_id[0xff] high edge lint[0x1]) [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: IOAPIC (id[0x80] address[0xfec00000] gsi_base[0]) [Thu Dec 12 07:42:51 2019][ 0.000000] IOAPIC[0]: apic_id 128, version 33, address 0xfec00000, GSI 0-23 [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: IOAPIC (id[0x81] address[0xfd880000] gsi_base[24]) [Thu Dec 12 07:42:51 2019][ 0.000000] IOAPIC[1]: apic_id 129, version 33, address 0xfd880000, GSI 24-55 [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: IOAPIC (id[0x82] address[0xe0900000] gsi_base[56]) [Thu Dec 12 07:42:51 2019][ 0.000000] IOAPIC[2]: apic_id 130, version 33, address 0xe0900000, GSI 56-87 [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: IOAPIC (id[0x83] address[0xc5900000] gsi_base[88]) [Thu Dec 12 07:42:51 2019][ 0.000000] IOAPIC[3]: apic_id 131, version 33, address 0xc5900000, GSI 88-119 [Thu Dec 12 07:42:51 2019][ 0.000000] ACPI: IOAPIC (id[0x84] address[0xaa900000] gsi_base[120]) [Thu Dec 12 07:42:51 2019][ 0.000000] IOAPIC[4]: apic_id 132, version 33, address 0xaa900000, GSI 120-151 [Thu Dec 12 07:42:52 2019][ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) [Thu Dec 12 07:42:52 2019][ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 low level) [Thu Dec 12 07:42:52 2019][ 0.000000] Using ACPI (MADT) for SMP configuration information [Thu Dec 12 07:42:52 2019][ 0.000000] ACPI: HPET id: 0x10228201 base: 0xfed00000 [Thu Dec 12 07:42:52 2019][ 0.000000] smpboot: 128 Processors exceeds NR_CPUS limit of 1 [Thu Dec 12 07:42:52 2019][ 0.000000] smpboot: Allowing 1 CPUs, 0 hotplug CPUs [Thu Dec 12 07:42:52 2019][ 0.000000] PM: Registered nosave memory: [mem 0x0008f000-0x0008ffff] [Thu Dec 12 07:42:52 2019][ 0.000000] PM: Registered nosave memory: [mem 0x00090000-0x2bffffff] [Thu Dec 12 07:42:52 2019][ 0.000000] PM: Registered nosave memory: [mem 0x2c000000-0x2c000fff] [Thu Dec 12 07:42:52 2019][ 0.000000] e820: [mem 0x90000000-0xfec0ffff] available for PCI devices [Thu Dec 12 07:42:52 2019][ 0.000000] Booting paravirtualized kernel on bare hardware [Thu Dec 12 07:42:52 2019][ 0.000000] setup_percpu: NR_CPUS:5120 nr_cpumask_bits:1 nr_cpu_ids:1 nr_node_ids:1 [Thu Dec 12 07:42:52 2019][ 0.000000] PERCPU: Embedded 38 pages/cpu @ffff880036c00000 s118784 r8192 d28672 u2097152 [Thu Dec 12 07:42:52 2019][ 0.000000] Built 1 zonelists in Node order, mobility grouping on. 
Total pages: 44326 [Thu Dec 12 07:42:52 2019][ 0.000000] Policy zone: DMA32 [Thu Dec 12 07:42:52 2019][ 0.000000] Kernel command line: BOOT_IMAGE=/boot/vmlinuz-3.10.0-957.27.2.el7_lustre.pl2.x86_64 ro nomodeset console=ttyS0,115200 LANG=en_US.UTF-8 irqpoll nr_cpus=1 reset_devices cgroup_disable=memory mce=off numa=off udev.children-max=2 panic=10 rootflags=nofail acpi_no_memhotplug transparent_hugepage=never nokaslr rd.driver.blacklist=mlx5_core,mpt3sas disable_cpu_apicid=0 elfcorehdr=900536K [Thu Dec 12 07:42:52 2019][ 0.000000] Misrouted IRQ fixup and polling support enabled [Thu Dec 12 07:42:52 2019][ 0.000000] This may significantly impact system performance [Thu Dec 12 07:42:52 2019][ 0.000000] Disabling memory control group subsystem [Thu Dec 12 07:42:52 2019][ 0.000000] PID hash table entries: 1024 (order: 1, 8192 bytes) [Thu Dec 12 07:42:52 2019][ 0.000000] x86/fpu: xstate_offset[2]: 0240, xstate_sizes[2]: 0100 [Thu Dec 12 07:42:52 2019][ 0.000000] xsave: enabled xstate_bv 0x7, cntxt size 0x340 using standard form [Thu Dec 12 07:42:52 2019][ 0.000000] Memory: 142420k/900536k available (7676k kernel code, 720328k absent, 37788k reserved, 6045k data, 1876k init) [Thu Dec 12 07:42:52 2019][ 0.000000] SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=1, Nodes=1 [Thu Dec 12 07:42:52 2019][ 0.000000] Hierarchical RCU implementation. [Thu Dec 12 07:42:52 2019][ 0.000000] RCU restricting CPUs from NR_CPUS=5120 to nr_cpu_ids=1. [Thu Dec 12 07:42:52 2019][ 0.000000] NR_IRQS:327936 nr_irqs:256 0 [Thu Dec 12 07:42:52 2019][ 0.000000] Spurious LAPIC timer interrupt on cpu 0 [Thu Dec 12 07:42:52 2019][ 0.000000] Console: colour dummy device 80x25 [Thu Dec 12 07:42:52 2019][ 0.000000] console [ttyS0] enabled [Thu Dec 12 07:42:52 2019][ 0.000000] tsc: Fast TSC calibration using PIT [Thu Dec 12 07:42:52 2019][ 0.000000] tsc: Detected 1996.200 MHz processor [Thu Dec 12 07:42:52 2019][ 0.000021] Calibrating delay loop (skipped), value calculated using timer frequency.. 3992.40 BogoMIPS (lpj=1996200) [Thu Dec 12 07:42:52 2019][ 0.010668] pid_max: default: 32768 minimum: 301 [Thu Dec 12 07:42:52 2019][ 0.015811] Security Framework initialized [Thu Dec 12 07:42:52 2019][ 0.019927] SELinux: Initializing. [Thu Dec 12 07:42:52 2019][ 0.023443] Yama: becoming mindful. 
[Thu Dec 12 07:42:52 2019][ 0.026995] Dentry cache hash table entries: 32768 (order: 6, 262144 bytes) [Thu Dec 12 07:42:52 2019][ 0.034013] Inode-cache hash table entries: 16384 (order: 5, 131072 bytes) [Thu Dec 12 07:42:52 2019][ 0.040923] Mount-cache hash table entries: 512 (order: 0, 4096 bytes) [Thu Dec 12 07:42:52 2019][ 0.047459] Mountpoint-cache hash table entries: 512 (order: 0, 4096 bytes) [Thu Dec 12 07:42:52 2019][ 0.054588] Initializing cgroup subsys memory [Thu Dec 12 07:42:52 2019][ 0.058963] Initializing cgroup subsys devices [Thu Dec 12 07:42:52 2019][ 0.063424] Initializing cgroup subsys freezer [Thu Dec 12 07:42:52 2019][ 0.067887] Initializing cgroup subsys net_cls [Thu Dec 12 07:42:52 2019][ 0.072350] Initializing cgroup subsys blkio [Thu Dec 12 07:42:52 2019][ 0.076630] Initializing cgroup subsys perf_event [Thu Dec 12 07:42:52 2019][ 0.081346] Initializing cgroup subsys hugetlb [Thu Dec 12 07:42:52 2019][ 0.085808] Initializing cgroup subsys pids [Thu Dec 12 07:42:52 2019][ 0.090005] Initializing cgroup subsys net_prio [Thu Dec 12 07:42:52 2019][ 0.100221] Last level iTLB entries: 4KB 1024, 2MB 1024, 4MB 512 [Thu Dec 12 07:42:52 2019][ 0.106238] Last level dTLB entries: 4KB 1536, 2MB 1536, 4MB 768 [Thu Dec 12 07:42:52 2019][ 0.112259] tlb_flushall_shift: 6 [Thu Dec 12 07:42:52 2019][ 0.115598] Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl and seccomp [Thu Dec 12 07:42:52 2019][ 0.125174] FEATURE SPEC_CTRL Not Present [Thu Dec 12 07:42:52 2019][ 0.129195] FEATURE IBPB_SUPPORT Present [Thu Dec 12 07:42:52 2019][ 0.133132] Spectre V2 : Enabling Indirect Branch Prediction Barrier [Thu Dec 12 07:42:52 2019][ 0.139562] Spectre V2 : Mitigation: Full retpoline [Thu Dec 12 07:42:52 2019][ 0.149134] Freeing SMP alternatives: 28k freed [Thu Dec 12 07:42:52 2019][ 0.155492] ACPI: Core revision 20130517 [Thu Dec 12 07:42:52 2019][ 0.163835] ACPI: All ACPI Tables successfully acquired [Thu Dec 12 07:42:52 2019][ 0.169146] ftrace: allocating 29216 entries in 115 pages [Thu Dec 12 07:42:52 2019][ 0.201024] Translation is already enabled - trying to copy translation structures [Thu Dec 12 07:42:52 2019][ 0.209108] Copied DEV table from previous kernel. [Thu Dec 12 07:42:52 2019][ 0.521141] ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 [Thu Dec 12 07:42:52 2019][ 0.537152] smpboot: CPU0: AMD EPYC 7401P 24-Core Processor (fam: 17, model: 01, stepping: 02) [Thu Dec 12 07:42:52 2019][ 0.647635] Performance Events: Fam17h core perfctr, Broken BIOS detected, complain to your hardware vendor. [Thu Dec 12 07:42:52 2019][ 0.657533] [Firmware Bug]: the BIOS has corrupted hw-PMU resources (MSR c0010200 is 530076) [Thu Dec 12 07:42:52 2019][ 0.665966] AMD PMU driver. [Thu Dec 12 07:42:52 2019][ 0.668770] ... version: 0 [Thu Dec 12 07:42:52 2019][ 0.672777] ... bit width: 48 [Thu Dec 12 07:42:52 2019][ 0.676877] ... generic registers: 6 [Thu Dec 12 07:42:52 2019][ 0.680892] ... value mask: 0000ffffffffffff [Thu Dec 12 07:42:52 2019][ 0.686202] ... max period: 00007fffffffffff [Thu Dec 12 07:42:52 2019][ 0.691515] ... fixed-purpose events: 0 [Thu Dec 12 07:42:52 2019][ 0.695530] ... event mask: 000000000000003f [Thu Dec 12 07:42:52 2019][ 0.702081] Brought up 1 CPUs [Thu Dec 12 07:42:52 2019][ 0.705057] smpboot: Max logical packages: 1 [Thu Dec 12 07:42:52 2019][ 0.709327] smpboot: Total of 1 processors activated (3992.40 BogoMIPS) [Thu Dec 12 07:42:52 2019][ 0.716482] NMI watchdog: enabled on all CPUs, permanently consumes one hw-PMU counter. 
[Thu Dec 12 07:42:52 2019][ 0.724926] devtmpfs: initialized [Thu Dec 12 07:42:52 2019][ 0.730542] EVM: security.selinux [Thu Dec 12 07:42:52 2019][ 0.733862] EVM: security.ima [Thu Dec 12 07:42:53 2019][ 0.736836] EVM: security.capability [Thu Dec 12 07:42:53 2019][ 0.740524] PM: Registering ACPI NVS region [mem 0x0008f000-0x0008ffff] (4096 bytes) [Thu Dec 12 07:42:53 2019][ 0.748272] PM: Registering ACPI NVS region [mem 0x6efcf000-0x6fdfefff] (14876672 bytes) [Thu Dec 12 07:42:53 2019][ 0.757526] atomic64 test passed for x86-64 platform with CX8 and with SSE [Thu Dec 12 07:42:53 2019][ 0.764402] pinctrl core: initialized pinctrl subsystem [Thu Dec 12 07:42:53 2019][ 0.769697] RTC time: 15:42:52, date: 12/12/19 [Thu Dec 12 07:42:53 2019][ 0.774239] NET: Registered protocol family 16 [Thu Dec 12 07:42:53 2019][ 0.778956] ACPI FADT declares the system doesn't support PCIe ASPM, so disable it [Thu Dec 12 07:42:53 2019][ 0.786524] ACPI: bus type PCI registered [Thu Dec 12 07:42:53 2019][ 0.790536] acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 [Thu Dec 12 07:42:53 2019][ 0.797040] PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0x80000000-0x8fffffff] (base 0x80000000) [Thu Dec 12 07:42:53 2019][ 0.806337] PCI: MMCONFIG at [mem 0x80000000-0x8fffffff] reserved in E820 [Thu Dec 12 07:42:53 2019][ 0.813130] PCI: Using configuration type 1 for base access [Thu Dec 12 07:42:53 2019][ 0.818712] PCI: Dell System detected, enabling pci=bfsort. [Thu Dec 12 07:42:53 2019][ 0.825384] ACPI: Added _OSI(Module Device) [Thu Dec 12 07:42:53 2019][ 0.829575] ACPI: Added _OSI(Processor Device) [Thu Dec 12 07:42:53 2019][ 0.834018] ACPI: Added _OSI(3.0 _SCP Extensions) [Thu Dec 12 07:42:53 2019][ 0.838724] ACPI: Added _OSI(Processor Aggregator Device) [Thu Dec 12 07:42:53 2019][ 0.844128] ACPI: Added _OSI(Linux-Dell-Video) [Thu Dec 12 07:42:53 2019][ 0.850357] ACPI: Executed 2 blocks of module-level executable AML code [Thu Dec 12 07:42:53 2019][ 0.862294] ACPI: Interpreter enabled [Thu Dec 12 07:42:53 2019][ 0.865972] ACPI: (supports S0 S5) [Thu Dec 12 07:42:53 2019][ 0.869379] ACPI: Using IOAPIC for interrupt routing [Thu Dec 12 07:42:53 2019][ 0.874558] HEST: Table parsing has been initialized. 
[Thu Dec 12 07:42:53 2019][ 0.879618] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug [Thu Dec 12 07:42:53 2019][ 0.888762] ACPI: Enabled 1 GPEs in block 00 to 1F [Thu Dec 12 07:42:53 2019][ 0.900490] ACPI: PCI Interrupt Link [LNKA] (IRQs 4 5 7 10 11 14 15) *0 [Thu Dec 12 07:42:53 2019][ 0.907401] ACPI: PCI Interrupt Link [LNKB] (IRQs 4 5 7 10 11 14 15) *0 [Thu Dec 12 07:42:53 2019][ 0.914307] ACPI: PCI Interrupt Link [LNKC] (IRQs 4 5 7 10 11 14 15) *0 [Thu Dec 12 07:42:53 2019][ 0.921213] ACPI: PCI Interrupt Link [LNKD] (IRQs 4 5 7 10 11 14 15) *0 [Thu Dec 12 07:42:53 2019][ 0.928120] ACPI: PCI Interrupt Link [LNKE] (IRQs 4 5 7 10 11 14 15) *0 [Thu Dec 12 07:42:53 2019][ 0.935027] ACPI: PCI Interrupt Link [LNKF] (IRQs 4 5 7 10 11 14 15) *0 [Thu Dec 12 07:42:53 2019][ 0.941935] ACPI: PCI Interrupt Link [LNKG] (IRQs 4 5 7 10 11 14 15) *0 [Thu Dec 12 07:42:53 2019][ 0.948843] ACPI: PCI Interrupt Link [LNKH] (IRQs 4 5 7 10 11 14 15) *0 [Thu Dec 12 07:42:53 2019][ 0.955895] ACPI: PCI Root Bridge [PC00] (domain 0000 [bus 00-3f]) [Thu Dec 12 07:42:53 2019][ 0.962077] acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI] [Thu Dec 12 07:42:53 2019][ 0.970294] acpi PNP0A08:00: PCIe AER handled by firmware [Thu Dec 12 07:42:53 2019][ 0.975738] acpi PNP0A08:00: _OSC: platform does not support [SHPCHotplug] [Thu Dec 12 07:42:53 2019][ 0.982687] acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] [Thu Dec 12 07:42:53 2019][ 0.990343] acpi PNP0A08:00: FADT indicates ASPM is unsupported, using BIOS configuration [Thu Dec 12 07:42:53 2019][ 0.998799] PCI host bridge to bus 0000:00 [Thu Dec 12 07:42:53 2019][ 1.002904] pci_bus 0000:00: root bus resource [io 0x0000-0x03af window] [Thu Dec 12 07:42:53 2019][ 1.009686] pci_bus 0000:00: root bus resource [io 0x03e0-0x0cf7 window] [Thu Dec 12 07:42:53 2019][ 1.016473] pci_bus 0000:00: root bus resource [mem 0x000c0000-0x000c3fff window] [Thu Dec 12 07:42:53 2019][ 1.023952] pci_bus 0000:00: root bus resource [mem 0x000c4000-0x000c7fff window] [Thu Dec 12 07:42:53 2019][ 1.031432] pci_bus 0000:00: root bus resource [mem 0x000c8000-0x000cbfff window] [Thu Dec 12 07:42:53 2019][ 1.038912] pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000cffff window] [Thu Dec 12 07:42:53 2019][ 1.046391] pci_bus 0000:00: root bus resource [mem 0x000d0000-0x000d3fff window] [Thu Dec 12 07:42:53 2019][ 1.053871] pci_bus 0000:00: root bus resource [mem 0x000d4000-0x000d7fff window] [Thu Dec 12 07:42:53 2019][ 1.061350] pci_bus 0000:00: root bus resource [mem 0x000d8000-0x000dbfff window] [Thu Dec 12 07:42:53 2019][ 1.068831] pci_bus 0000:00: root bus resource [mem 0x000dc000-0x000dffff window] [Thu Dec 12 07:42:53 2019][ 1.076310] pci_bus 0000:00: root bus resource [mem 0x000e0000-0x000e3fff window] [Thu Dec 12 07:42:53 2019][ 1.083790] pci_bus 0000:00: root bus resource [mem 0x000e4000-0x000e7fff window] [Thu Dec 12 07:42:53 2019][ 1.091269] pci_bus 0000:00: root bus resource [mem 0x000e8000-0x000ebfff window] [Thu Dec 12 07:42:53 2019][ 1.098749] pci_bus 0000:00: root bus resource [mem 0x000ec000-0x000effff window] [Thu Dec 12 07:42:53 2019][ 1.106228] pci_bus 0000:00: root bus resource [mem 0x000f0000-0x000fffff window] [Thu Dec 12 07:42:53 2019][ 1.113708] pci_bus 0000:00: root bus resource [io 0x0d00-0x3fff window] [Thu Dec 12 07:42:53 2019][ 1.120493] pci_bus 0000:00: root bus resource [mem 0xe1000000-0xfebfffff window] [Thu Dec 12 07:42:53 2019][ 1.127974] pci_bus 0000:00: root bus 
resource [mem 0x10000000000-0x2bf3fffffff window] [Thu Dec 12 07:42:53 2019][ 1.135972] pci_bus 0000:00: root bus resource [bus 00-3f] [Thu Dec 12 07:42:53 2019][ 1.147270] pci 0000:00:03.1: PCI bridge to [bus 01] [Thu Dec 12 07:42:53 2019][ 1.153318] pci 0000:00:07.1: PCI bridge to [bus 02] [Thu Dec 12 07:42:53 2019][ 1.159239] pci 0000:00:08.1: PCI bridge to [bus 03] [Thu Dec 12 07:42:53 2019][ 1.164603] ACPI: PCI Root Bridge [PC01] (domain 0000 [bus 40-7f]) [Thu Dec 12 07:42:53 2019][ 1.170790] acpi PNP0A08:01: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI] [Thu Dec 12 07:42:53 2019][ 1.178997] acpi PNP0A08:01: PCIe AER handled by firmware [Thu Dec 12 07:42:53 2019][ 1.184435] acpi PNP0A08:01: _OSC: platform does not support [SHPCHotplug] [Thu Dec 12 07:42:53 2019][ 1.191381] acpi PNP0A08:01: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] [Thu Dec 12 07:42:53 2019][ 1.199033] acpi PNP0A08:01: FADT indicates ASPM is unsupported, using BIOS configuration [Thu Dec 12 07:42:53 2019][ 1.207443] PCI host bridge to bus 0000:40 [Thu Dec 12 07:42:53 2019][ 1.211541] pci_bus 0000:40: root bus resource [io 0x4000-0x7fff window] [Thu Dec 12 07:42:53 2019][ 1.218325] pci_bus 0000:40: root bus resource [mem 0xc6000000-0xe0ffffff window] [Thu Dec 12 07:42:53 2019][ 1.225806] pci_bus 0000:40: root bus resource [mem 0x2bf40000000-0x47e7fffffff window] [Thu Dec 12 07:42:53 2019][ 1.233805] pci_bus 0000:40: root bus resource [bus 40-7f] [Thu Dec 12 07:42:53 2019][ 1.241110] pci 0000:40:07.1: PCI bridge to [bus 41] [Thu Dec 12 07:42:53 2019][ 1.246605] pci 0000:40:08.1: PCI bridge to [bus 42] [Thu Dec 12 07:42:53 2019][ 1.251760] ACPI: PCI Root Bridge [PC02] (domain 0000 [bus 80-bf]) [Thu Dec 12 07:42:53 2019][ 1.257945] acpi PNP0A08:02: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI] [Thu Dec 12 07:42:53 2019][ 1.266154] acpi PNP0A08:02: PCIe AER handled by firmware [Thu Dec 12 07:42:53 2019][ 1.271597] acpi PNP0A08:02: _OSC: platform does not support [SHPCHotplug] [Thu Dec 12 07:42:53 2019][ 1.278545] acpi PNP0A08:02: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] [Thu Dec 12 07:42:53 2019][ 1.286197] acpi PNP0A08:02: FADT indicates ASPM is unsupported, using BIOS configuration [Thu Dec 12 07:42:53 2019][ 1.294631] PCI host bridge to bus 0000:80 [Thu Dec 12 07:42:53 2019][ 1.298732] pci_bus 0000:80: root bus resource [io 0x03b0-0x03df window] [Thu Dec 12 07:42:53 2019][ 1.305515] pci_bus 0000:80: root bus resource [mem 0x000a0000-0x000bffff window] [Thu Dec 12 07:42:53 2019][ 1.312995] pci_bus 0000:80: root bus resource [io 0x8000-0xbfff window] [Thu Dec 12 07:42:53 2019][ 1.319780] pci_bus 0000:80: root bus resource [mem 0xab000000-0xc5ffffff window] [Thu Dec 12 07:42:53 2019][ 1.327259] pci_bus 0000:80: root bus resource [mem 0x47e80000000-0x63dbfffffff window] [Thu Dec 12 07:42:53 2019][ 1.335259] pci_bus 0000:80: root bus resource [bus 80-bf] [Thu Dec 12 07:42:53 2019][ 1.343677] pci 0000:80:01.1: PCI bridge to [bus 81] [Thu Dec 12 07:42:53 2019][ 1.350699] pci 0000:80:01.2: PCI bridge to [bus 82-83] [Thu Dec 12 07:42:53 2019][ 1.356152] pci 0000:82:00.0: PCI bridge to [bus 83] [Thu Dec 12 07:42:53 2019][ 1.362697] pci 0000:80:03.1: PCI bridge to [bus 84] [Thu Dec 12 07:42:53 2019][ 1.368262] pci 0000:80:07.1: PCI bridge to [bus 85] [Thu Dec 12 07:42:53 2019][ 1.373695] pci 0000:80:08.1: PCI bridge to [bus 86] [Thu Dec 12 07:42:53 2019][ 1.378860] ACPI: PCI Root Bridge [PC03] (domain 0000 [bus c0-ff]) [Thu Dec 12 07:42:53 2019][ 1.385045] acpi PNP0A08:03: 
_OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI] [Thu Dec 12 07:42:53 2019][ 1.393251] acpi PNP0A08:03: PCIe AER handled by firmware [Thu Dec 12 07:42:53 2019][ 1.398687] acpi PNP0A08:03: _OSC: platform does not support [SHPCHotplug] [Thu Dec 12 07:42:53 2019][ 1.405635] acpi PNP0A08:03: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] [Thu Dec 12 07:42:53 2019][ 1.413288] acpi PNP0A08:03: FADT indicates ASPM is unsupported, using BIOS configuration [Thu Dec 12 07:42:53 2019][ 1.421608] acpi PNP0A08:03: host bridge window [mem 0x63dc0000000-0xffffffffffff window] ([0x80000000000-0xffffffffffff] ignored, not CPU addressable) [Thu Dec 12 07:42:53 2019][ 1.435240] PCI host bridge to bus 0000:c0 [Thu Dec 12 07:42:53 2019][ 1.439342] pci_bus 0000:c0: root bus resource [io 0xc000-0xffff window] [Thu Dec 12 07:42:53 2019][ 1.446126] pci_bus 0000:c0: root bus resource [mem 0x90000000-0xaaffffff window] [Thu Dec 12 07:42:53 2019][ 1.453606] pci_bus 0000:c0: root bus resource [mem 0x63dc0000000-0x7ffffffffff window] [Thu Dec 12 07:42:53 2019][ 1.461606] pci_bus 0000:c0: root bus resource [bus c0-ff] [Thu Dec 12 07:42:53 2019][ 1.468905] pci 0000:c0:01.1: PCI bridge to [bus c1] [Thu Dec 12 07:42:53 2019][ 1.474504] pci 0000:c0:07.1: PCI bridge to [bus c2] [Thu Dec 12 07:42:53 2019][ 1.479812] pci 0000:c0:08.1: PCI bridge to [bus c3] [Thu Dec 12 07:42:53 2019][ 1.485164] acpi ACPI0007:15: BIOS reported wrong ACPI id 0 for the processor [Thu Dec 12 07:42:53 2019][ 1.494143] vgaarb: device added: PCI:0000:83:00.0,decodes=io+mem,owns=io+mem,locks=none [Thu Dec 12 07:42:53 2019][ 1.502228] vgaarb: loaded [Thu Dec 12 07:42:53 2019][ 1.504939] vgaarb: bridge control possible 0000:83:00.0 [Thu Dec 12 07:42:53 2019][ 1.510312] SCSI subsystem initialized [Thu Dec 12 07:42:53 2019][ 1.514088] ACPI: bus type USB registered [Thu Dec 12 07:42:53 2019][ 1.518119] usbcore: registered new interface driver usbfs [Thu Dec 12 07:42:53 2019][ 1.523616] usbcore: registered new interface driver hub [Thu Dec 12 07:42:53 2019][ 1.528939] usbcore: registered new device driver usb [Thu Dec 12 07:42:53 2019][ 1.534045] EDAC MC: Ver: 3.0.0 [Thu Dec 12 07:42:53 2019][ 1.537415] PCI: Using ACPI for IRQ routing [Thu Dec 12 07:42:53 2019][ 1.560910] NetLabel: Initializing [Thu Dec 12 07:42:53 2019][ 1.564319] NetLabel: domain hash size = 128 [Thu Dec 12 07:42:53 2019][ 1.568674] NetLabel: protocols = UNLABELED CIPSOv4 [Thu Dec 12 07:42:53 2019][ 1.573654] NetLabel: unlabeled traffic allowed by default [Thu Dec 12 07:42:53 2019][ 1.579307] hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 [Thu Dec 12 07:42:53 2019][ 1.584293] hpet0: 3 comparators, 32-bit 14.318180 MHz counter [Thu Dec 12 07:42:53 2019][ 1.592186] Switched to clocksource hpet [Thu Dec 12 07:42:53 2019][ 1.599830] pnp: PnP ACPI init [Thu Dec 12 07:42:53 2019][ 1.602912] ACPI: bus type PNP registered [Thu Dec 12 07:42:53 2019][ 1.607095] system 00:00: [mem 0x80000000-0x8fffffff] has been reserved [Thu Dec 12 07:42:53 2019][ 1.614293] pnp: PnP ACPI: found 4 devices [Thu Dec 12 07:42:53 2019][ 1.618397] ACPI: bus type PNP unregistered [Thu Dec 12 07:42:53 2019][ 1.628247] pci 0000:01:00.0: can't claim BAR 6 [mem 0xfff00000-0xffffffff pref]: no compatible bridge window [Thu Dec 12 07:42:53 2019][ 1.638161] pci 0000:81:00.0: can't claim BAR 6 [mem 0xfffc0000-0xffffffff pref]: no compatible bridge window [Thu Dec 12 07:42:53 2019][ 1.648076] pci 0000:81:00.1: can't claim BAR 6 [mem 0xfffc0000-0xffffffff pref]: no compatible bridge window [Thu Dec 12 07:42:53 2019][ 
1.657990] pci 0000:84:00.0: can't claim BAR 6 [mem 0xfffc0000-0xffffffff pref]: no compatible bridge window [Thu Dec 12 07:42:53 2019][ 1.667904] pci 0000:c1:00.0: can't claim BAR 6 [mem 0xfff00000-0xffffffff pref]: no compatible bridge window [Thu Dec 12 07:42:53 2019][ 1.677838] pci 0000:01:00.0: BAR 6: assigned [mem 0xe1000000-0xe10fffff pref] [Thu Dec 12 07:42:53 2019][ 1.685065] pci 0000:00:03.1: PCI bridge to [bus 01] [Thu Dec 12 07:42:53 2019][ 1.690042] pci 0000:00:03.1: bridge window [mem 0xe1000000-0xe10fffff] [Thu Dec 12 07:42:53 2019][ 1.696844] pci 0000:00:03.1: bridge window [mem 0xe2000000-0xe3ffffff 64bit pref] [Thu Dec 12 07:42:53 2019][ 1.704593] pci 0000:00:07.1: PCI bridge to [bus 02] [Thu Dec 12 07:42:53 2019][ 1.709567] pci 0000:00:07.1: bridge window [mem 0xf7200000-0xf74fffff] [Thu Dec 12 07:42:53 2019][ 1.716363] pci 0000:00:08.1: PCI bridge to [bus 03] [Thu Dec 12 07:42:53 2019][ 1.721335] pci 0000:00:08.1: bridge window [mem 0xf7000000-0xf71fffff] [Thu Dec 12 07:42:54 2019][ 1.728181] pci 0000:40:07.1: PCI bridge to [bus 41] [Thu Dec 12 07:42:54 2019][ 1.733159] pci 0000:40:07.1: bridge window [mem 0xdb200000-0xdb4fffff] [Thu Dec 12 07:42:54 2019][ 1.739954] pci 0000:40:08.1: PCI bridge to [bus 42] [Thu Dec 12 07:42:54 2019][ 1.744926] pci 0000:40:08.1: bridge window [mem 0xdb000000-0xdb1fffff] [Thu Dec 12 07:42:54 2019][ 1.751764] pci 0000:81:00.0: BAR 6: assigned [mem 0xac300000-0xac33ffff pref] [Thu Dec 12 07:42:54 2019][ 1.758993] pci 0000:81:00.1: BAR 6: assigned [mem 0xac340000-0xac37ffff pref] [Thu Dec 12 07:42:54 2019][ 1.766220] pci 0000:80:01.1: PCI bridge to [bus 81] [Thu Dec 12 07:42:54 2019][ 1.771196] pci 0000:80:01.1: bridge window [mem 0xac300000-0xac3fffff] [Thu Dec 12 07:42:54 2019][ 1.777991] pci 0000:80:01.1: bridge window [mem 0xac200000-0xac2fffff 64bit pref] [Thu Dec 12 07:42:54 2019][ 1.785742] pci 0000:82:00.0: PCI bridge to [bus 83] [Thu Dec 12 07:42:54 2019][ 1.790716] pci 0000:82:00.0: bridge window [mem 0xc0000000-0xc08fffff] [Thu Dec 12 07:42:54 2019][ 1.797509] pci 0000:82:00.0: bridge window [mem 0xab000000-0xabffffff 64bit pref] [Thu Dec 12 07:42:54 2019][ 1.805262] pci 0000:80:01.2: PCI bridge to [bus 82-83] [Thu Dec 12 07:42:54 2019][ 1.810501] pci 0000:80:01.2: bridge window [mem 0xc0000000-0xc08fffff] [Thu Dec 12 07:42:54 2019][ 1.817295] pci 0000:80:01.2: bridge window [mem 0xab000000-0xabffffff 64bit pref] [Thu Dec 12 07:42:54 2019][ 1.825046] pci 0000:84:00.0: BAR 6: no space for [mem size 0x00040000 pref] [Thu Dec 12 07:42:54 2019][ 1.832098] pci 0000:84:00.0: BAR 6: failed to assign [mem size 0x00040000 pref] [Thu Dec 12 07:42:54 2019][ 1.839498] pci 0000:80:03.1: PCI bridge to [bus 84] [Thu Dec 12 07:42:54 2019][ 1.844475] pci 0000:80:03.1: bridge window [io 0x8000-0x8fff] [Thu Dec 12 07:42:54 2019][ 1.850576] pci 0000:80:03.1: bridge window [mem 0xc0d00000-0xc0dfffff] [Thu Dec 12 07:42:54 2019][ 1.857372] pci 0000:80:03.1: bridge window [mem 0xac000000-0xac1fffff 64bit pref] [Thu Dec 12 07:42:54 2019][ 1.865121] pci 0000:80:07.1: PCI bridge to [bus 85] [Thu Dec 12 07:42:54 2019][ 1.870094] pci 0000:80:07.1: bridge window [mem 0xc0b00000-0xc0cfffff] [Thu Dec 12 07:42:54 2019][ 1.876890] pci 0000:80:08.1: PCI bridge to [bus 86] [Thu Dec 12 07:42:54 2019][ 1.881864] pci 0000:80:08.1: bridge window [mem 0xc0900000-0xc0afffff] [Thu Dec 12 07:42:54 2019][ 1.888701] pci 0000:c1:00.0: BAR 6: no space for [mem size 0x00100000 pref] [Thu Dec 12 07:42:54 2019][ 1.895756] pci 0000:c1:00.0: BAR 6: failed to assign [mem size 
0x00100000 pref] [Thu Dec 12 07:42:54 2019][ 1.903156] pci 0000:c0:01.1: PCI bridge to [bus c1] [Thu Dec 12 07:42:54 2019][ 1.908133] pci 0000:c0:01.1: bridge window [io 0xc000-0xcfff] [Thu Dec 12 07:42:54 2019][ 1.914235] pci 0000:c0:01.1: bridge window [mem 0xa5400000-0xa55fffff] [Thu Dec 12 07:42:54 2019][ 1.921032] pci 0000:c0:07.1: PCI bridge to [bus c2] [Thu Dec 12 07:42:54 2019][ 1.926005] pci 0000:c0:07.1: bridge window [mem 0xa5200000-0xa53fffff] [Thu Dec 12 07:42:54 2019][ 1.932802] pci 0000:c0:08.1: PCI bridge to [bus c3] [Thu Dec 12 07:42:54 2019][ 1.937774] pci 0000:c0:08.1: bridge window [mem 0xa5000000-0xa51fffff] [Thu Dec 12 07:42:54 2019][ 1.944609] NET: Registered protocol family 2 [Thu Dec 12 07:42:54 2019][ 1.949143] TCP established hash table entries: 2048 (order: 2, 16384 bytes) [Thu Dec 12 07:42:54 2019][ 1.956222] TCP bind hash table entries: 2048 (order: 3, 32768 bytes) [Thu Dec 12 07:42:54 2019][ 1.962679] TCP: Hash tables configured (established 2048 bind 2048) [Thu Dec 12 07:42:54 2019][ 1.969057] TCP: reno registered [Thu Dec 12 07:42:54 2019][ 1.972302] UDP hash table entries: 256 (order: 1, 8192 bytes) [Thu Dec 12 07:42:54 2019][ 1.978146] UDP-Lite hash table entries: 256 (order: 1, 8192 bytes) [Thu Dec 12 07:42:54 2019][ 1.984449] NET: Registered protocol family 1 [Thu Dec 12 07:42:54 2019][ 1.989594] Unpacking initramfs... [Thu Dec 12 07:42:54 2019][ 2.161265] Freeing initrd memory: 12196k freed [Thu Dec 12 07:42:54 2019][ 2.167610] AMD-Vi: IOMMU performance counters supported [Thu Dec 12 07:42:54 2019][ 2.173021] AMD-Vi: IOMMU performance counters supported [Thu Dec 12 07:42:54 2019][ 2.178370] AMD-Vi: IOMMU performance counters supported [Thu Dec 12 07:42:54 2019][ 2.183726] AMD-Vi: IOMMU performance counters supported [Thu Dec 12 07:42:54 2019][ 2.189173] iommu: Adding device 0000:00:01.0 to group 0 [Thu Dec 12 07:42:54 2019][ 2.194525] iommu: Adding device 0000:00:02.0 to group 1 [Thu Dec 12 07:42:54 2019][ 2.199884] iommu: Adding device 0000:00:03.0 to group 2 [Thu Dec 12 07:42:54 2019][ 2.205235] iommu: Adding device 0000:00:03.1 to group 3 [Thu Dec 12 07:42:54 2019][ 2.210584] iommu: Adding device 0000:00:04.0 to group 4 [Thu Dec 12 07:42:54 2019][ 2.215933] iommu: Adding device 0000:00:07.0 to group 5 [Thu Dec 12 07:42:54 2019][ 2.221286] iommu: Adding device 0000:00:07.1 to group 6 [Thu Dec 12 07:42:54 2019][ 2.226640] iommu: Adding device 0000:00:08.0 to group 7 [Thu Dec 12 07:42:54 2019][ 2.231989] iommu: Adding device 0000:00:08.1 to group 8 [Thu Dec 12 07:42:54 2019][ 2.237363] iommu: Adding device 0000:00:14.0 to group 9 [Thu Dec 12 07:42:54 2019][ 2.242692] iommu: Adding device 0000:00:14.3 to group 9 [Thu Dec 12 07:42:54 2019][ 2.248081] iommu: Adding device 0000:00:18.0 to group 10 [Thu Dec 12 07:42:54 2019][ 2.253500] iommu: Adding device 0000:00:18.1 to group 10 [Thu Dec 12 07:42:54 2019][ 2.258919] iommu: Adding device 0000:00:18.2 to group 10 [Thu Dec 12 07:42:54 2019][ 2.264339] iommu: Adding device 0000:00:18.3 to group 10 [Thu Dec 12 07:42:54 2019][ 2.269759] iommu: Adding device 0000:00:18.4 to group 10 [Thu Dec 12 07:42:54 2019][ 2.275176] iommu: Adding device 0000:00:18.5 to group 10 [Thu Dec 12 07:42:54 2019][ 2.280594] iommu: Adding device 0000:00:18.6 to group 10 [Thu Dec 12 07:42:54 2019][ 2.286011] iommu: Adding device 0000:00:18.7 to group 10 [Thu Dec 12 07:42:54 2019][ 2.291487] iommu: Adding device 0000:00:19.0 to group 11 [Thu Dec 12 07:42:54 2019][ 2.296905] iommu: Adding device 0000:00:19.1 to group 11 [Thu Dec 12 
07:42:54 2019][ 2.302323] iommu: Adding device 0000:00:19.2 to group 11 [Thu Dec 12 07:42:54 2019][ 2.307739] iommu: Adding device 0000:00:19.3 to group 11 [Thu Dec 12 07:42:54 2019][ 2.313157] iommu: Adding device 0000:00:19.4 to group 11 [Thu Dec 12 07:42:54 2019][ 2.318572] iommu: Adding device 0000:00:19.5 to group 11 [Thu Dec 12 07:42:54 2019][ 2.323989] iommu: Adding device 0000:00:19.6 to group 11 [Thu Dec 12 07:42:54 2019][ 2.329406] iommu: Adding device 0000:00:19.7 to group 11 [Thu Dec 12 07:42:54 2019][ 2.334882] iommu: Adding device 0000:00:1a.0 to group 12 [Thu Dec 12 07:42:54 2019][ 2.340300] iommu: Adding device 0000:00:1a.1 to group 12 [Thu Dec 12 07:42:54 2019][ 2.345717] iommu: Adding device 0000:00:1a.2 to group 12 [Thu Dec 12 07:42:54 2019][ 2.351134] iommu: Adding device 0000:00:1a.3 to group 12 [Thu Dec 12 07:42:54 2019][ 2.356550] iommu: Adding device 0000:00:1a.4 to group 12 [Thu Dec 12 07:42:54 2019][ 2.361969] iommu: Adding device 0000:00:1a.5 to group 12 [Thu Dec 12 07:42:54 2019][ 2.367384] iommu: Adding device 0000:00:1a.6 to group 12 [Thu Dec 12 07:42:54 2019][ 2.372802] iommu: Adding device 0000:00:1a.7 to group 12 [Thu Dec 12 07:42:54 2019][ 2.378286] iommu: Adding device 0000:00:1b.0 to group 13 [Thu Dec 12 07:42:54 2019][ 2.383704] iommu: Adding device 0000:00:1b.1 to group 13 [Thu Dec 12 07:42:54 2019][ 2.389122] iommu: Adding device 0000:00:1b.2 to group 13 [Thu Dec 12 07:42:54 2019][ 2.394540] iommu: Adding device 0000:00:1b.3 to group 13 [Thu Dec 12 07:42:54 2019][ 2.399964] iommu: Adding device 0000:00:1b.4 to group 13 [Thu Dec 12 07:42:54 2019][ 2.405381] iommu: Adding device 0000:00:1b.5 to group 13 [Thu Dec 12 07:42:54 2019][ 2.410795] iommu: Adding device 0000:00:1b.6 to group 13 [Thu Dec 12 07:42:54 2019][ 2.416225] iommu: Adding device 0000:00:1b.7 to group 13 [Thu Dec 12 07:42:54 2019][ 2.421711] iommu: Adding device 0000:01:00.0 to group 14 [Thu Dec 12 07:42:54 2019][ 2.427157] iommu: Adding device 0000:02:00.0 to group 15 [Thu Dec 12 07:42:54 2019][ 2.432600] iommu: Adding device 0000:02:00.2 to group 16 [Thu Dec 12 07:42:54 2019][ 2.438041] iommu: Adding device 0000:02:00.3 to group 17 [Thu Dec 12 07:42:54 2019][ 2.443485] iommu: Adding device 0000:03:00.0 to group 18 [Thu Dec 12 07:42:54 2019][ 2.448927] iommu: Adding device 0000:03:00.1 to group 19 [Thu Dec 12 07:42:54 2019][ 2.454371] iommu: Adding device 0000:40:01.0 to group 20 [Thu Dec 12 07:42:54 2019][ 2.459813] iommu: Adding device 0000:40:02.0 to group 21 [Thu Dec 12 07:42:54 2019][ 2.465257] iommu: Adding device 0000:40:03.0 to group 22 [Thu Dec 12 07:42:54 2019][ 2.470699] iommu: Adding device 0000:40:04.0 to group 23 [Thu Dec 12 07:42:54 2019][ 2.476145] iommu: Adding device 0000:40:07.0 to group 24 [Thu Dec 12 07:42:54 2019][ 2.481585] iommu: Adding device 0000:40:07.1 to group 25 [Thu Dec 12 07:42:54 2019][ 2.487031] iommu: Adding device 0000:40:08.0 to group 26 [Thu Dec 12 07:42:54 2019][ 2.492470] iommu: Adding device 0000:40:08.1 to group 27 [Thu Dec 12 07:42:54 2019][ 2.497917] iommu: Adding device 0000:41:00.0 to group 28 [Thu Dec 12 07:42:54 2019][ 2.503358] iommu: Adding device 0000:41:00.2 to group 29 [Thu Dec 12 07:42:54 2019][ 2.508801] iommu: Adding device 0000:41:00.3 to group 30 [Thu Dec 12 07:42:54 2019][ 2.514256] iommu: Adding device 0000:42:00.0 to group 31 [Thu Dec 12 07:42:54 2019][ 2.519707] iommu: Adding device 0000:42:00.1 to group 32 [Thu Dec 12 07:42:54 2019][ 2.525158] iommu: Adding device 0000:80:01.0 to group 33 [Thu Dec 12 07:42:54 2019][ 2.530601] 
iommu: Adding device 0000:80:01.1 to group 34 [Thu Dec 12 07:42:54 2019][ 2.536043] iommu: Adding device 0000:80:01.2 to group 35 [Thu Dec 12 07:42:54 2019][ 2.541487] iommu: Adding device 0000:80:02.0 to group 36 [Thu Dec 12 07:42:54 2019][ 2.546932] iommu: Adding device 0000:80:03.0 to group 37 [Thu Dec 12 07:42:54 2019][ 2.552379] iommu: Adding device 0000:80:03.1 to group 38 [Thu Dec 12 07:42:54 2019][ 2.557825] iommu: Adding device 0000:80:04.0 to group 39 [Thu Dec 12 07:42:54 2019][ 2.563278] iommu: Adding device 0000:80:07.0 to group 40 [Thu Dec 12 07:42:54 2019][ 2.568731] iommu: Adding device 0000:80:07.1 to group 41 [Thu Dec 12 07:42:54 2019][ 2.574180] iommu: Adding device 0000:80:08.0 to group 42 [Thu Dec 12 07:42:54 2019][ 2.579628] iommu: Adding device 0000:80:08.1 to group 43 [Thu Dec 12 07:42:54 2019][ 2.585094] iommu: Adding device 0000:81:00.0 to group 44 [Thu Dec 12 07:42:54 2019][ 2.590535] iommu: Adding device 0000:81:00.1 to group 44 [Thu Dec 12 07:42:54 2019][ 2.595986] iommu: Adding device 0000:82:00.0 to group 45 [Thu Dec 12 07:42:54 2019][ 2.601402] iommu: Adding device 0000:83:00.0 to group 45 [Thu Dec 12 07:42:54 2019][ 2.606857] iommu: Adding device 0000:84:00.0 to group 46 [Thu Dec 12 07:42:54 2019][ 2.612305] iommu: Adding device 0000:85:00.0 to group 47 [Thu Dec 12 07:42:54 2019][ 2.617757] iommu: Adding device 0000:85:00.2 to group 48 [Thu Dec 12 07:42:54 2019][ 2.623224] iommu: Adding device 0000:86:00.0 to group 49 [Thu Dec 12 07:42:54 2019][ 2.628669] iommu: Adding device 0000:86:00.1 to group 50 [Thu Dec 12 07:42:54 2019][ 2.634119] iommu: Adding device 0000:86:00.2 to group 51 [Thu Dec 12 07:42:54 2019][ 2.639575] iommu: Adding device 0000:c0:01.0 to group 52 [Thu Dec 12 07:42:54 2019][ 2.645022] iommu: Adding device 0000:c0:01.1 to group 53 [Thu Dec 12 07:42:54 2019][ 2.650474] iommu: Adding device 0000:c0:02.0 to group 54 [Thu Dec 12 07:42:54 2019][ 2.655927] iommu: Adding device 0000:c0:03.0 to group 55 [Thu Dec 12 07:42:54 2019][ 2.661378] iommu: Adding device 0000:c0:04.0 to group 56 [Thu Dec 12 07:42:54 2019][ 2.666832] iommu: Adding device 0000:c0:07.0 to group 57 [Thu Dec 12 07:42:54 2019][ 2.672281] iommu: Adding device 0000:c0:07.1 to group 58 [Thu Dec 12 07:42:54 2019][ 2.677736] iommu: Adding device 0000:c0:08.0 to group 59 [Thu Dec 12 07:42:54 2019][ 2.683183] iommu: Adding device 0000:c0:08.1 to group 60 [Thu Dec 12 07:42:54 2019][ 2.690996] iommu: Adding device 0000:c1:00.0 to group 61 [Thu Dec 12 07:42:54 2019][ 2.696447] iommu: Adding device 0000:c2:00.0 to group 62 [Thu Dec 12 07:42:54 2019][ 2.701897] iommu: Adding device 0000:c2:00.2 to group 63 [Thu Dec 12 07:42:54 2019][ 2.707351] iommu: Adding device 0000:c3:00.0 to group 64 [Thu Dec 12 07:42:54 2019][ 2.712804] iommu: Adding device 0000:c3:00.1 to group 65 [Thu Dec 12 07:42:54 2019][ 2.718550] AMD-Vi: Found IOMMU at 0000:00:00.2 cap 0x40 [Thu Dec 12 07:42:54 2019][ 2.723868] AMD-Vi: Extended features (0xf77ef22294ada): [Thu Dec 12 07:42:55 2019][ 2.729190] PPR NX GT IA GA PC GA_vAPIC [Thu Dec 12 07:42:55 2019][ 2.733332] AMD-Vi: Found IOMMU at 0000:40:00.2 cap 0x40 [Thu Dec 12 07:42:55 2019][ 2.738654] AMD-Vi: Extended features (0xf77ef22294ada): [Thu Dec 12 07:42:55 2019][ 2.743975] PPR NX GT IA GA PC GA_vAPIC [Thu Dec 12 07:42:55 2019][ 2.748108] AMD-Vi: Found IOMMU at 0000:80:00.2 cap 0x40 [Thu Dec 12 07:42:55 2019][ 2.753431] AMD-Vi: Extended features (0xf77ef22294ada): [Thu Dec 12 07:42:55 2019][ 2.758752] PPR NX GT IA GA PC GA_vAPIC [Thu Dec 12 07:42:55 2019][ 2.762884] 
AMD-Vi: Found IOMMU at 0000:c0:00.2 cap 0x40 [Thu Dec 12 07:42:55 2019][ 2.768209] AMD-Vi: Extended features (0xf77ef22294ada): [Thu Dec 12 07:42:55 2019][ 2.773528] PPR NX GT IA GA PC GA_vAPIC [Thu Dec 12 07:42:55 2019][ 2.777670] AMD-Vi: Interrupt remapping enabled [Thu Dec 12 07:42:55 2019][ 2.782224] AMD-Vi: virtual APIC enabled [Thu Dec 12 07:42:55 2019][ 2.786378] AMD-Vi: Lazy IO/TLB flushing enabled [Thu Dec 12 07:42:55 2019][ 2.791078] perf: AMD NB counters detected [Thu Dec 12 07:42:55 2019][ 2.795191] perf: AMD LLC counters detected [Thu Dec 12 07:42:55 2019][ 2.799616] sha1_ssse3: Using SHA-NI optimized SHA-1 implementation [Thu Dec 12 07:42:55 2019][ 2.805943] sha256_ssse3: Using SHA-256-NI optimized SHA-256 implementation [Thu Dec 12 07:42:55 2019][ 2.813134] futex hash table entries: 256 (order: 2, 16384 bytes) [Thu Dec 12 07:42:55 2019][ 2.819257] Initialise system trusted keyring [Thu Dec 12 07:42:55 2019][ 2.823650] audit: initializing netlink socket (disabled) [Thu Dec 12 07:42:55 2019][ 2.829072] type=2000 audit(1576165370.564:1): initialized [Thu Dec 12 07:42:55 2019][ 2.859741] HugeTLB registered 1 GB page size, pre-allocated 0 pages [Thu Dec 12 07:42:55 2019][ 2.866103] HugeTLB registered 2 MB page size, pre-allocated 0 pages [Thu Dec 12 07:42:55 2019][ 2.873617] zpool: loaded [Thu Dec 12 07:42:55 2019][ 2.876252] zbud: loaded [Thu Dec 12 07:42:55 2019][ 2.879042] VFS: Disk quotas dquot_6.6.0 [Thu Dec 12 07:42:55 2019][ 2.883000] Dquot-cache hash table entries: 512 (order 0, 4096 bytes) [Thu Dec 12 07:42:55 2019][ 2.889825] msgmni has been set to 302 [Thu Dec 12 07:42:55 2019][ 2.893630] Key type big_key registered [Thu Dec 12 07:42:55 2019][ 2.897804] NET: Registered protocol family 38 [Thu Dec 12 07:42:55 2019][ 2.902281] Key type asymmetric registered [Thu Dec 12 07:42:55 2019][ 2.906394] Asymmetric key parser 'x509' registered [Thu Dec 12 07:42:55 2019][ 2.911312] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 248) [Thu Dec 12 07:42:55 2019][ 2.918737] io scheduler noop registered [Thu Dec 12 07:42:55 2019][ 2.922673] io scheduler deadline registered (default) [Thu Dec 12 07:42:55 2019][ 2.927838] io scheduler cfq registered [Thu Dec 12 07:42:55 2019][ 2.931684] io scheduler mq-deadline registered [Thu Dec 12 07:42:55 2019][ 2.936226] io scheduler kyber registered [Thu Dec 12 07:42:55 2019][ 2.945380] pcieport 0000:00:03.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 2.952345] pci 0000:01:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 2.958891] pcieport 0000:00:07.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 2.965856] pci 0000:02:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 2.972393] pci 0000:02:00.2: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 2.978926] pci 0000:02:00.3: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 2.985478] pcieport 0000:00:08.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 2.992447] pci 0000:03:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 2.998980] pci 0000:03:00.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.005531] pcieport 0000:40:07.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.012501] pci 0000:41:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.019036] pci 0000:41:00.2: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 
2019][ 3.025571] pci 0000:41:00.3: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.032119] pcieport 0000:40:08.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.039084] pci 0000:42:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.045618] pci 0000:42:00.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.052166] pcieport 0000:80:01.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.059131] pci 0000:81:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.065664] pci 0000:81:00.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.072213] pcieport 0000:80:01.2: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.079184] pci 0000:82:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.085719] pci 0000:83:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.092267] pcieport 0000:80:03.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.099231] pci 0000:84:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.105778] pcieport 0000:80:07.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.112742] pci 0000:85:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.119279] pci 0000:85:00.2: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.125826] pcieport 0000:80:08.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.132791] pci 0000:86:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.139325] pci 0000:86:00.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.145860] pci 0000:86:00.2: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.152406] pcieport 0000:c0:01.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.159370] pci 0000:c1:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.165917] pcieport 0000:c0:07.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.172881] pci 0000:c2:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.179417] pci 0000:c2:00.2: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.185965] pcieport 0000:c0:08.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.192930] pci 0000:c3:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.199464] pci 0000:c3:00.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:42:55 2019][ 3.206018] pci_hotplug: PCI Hot Plug PCI Core version: 0.5 [Thu Dec 12 07:42:55 2019][ 3.211602] pciehp: PCI Express Hot Plug Controller Driver version: 0.4 [Thu Dec 12 07:42:55 2019][ 3.218252] shpchp: Standard Hot Plug PCI Controller Driver version: 0.4 [Thu Dec 12 07:42:55 2019][ 3.225054] efifb: probing for efifb [Thu Dec 12 07:42:55 2019][ 3.228655] efifb: framebuffer at 0xab000000, mapped to 0xffffc90000800000, using 3072k, total 3072k [Thu Dec 12 07:42:55 2019][ 3.237789] efifb: mode is 1024x768x32, linelength=4096, pages=768 [Thu Dec 12 07:42:55 2019][ 3.243976] efifb: scrolling: redraw [Thu Dec 12 07:42:55 2019][ 3.247567] efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 [Thu Dec 12 07:42:55 2019][ 3.268985] Console: switching to colour frame buffer device 128x48 [Thu Dec 12 07:42:55 2019][ 3.289357] fb0: EFI VGA frame buffer device [Thu Dec 12 07:42:55 2019][ 
3.293727] input: Power Button as /devices/LNXSYSTM:00/device:00/PNP0C0C:00/input/input0 [Thu Dec 12 07:42:55 2019][ 3.301909] ACPI: Power Button [PWRB] [Thu Dec 12 07:42:55 2019][ 3.305628] input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input1 [Thu Dec 12 07:42:55 2019][ 3.313027] ACPI: Power Button [PWRF] [Thu Dec 12 07:42:55 2019][ 3.317183] GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC. [Thu Dec 12 07:42:55 2019][ 3.324665] Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled [Thu Dec 12 07:42:55 2019][ 3.351860] 00:02: ttyS1 at I/O 0x2f8 (irq = 3) is a 16550A [Thu Dec 12 07:42:55 2019][ 3.378383] 00:03: ttyS0 at I/O 0x3f8 (irq = 4) is a 16550A [Thu Dec 12 07:42:55 2019][ 3.384319] Non-volatile memory driver v1.3 [Thu Dec 12 07:42:55 2019][ 3.388537] Linux agpgart interface v0.103 [Thu Dec 12 07:42:55 2019][ 3.393192] crash memory driver: version 1.1 [Thu Dec 12 07:42:55 2019][ 3.397544] rdac: device handler registered [Thu Dec 12 07:42:55 2019][ 3.401759] hp_sw: device handler registered [Thu Dec 12 07:42:55 2019][ 3.406043] emc: device handler registered [Thu Dec 12 07:42:55 2019][ 3.410160] alua: device handler registered [Thu Dec 12 07:42:55 2019][ 3.414384] libphy: Fixed MDIO Bus: probed [Thu Dec 12 07:42:55 2019][ 3.418522] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [Thu Dec 12 07:42:55 2019][ 3.425057] ehci-pci: EHCI PCI platform driver [Thu Dec 12 07:42:55 2019][ 3.429523] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [Thu Dec 12 07:42:55 2019][ 3.435718] ohci-pci: OHCI PCI platform driver [Thu Dec 12 07:42:55 2019][ 3.440182] uhci_hcd: USB Universal Host Controller Interface driver [Thu Dec 12 07:42:55 2019][ 3.446619] xhci_hcd 0000:02:00.3: xHCI Host Controller [Thu Dec 12 07:42:55 2019][ 3.451882] xhci_hcd 0000:02:00.3: new USB bus registered, assigned bus number 1 [Thu Dec 12 07:42:55 2019][ 3.459393] xhci_hcd 0000:02:00.3: hcc params 0x0270f665 hci version 0x100 quirks 0x00000410 [Thu Dec 12 07:42:55 2019][ 3.467945] usb usb1: New USB device found, idVendor=1d6b, idProduct=0002 [Thu Dec 12 07:42:55 2019][ 3.474742] usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [Thu Dec 12 07:42:55 2019][ 3.481971] usb usb1: Product: xHCI Host Controller [Thu Dec 12 07:42:55 2019][ 3.486860] usb usb1: Manufacturer: Linux 3.10.0-957.27.2.el7_lustre.pl2.x86_64 xhci-hcd [Thu Dec 12 07:42:55 2019][ 3.494954] usb usb1: SerialNumber: 0000:02:00.3 [Thu Dec 12 07:42:55 2019][ 3.499666] hub 1-0:1.0: USB hub found [Thu Dec 12 07:42:55 2019][ 3.503439] hub 1-0:1.0: 2 ports detected [Thu Dec 12 07:42:55 2019][ 3.507674] xhci_hcd 0000:02:00.3: xHCI Host Controller [Thu Dec 12 07:42:55 2019][ 3.512941] xhci_hcd 0000:02:00.3: new USB bus registered, assigned bus number 2 [Thu Dec 12 07:42:55 2019][ 3.520359] usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
[Thu Dec 12 07:42:55 2019][ 3.528465] usb usb2: New USB device found, idVendor=1d6b, idProduct=0003 [Thu Dec 12 07:42:55 2019][ 3.535263] usb usb2: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [Thu Dec 12 07:42:55 2019][ 3.542492] usb usb2: Product: xHCI Host Controller [Thu Dec 12 07:42:55 2019][ 3.547381] usb usb2: Manufacturer: Linux 3.10.0-957.27.2.el7_lustre.pl2.x86_64 xhci-hcd [Thu Dec 12 07:42:55 2019][ 3.555475] usb usb2: SerialNumber: 0000:02:00.3 [Thu Dec 12 07:42:55 2019][ 3.560171] hub 2-0:1.0: USB hub found [Thu Dec 12 07:42:55 2019][ 3.563941] hub 2-0:1.0: 2 ports detected [Thu Dec 12 07:42:55 2019][ 3.568210] xhci_hcd 0000:41:00.3: xHCI Host Controller [Thu Dec 12 07:42:55 2019][ 3.573479] xhci_hcd 0000:41:00.3: new USB bus registered, assigned bus number 3 [Thu Dec 12 07:42:55 2019][ 3.580973] xhci_hcd 0000:41:00.3: hcc params 0x0270f665 hci version 0x100 quirks 0x00000410 [Thu Dec 12 07:42:55 2019][ 3.589510] usb usb3: New USB device found, idVendor=1d6b, idProduct=0002 [Thu Dec 12 07:42:55 2019][ 3.596304] usb usb3: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [Thu Dec 12 07:42:55 2019][ 3.603531] usb usb3: Product: xHCI Host Controller [Thu Dec 12 07:42:55 2019][ 3.608421] usb usb3: Manufacturer: Linux 3.10.0-957.27.2.el7_lustre.pl2.x86_64 xhci-hcd [Thu Dec 12 07:42:55 2019][ 3.616515] usb usb3: SerialNumber: 0000:41:00.3 [Thu Dec 12 07:42:55 2019][ 3.621219] hub 3-0:1.0: USB hub found [Thu Dec 12 07:42:55 2019][ 3.624991] hub 3-0:1.0: 2 ports detected [Thu Dec 12 07:42:55 2019][ 3.629209] xhci_hcd 0000:41:00.3: xHCI Host Controller [Thu Dec 12 07:42:55 2019][ 3.634482] xhci_hcd 0000:41:00.3: new USB bus registered, assigned bus number 4 [Thu Dec 12 07:42:55 2019][ 3.641899] usb usb4: We don't know the algorithms for LPM for this host, disabling LPM. [Thu Dec 12 07:42:55 2019][ 3.650009] usb usb4: New USB device found, idVendor=1d6b, idProduct=0003 [Thu Dec 12 07:42:55 2019][ 3.656808] usb usb4: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [Thu Dec 12 07:42:55 2019][ 3.664035] usb usb4: Product: xHCI Host Controller [Thu Dec 12 07:42:55 2019][ 3.668923] usb usb4: Manufacturer: Linux 3.10.0-957.27.2.el7_lustre.pl2.x86_64 xhci-hcd [Thu Dec 12 07:42:55 2019][ 3.677018] usb usb4: SerialNumber: 0000:41:00.3 [Thu Dec 12 07:42:55 2019][ 3.681726] hub 4-0:1.0: USB hub found [Thu Dec 12 07:42:55 2019][ 3.685494] hub 4-0:1.0: 2 ports detected [Thu Dec 12 07:42:55 2019][ 3.689736] usbcore: registered new interface driver usbserial_generic [Thu Dec 12 07:42:55 2019][ 3.696277] usbserial: USB Serial support registered for generic [Thu Dec 12 07:42:55 2019][ 3.702342] i8042: PNP: No PS/2 controller found. Probing ports directly. 
[Thu Dec 12 07:42:57 2019][ 4.740984] i8042: No controller found [Thu Dec 12 07:42:57 2019][ 4.744784] mousedev: PS/2 mouse device common for all mice [Thu Dec 12 07:42:57 2019][ 4.750462] rtc_cmos 00:01: RTC can wake from S4 [Thu Dec 12 07:42:57 2019][ 4.755289] rtc_cmos 00:01: rtc core: registered rtc_cmos as rtc0 [Thu Dec 12 07:42:57 2019][ 4.761451] rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram, hpet irqs [Thu Dec 12 07:42:57 2019][ 4.769167] cpuidle: using governor menu [Thu Dec 12 07:42:57 2019][ 4.773328] EFI Variables Facility v0.08 2004-May-17 [Thu Dec 12 07:42:57 2019][ 4.799810] hidraw: raw HID events driver (C) Jiri Kosina [Thu Dec 12 07:42:57 2019][ 4.805313] usbcore: registered new interface driver usbhid [Thu Dec 12 07:42:57 2019][ 4.810890] usbhid: USB HID core driver [Thu Dec 12 07:42:57 2019][ 4.814801] drop_monitor: Initializing network drop monitor service [Thu Dec 12 07:42:57 2019][ 4.821136] TCP: cubic registered [Thu Dec 12 07:42:57 2019][ 4.824466] Initializing XFRM netlink socket [Thu Dec 12 07:42:57 2019][ 4.828839] NET: Registered protocol family 10 [Thu Dec 12 07:42:57 2019][ 4.833458] NET: Registered protocol family 17 [Thu Dec 12 07:42:57 2019][ 4.837920] mpls_gso: MPLS GSO support [Thu Dec 12 07:42:57 2019][ 4.841689] mce: Unable to init device /dev/mcelog (rc: -5) [Thu Dec 12 07:42:57 2019][ 4.847337] mce: Using 0 MCE banks [Thu Dec 12 07:42:57 2019][ 4.850781] microcode: CPU0: patch_level=0x08001250 [Thu Dec 12 07:42:57 2019][ 4.855702] microcode: Microcode Update Driver: v2.01 , Peter Oruba [Thu Dec 12 07:42:57 2019][ 4.864573] Loading compiled-in X.509 certificates [Thu Dec 12 07:42:57 2019][ 4.869397] Loaded X.509 cert 'CentOS Linux kpatch signing key: ea0413152cde1d98ebdca3fe6f0230904c9ef717' [Thu Dec 12 07:42:57 2019][ 4.878987] Loaded X.509 cert 'CentOS Linux Driver update signing key: 7f421ee0ab69461574bb358861dbe77762a4201b' [Thu Dec 12 07:42:57 2019][ 4.889545] Loaded X.509 cert 'CentOS Linux kernel signing key: 468656045a39b52ff2152c315f6198c3e658f24d' [Thu Dec 12 07:42:57 2019][ 4.899123] registered taskstats version 1 [Thu Dec 12 07:42:57 2019][ 4.905008] Key type trusted registered [Thu Dec 12 07:42:57 2019][ 4.910255] Key type encrypted registered [Thu Dec 12 07:42:57 2019][ 4.914307] IMA: No TPM chip found, activating TPM-bypass! (rc=-19) [Thu Dec 12 07:42:57 2019][ 4.922249] Magic number: 15:214:740 [Thu Dec 12 07:42:57 2019][ 4.926150] rtc_cmos 00:01: setting system clock to 2019-12-12 15:42:56 UTC (1576165376) [Thu Dec 12 07:42:57 2019][ 4.935336] Freeing unused kernel memory: 1876k freed [Thu Dec 12 07:42:57 2019][ 4.940661] Write protecting the kernel read-only data: 12288k [Thu Dec 12 07:42:57 2019][ 4.947663] Freeing unused kernel memory: 504k freed [Thu Dec 12 07:42:57 2019][ 4.953930] Freeing unused kernel memory: 596k freed [Thu Dec 12 07:42:57 2019][ 5.001244] random: systemd: uninitialized urandom read (16 bytes read) [Thu Dec 12 07:42:57 2019][ 5.008317] random: systemd: uninitialized urandom read (16 bytes read) [Thu Dec 12 07:42:57 2019][ 5.017715] systemd[1]: systemd 219 running in system mode. (+PAM +AUDIT +SELINUX +IMA -APPARMOR +SMACK +SYSVINIT +UTMP +LIBCRYPTSETUP +GCRYPT +GNUTLS +ACL +XZ +LZ4 -SECCOMP +BLKID +ELFUTILS +KMOD +IDN) [Thu Dec 12 07:42:57 2019][ 5.035807] systemd[1]: Detected architecture x86-64. [Thu Dec 12 07:42:57 2019][ 5.040873] systemd[1]: Running in initial RAM disk. 
[Thu Dec 12 07:42:57 2019] [Thu Dec 12 07:42:57 2019]Welcome to CentOS Linux 7 (Core) dracut-033-554.el7 (Initramfs)! [Thu Dec 12 07:42:57 2019][ 5.046096] tsc: Refined TSC clocksource calibration: 1996.249 MHz [Thu Dec 12 07:42:57 2019][ 5.053658] Switched to clocksource tsc [Thu Dec 12 07:42:57 2019] [Thu Dec 12 07:42:57 2019][ 5.064338] systemd[1]: Set hostname to . [Thu Dec 12 07:42:57 2019][ 5.084658] random: systemd: uninitialized urandom read (16 bytes read) [Thu Dec 12 07:42:57 2019][ 5.091302] random: systemd: uninitialized urandom read (16 bytes read) [Thu Dec 12 07:42:57 2019][ 5.097959] random: systemd: uninitialized urandom read (16 bytes read) [Thu Dec 12 07:42:57 2019][ 5.105068] random: systemd: uninitialized urandom read (16 bytes read) [Thu Dec 12 07:42:57 2019][ 5.111783] random: systemd: uninitialized urandom read (16 bytes read) [Thu Dec 12 07:42:57 2019][ 5.118693] random: systemd: uninitialized urandom read (16 bytes read) [Thu Dec 12 07:42:57 2019][ 5.125367] random: systemd: uninitialized urandom read (16 bytes read) [Thu Dec 12 07:42:57 2019][ 5.132182] random: systemd: uninitialized urandom read (16 bytes read) [Thu Dec 12 07:42:57 2019][ 5.141414] systemd[1]: Reached target Timers. [Thu Dec 12 07:42:57 2019][ OK ] Reached target Timers. [Thu Dec 12 07:42:57 2019][ 5.150461] systemd[1]: Created slice Root Slice. [Thu Dec 12 07:42:57 2019][ OK ] Created slice Root Slice. [Thu Dec 12 07:42:57 2019][ 5.159381] systemd[1]: Created slice System Slice. [Thu Dec 12 07:42:57 2019][ OK ] Created slice System Slice. [Thu Dec 12 07:42:57 2019][ 5.170368] systemd[1]: Listening on Journal Socket. [Thu Dec 12 07:42:57 2019][ OK ] Listening on Journal Socket. [Thu Dec 12 07:42:57 2019][ 5.176282] usb 1-1: new high-speed USB device number 2 using xhci_hcd [Thu Dec 12 07:42:57 2019][ 5.186628] systemd[1]: Starting Journal Service... [Thu Dec 12 07:42:57 2019] Starting Journal Service... [Thu Dec 12 07:42:57 2019][ 5.196705] systemd[1]: Starting Apply Kernel Variables... [Thu Dec 12 07:42:57 2019] Starting Apply Kernel Variables... [Thu Dec 12 07:42:57 2019][ 5.212360] systemd[1]: Listening on udev Kernel Socket. [Thu Dec 12 07:42:57 2019][ OK ] Listening on udev Kernel Socket. [Thu Dec 12 07:42:57 2019][ 5.226340] systemd[1]: Reached target Swap. [Thu Dec 12 07:42:57 2019][ OK ] Reached target Swap. [Thu Dec 12 07:42:57 2019][ 5.236341] systemd[1]: Reached target Slices. [Thu Dec 12 07:42:57 2019][ OK ] Reached target Slices. [Thu Dec 12 07:42:57 2019][ 5.247373] systemd[1]: Started Dispatch Password Requests to Console Directory Watch. [Thu Dec 12 07:42:57 2019][ OK ] Started Dispatch Password Requests to Console Directory Watch. [Thu Dec 12 07:42:57 2019][ 5.257284] usb 3-1: new high-speed USB device number 2 using xhci_hcd [Thu Dec 12 07:42:57 2019][ 5.270334] systemd[1]: Reached target Paths. [Thu Dec 12 07:42:57 2019][ 5.274791] random: fast init done [Thu Dec 12 07:42:57 2019][ OK ] Reached target Paths. [Thu Dec 12 07:42:57 2019][ 5.282363] systemd[1]: Listening on udev Control Socket. [Thu Dec 12 07:42:57 2019][ OK ] Listening on udev Control Socket. [Thu Dec 12 07:42:57 2019][ 5.293335] systemd[1]: Reached target Sockets. [Thu Dec 12 07:42:57 2019][ OK ] Reached target Sockets. [Thu Dec 12 07:42:57 2019][ 5.302668] systemd[1]: Starting Create list of required static device nodes for the current kernel... 
[Thu Dec 12 07:42:57 2019] Starting Create list of required st... nodes for the current kernel... [Thu Dec 12 07:42:57 2019][ 5.314774] usb 1-1: New USB device found, idVendor=0424, idProduct=2744 [Thu Dec 12 07:42:57 2019][ 5.321792] usb 1-1: New USB device strings: Mfr=1, Product=2, SerialNumber=0 [Thu Dec 12 07:42:57 2019][ 5.330311] usb 1-1: Product: USB2734 [Thu Dec 12 07:42:57 2019][ 5.335360] usb 1-1: Manufacturer: Microchip Tech [Thu Dec 12 07:42:57 2019][ 5.344555] hub 1-1:1.0: USB hub found [Thu Dec 12 07:42:57 2019][ 5.348600] systemd[1]: Starting dracut pre-udev hook... [Thu Dec 12 07:42:57 2019][ 5.354330] hub 1-1:1.0: 4 ports detected [Thu Dec 12 07:42:57 2019] Starting dracut pre-udev hook... [Thu Dec 12 07:42:57 2019][ 5.365555] systemd[1]: Started Journal Service. [Thu Dec 12 07:42:57 2019][ OK ] Started Journal Service. [Thu Dec 12 07:42:57 2019][ OK ] Started Apply Kernel Variables. [Thu Dec 12 07:42:57 2019][ OK ] Started Create list of required sta...ce nodes for the current kernel. [Thu Dec 12 07:42:57 2019][ OK ] Started dracut pre-udev hook. [Thu Dec 12 07:42:57 2019] Starting Create Static Device Nodes in /dev... [Thu Dec 12 07:42:57 2019][ 5.404202] usb 3-1: New USB device found, idVendor=1604, idProduct=10c0 [Thu Dec 12 07:42:57 2019][ 5.410903] usb 3-1: New USB device strings: Mfr=0, Product=0, SerialNumber=0 [Thu Dec 12 07:42:57 2019][ OK ] Started Create Static Device Nodes in /dev. [Thu Dec 12 07:42:57 2019][ 5.424406] usb 2-1: new SuperSpeed USB device number 2 using xhci_hcd [Thu Dec 12 07:42:57 2019] Starting udev Kernel Device Manager... [Thu Dec 12 07:42:57 2019][ 5.434105] hub 3-1:1.0: USB hub found [Thu Dec 12 07:42:57 2019][ 5.439954] hub 3-1:1.0: 4 ports detected [Thu Dec 12 07:42:57 2019][ OK ] Started udev Kernel Device Manager. [Thu Dec 12 07:42:57 2019][ 5.447652] usb 2-1: New USB device found, idVendor=0424, idProduct=5744 [Thu Dec 12 07:42:57 2019][ 5.455304] usb 2-1: New USB device strings: Mfr=2, Product=3, SerialNumber=0 [Thu Dec 12 07:42:57 2019][ 5.463006] usb 2-1: Product: USB5734 [Thu Dec 12 07:42:57 2019][ 5.466672] usb 2-1: Manufacturer: Microchip Tech [Thu Dec 12 07:42:57 2019] Starting udev Coldplug all Devices... [Thu Dec 12 07:42:57 2019] Mounting Configuration File System... [Thu Dec 12 07:42:57 2019][ 5.488560] hub 2-1:1.0: USB hub found [Thu Dec 12 07:42:57 2019][ 5.497304] hub 2-1:1.0: 4 ports detected [Thu Dec 12 07:42:57 2019][ OK ] Mounted Configuration File System. [Thu Dec 12 07:42:57 2019][ 5.508139] usb: port power management may be unreliable [Thu Dec 12 07:42:57 2019][ OK ] Started udev Coldplug all Devices. [Thu Dec 12 07:42:57 2019] Starting dracut initqueue hook... [Thu Dec 12 07:42:57 2019][ OK ] Started dracut initqueue hook. [Thu Dec 12 07:42:57 2019][ OK ] Reached target Initrd Root File System. [Thu Dec 12 07:42:57 2019] Starting Reload Configuration from the Real Root... [Thu Dec 12 07:42:57 2019][ OK ] Reached target Remote File Systems (Pre). [Thu Dec 12 07:42:57 2019][ OK ] Reached target Remote File Systems. [Thu Dec 12 07:42:57 2019][ 5.686680] ahci 0000:86:00.2: AHCI 0001.0301 32 slots 1 ports 6 Gbps 0x1 impl SATA mode [Thu Dec 12 07:42:57 2019][ 5.694779] ahci 0000:86:00.2: flags: 64bit ncq sntf ilck pm led clo only pmp fbs pio slum part [Thu Dec 12 07:42:58 2019][ 5.741350] scsi host0: ahci [Thu Dec 12 07:42:58 2019][ 5.748780] ata1: SATA max UDMA/133 abar m4096@0xc0a02000 port 0xc0a02100 irq 56 [Thu Dec 12 07:42:58 2019][ OK ] Started Reload Configuration from the Real Root. 
[Thu Dec 12 07:42:58 2019][ OK ] Reached target Initrd File Systems. [Thu Dec 12 07:42:58 2019][ 5.811461] megasas: 07.705.02.00-rh1 [Thu Dec 12 07:42:58 2019][ 5.822317] megaraid_sas 0000:c1:00.0: Waiting for FW to come to ready state [Thu Dec 12 07:42:58 2019][ 5.877294] megaraid_sas 0000:c1:00.0: FW now in Ready state [Thu Dec 12 07:42:58 2019][ 5.882954] megaraid_sas 0000:c1:00.0: 64 bit DMA mask and 32 bit consistent mask [Thu Dec 12 07:42:58 2019][ 5.890751] megaraid_sas 0000:c1:00.0: firmware supports msix : (96) [Thu Dec 12 07:42:58 2019][ 5.897112] megaraid_sas 0000:c1:00.0: current msix/online cpus : (1/1) [Thu Dec 12 07:42:58 2019][ 5.903727] megaraid_sas 0000:c1:00.0: RDPQ mode : (disabled) [Thu Dec 12 07:42:58 2019][ 5.909473] megaraid_sas 0000:c1:00.0: Current firmware supports maximum commands: 240 LDIO threshold: 0 [Thu Dec 12 07:42:58 2019][ 5.919323] megaraid_sas 0000:c1:00.0: Configured max firmware commands: 99 [Thu Dec 12 07:42:58 2019][ 5.926386] megaraid_sas 0000:c1:00.0: FW supports sync cache : No [Thu Dec 12 07:42:58 2019][ 6.063311] ata1: SATA link down (SStatus 0 SControl 300) [Thu Dec 12 07:42:58 2019][ 6.100305] megaraid_sas 0000:c1:00.0: Init cmd return status SUCCESS for SCSI host 1 [Thu Dec 12 07:42:58 2019][ 6.121308] usb 3-1.1: new high-speed USB device number 3 using xhci_hcd [Thu Dec 12 07:42:58 2019][ 6.128399] megaraid_sas 0000:c1:00.0: firmware type : Legacy(64 VD) firmware [Thu Dec 12 07:42:58 2019][ 6.135534] megaraid_sas 0000:c1:00.0: controller type : iMR(0MB) [Thu Dec 12 07:42:58 2019][ 6.141633] megaraid_sas 0000:c1:00.0: Online Controller Reset(OCR) : Enabled [Thu Dec 12 07:42:58 2019][ 6.148764] megaraid_sas 0000:c1:00.0: Secure JBOD support : No [Thu Dec 12 07:42:58 2019][ 6.154685] megaraid_sas 0000:c1:00.0: NVMe passthru support : No [Thu Dec 12 07:42:58 2019][ 6.194516] megaraid_sas 0000:c1:00.0: INIT adapter done [Thu Dec 12 07:42:58 2019][ 6.199832] megaraid_sas 0000:c1:00.0: Jbod map is not supported megasas_setup_jbod_map 5146 [Thu Dec 12 07:42:58 2019][ 6.217707] megaraid_sas 0000:c1:00.0: pci id : (0x1000)/(0x005f)/(0x1028)/(0x1f4b) [Thu Dec 12 07:42:58 2019][ 6.225458] megaraid_sas 0000:c1:00.0: unevenspan support : yes [Thu Dec 12 07:42:58 2019][ 6.231383] megaraid_sas 0000:c1:00.0: firmware crash dump : no [Thu Dec 12 07:42:58 2019][ 6.237308] megaraid_sas 0000:c1:00.0: jbod sync map : no [Thu Dec 12 07:42:58 2019][ 6.242800] scsi host1: Avago SAS based MegaRAID driver [Thu Dec 12 07:42:58 2019][ 6.261226] usb 3-1.1: New USB device found, idVendor=1604, idProduct=10c0 [Thu Dec 12 07:42:58 2019][ 6.268102] usb 3-1.1: New USB device strings: Mfr=0, Product=0, SerialNumber=0 [Thu Dec 12 07:42:58 2019][ 6.276311] scsi 1:2:0:0: Direct-Access DELL PERC H330 Mini 4.30 PQ: 0 ANSI: 5 [Thu Dec 12 07:42:58 2019][ 6.339004] hub 3-1.1:1.0: USB hub found [Thu Dec 12 07:42:58 2019][ 6.345728] hub 3-1.1:1.0: 4 ports detected [Thu Dec 12 07:42:58 2019][ 6.422313] usb 3-1.4: new high-speed USB device number 4 using xhci_hcd [Thu Dec 12 07:42:58 2019][ 6.432574] sd 1:2:0:0: [sda] 467664896 512-byte logical blocks: (239 GB/223 GiB) [Thu Dec 12 07:42:58 2019][ 6.440243] sd 1:2:0:0: [sda] Write Protect is off [Thu Dec 12 07:42:58 2019][ 6.445295] sd 1:2:0:0: [sda] Write cache: disabled, read cache: disabled, supports DPO and FUA [Thu Dec 12 07:42:58 2019][ 6.456358] sda: sda1 sda2 sda3 [Thu Dec 12 07:42:58 2019][ 6.459983] sd 1:2:0:0: [sda] Attached SCSI disk [Thu Dec 12 07:42:58 2019][ OK ] Found device PERC_H330_Mini os. 
[Thu Dec 12 07:42:58 2019] Starting File System Check on /dev/...4-e7db-49b7-baed-d6c7905c5cdc... [Thu Dec 12 07:42:58 2019][ 6.528234] usb 3-1.4: New USB device found, idVendor=1604, idProduct=10c0 [Thu Dec 12 07:42:58 2019][ 6.535116] usb 3-1.4: New USB device strings: Mfr=0, Product=0, SerialNumber=0 [Thu Dec 12 07:42:58 2019][ 6.554134] hub 3-1.4:1.0: USB hub found [Thu Dec 12 07:42:58 2019][ 6.558857] hub 3-1.4:1.0: 4 ports detected [Thu Dec 12 07:42:58 2019][ OK ] Started File System Check on /dev/d...4c4-e7db-49b7-baed-d6c7905c5cdc. [Thu Dec 12 07:42:58 2019] Mounting /sysroot... [Thu Dec 12 07:42:58 2019][ 6.675251] EXT4-fs (sda2): mounted filesystem with ordered data mode. Opts: (null) [Thu Dec 12 07:42:58 2019][ OK ] Mounted /sysroot. [Thu Dec 12 07:42:58 2019][ OK ] Reached target Local File Systems. [Thu Dec 12 07:42:58 2019][ OK ] Reached target System Initialization. [Thu Dec 12 07:42:58 2019][ OK ] Reached target Basic System. [Thu Dec 12 07:42:59 2019][ OK ] Reached target Initrd Default Target. [Thu Dec 12 07:42:59 2019] Starting dracut pre-pivot and cleanup hook... [Thu Dec 12 07:42:59 2019][ OK ] Started dracut pre-pivot and cleanup hook. [Thu Dec 12 07:42:59 2019] Starting Kdump Vmcore Save Service... [Thu Dec 12 07:42:59 2019]kdump: dump target is /dev/sda2 [ 6.779041] EXT4-fs (sda2): re-mounted. Opts: (null) [Thu Dec 12 07:42:59 2019] [Thu Dec 12 07:42:59 2019]kdump: saving to /sysroot//var/crash/127.0.0.1-2019-12-12-07:42:58/ [Thu Dec 12 07:42:59 2019]kdump: saving vmcore-dmesg.txt [Thu Dec 12 07:42:59 2019]kdump: saving vmcore-dmesg.txt complete [Thu Dec 12 07:42:59 2019]kdump: saving vmcore [Thu Dec 12 07:42:59 2019] Checking for memory holes : [ 0.0 %] / Checking for memory holes : [100.0 %] | Excluding unnecessary pages : [ 25.8 %] \ Excluding unnecessary pages : [100.0 %] - Checking for memory holes : [100.0 %] / Excluding unnecessary pages : [ 68.5 %] | Excluding unnecessary pages : [100.0 %] \ Checking for memory holes : [100.0 %] - Excluding unnecessary pages : [100.0 %] / Checking for memory holes : [100.0 %] | Checking for memory holes : [100.0 %] \ Checking for memory holes : [100.0 %] - Checking for memory holes : [100.0 %] / Excluding unnecessary pages : [ 6.2 %] | Excluding unnecessary pages : [100.0 %] \ Copying data : [ 0.5 %] - eta: 7s[ 10.757381] random: crng init done [Thu Dec 12 07:43:03 2019] Copying data : [ 5.7 %] / eta: 17s Copying data : [ 9.5 %] | eta: 19s Copying data : [ 13.1 %] \ eta: 20s Copying data : [ 16.5 %] - eta: 20s Copying data : [ 20.0 %] / eta: 20s Copying data : [ 22.2 %] | eta: 21s Copying data : [ 25.8 %] \ eta: 20s Copying data : [ 29.8 %] - eta: 18s Copying data : [ 33.8 %] / eta: 17s Copying data : [ 38.2 %] | eta: 16s Copying data : [ 39.5 %] \ eta: 16s Copying data : [ 40.9 %] - eta: 17s Checking for memory holes : [100.0 %] / Excluding unnecessary pages : [ 59.8 %] | Excluding unnecessary pages : [100.0 %] \ Copying data : [ 42.0 %] - eta: 0s Copying data : [ 43.5 %] / eta: 1s Copying data : [ 45.9 %] | eta: 2s Copying data : [ 48.1 %] \ eta: 3s Copying data : [ 52.6 %] - eta: 3s Copying data : [ 58.2 %] / eta: 3s Copying data : [ 63.3 %] | eta: 3s Copying data : [ 67.5 %] \ eta: 3s Copying data : [ 68.9 %] - eta: 3s Copying data : [ 70.9 %] / eta: 3s Copying data : [ 72.1 %] | eta: 3s Copying data : [ 73.8 %] \ eta: 4s Copying data : [ 76.6 %] - eta: 3s Copying data : [ 79.0 %] / eta: 3s Copying data : [ 82.8 %] | eta: 3s Copying data : [ 86.2 %] \ eta: 2s Copying data : [ 90.2 %] - eta: 1s Copying data : [ 91.8 %] / 
[Thu Dec 12 07:43:38 2019][ 46.123261] systemd-shutdown[1]: Syncing filesystems and block devices.
[Thu Dec 12 07:43:38 2019][ 46.130010] systemd-shutdown[1]: Sending SIGTERM to remaining processes...
[Thu Dec 12 07:43:38 2019][ 46.139007] systemd-journald[82]: Received SIGTERM from PID 1 (systemd-shutdow).
[Thu Dec 12 07:43:38 2019][ 46.147161] systemd-shutdown[1]: Sending SIGKILL to remaining processes...
[Thu Dec 12 07:43:38 2019][ 46.155787] systemd-shutdown[1]: Unmounting file systems.
[Thu Dec 12 07:43:38 2019][ 46.161407] systemd-shutdown[271]: Remounting '/sysroot' read-only in with options 'data=ordered'.
[Thu Dec 12 07:43:38 2019][ 46.171252] EXT4-fs (sda2): re-mounted. Opts: data=ordered
[Thu Dec 12 07:43:38 2019][ 46.176904] systemd-shutdown[272]: Unmounting '/sysroot'.
[Thu Dec 12 07:43:38 2019][ 46.205276] systemd-shutdown[273]: Remounting '/' read-only in with options 'size=71208k,nr_inodes=17802'.
[Thu Dec 12 07:43:38 2019][ 46.215329] systemd-shutdown[274]: Remounting '/' read-only in with options 'size=71208k,nr_inodes=17802'.
[Thu Dec 12 07:43:38 2019][ 46.225267] systemd-shutdown[1]: All filesystems unmounted.
[Thu Dec 12 07:43:38 2019][ 46.230844] systemd-shutdown[1]: Deactivating swaps.
[Thu Dec 12 07:43:38 2019][ 46.235858] systemd-shutdown[1]: All swaps deactivated.
[Thu Dec 12 07:43:38 2019][ 46.241090] systemd-shutdown[1]: Detaching loop devices.
[Thu Dec 12 07:43:38 2019][ 46.246523] systemd-shutdown[1]: All loop devices detached.
[Thu Dec 12 07:43:38 2019][ 46.252103] systemd-shutdown[1]: Detaching DM devices.
[Thu Dec 12 07:43:38 2019][ 46.257323] systemd-shutdown[1]: All DM devices detached.
[Thu Dec 12 07:43:38 2019][ 46.262896] systemd-shutdown[1]: Syncing filesystems and block devices.
[Thu Dec 12 07:43:38 2019][ 46.269586] systemd-shutdown[1]: Rebooting.
[Thu Dec 12 07:43:38 2019][ 46.314574] Restarting system.
[Thu Dec 12 07:43:38 2019][ 46.317640] reboot: machine restart
[Thu Dec 12 07:44:17 2019]KEY MAPPING FOR CONSOLE REDIRECTION:
[Thu Dec 12 07:44:18 2019]Press the spacebar to pause...
[Thu Dec 12 07:44:20 2019]Initializing PCIe, USB, and Video... Done
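Once the node is back up, the vmcore and vmcore-dmesg.txt written before the reboot above can be examined offline. A sketch, assuming the crash utility and the matching kernel-debuginfo package are installed; the dump directory is the one kdump reported earlier (the /sysroot prefix is only visible inside the kdump initramfs):

    # Kernel log extracted from the dump:
    less /var/crash/127.0.0.1-2019-12-12-07:42:58/vmcore-dmesg.txt
    # Interactive analysis of the dump against the matching debug vmlinux:
    crash /usr/lib/debug/lib/modules/3.10.0-957.27.2.el7_lustre.pl2.x86_64/vmlinux \
          /var/crash/127.0.0.1-2019-12-12-07:42:58/vmcore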
[Thu Dec 12 07:44:23 2019]PowerEdge R6415
[Thu Dec 12 07:44:23 2019]BIOS Version: 1.10.6
[Thu Dec 12 07:44:23 2019]Console Redirection Enabled Requested by iDRAC
[Thu Dec 12 07:44:23 2019]F2 = System Setup
[Thu Dec 12 07:44:23 2019]F10 = Lifecycle Controller (Config
[Thu Dec 12 07:44:23 2019] iDRAC, Update FW, Install OS)
[Thu Dec 12 07:44:23 2019]F11 = Boot Manager
[Thu Dec 12 07:44:23 2019]F12 = PXE Boot
[Thu Dec 12 07:44:23 2019]iDRAC IP: 10.0.1.115
[Thu Dec 12 07:44:24 2019]Initializing Firmware Interfaces...
[Thu Dec 12 07:46:00 2019]Enumerating Boot options...
[Thu Dec 12 07:46:00 2019]Enumerating Boot options... Done
[Thu Dec 12 07:46:04 2019]Loading Lifecycle Controller Drivers...
[Thu Dec 12 07:46:06 2019]Loading Lifecycle Controller Drivers...Done
[Thu Dec 12 07:46:06 2019]Lifecycle Controller: Collecting System Inventory...
[Thu Dec 12 07:46:11 2019]iDRAC IP: 10.0.1.115
[Thu Dec 12 07:46:11 2019]Lifecycle Controller: Done
[Thu Dec 12 07:46:11 2019]Booting...
[Thu Dec 12 07:46:11 2019]Booting from Integrated RAID Controller 1: CentOS
[Thu Dec 12 07:46:12 2019]Use the ^ and v keys to change the selection.
[Thu Dec 12 07:46:12 2019] Press 'e' to edit the selected item, or 'c' for a command prompt.
  CentOS Linux (3.10.0-957.27.2.el7_lustre.pl2.x86_64) 7 (Core)
  CentOS Linux (3.10.0-957.27.2.el7.x86_64) 7 (Core)
  CentOS Linux (0-rescue-43a7619e3d434f55b9e35bf69fecdb89) 7 (Core)
 The selected entry will be started automatically in 5s.
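The GRUB menu above lists the Lustre-patched kernel first and boots it by default. A sketch of how to confirm the default entry from the running system, using the stock CentOS 7 tooling (the command output shown is an assumption, not captured from this host):

    grubby --default-kernel
    #   /boot/vmlinuz-3.10.0-957.27.2.el7_lustre.pl2.x86_64
    grub2-editenv list
    #   saved_entry=CentOS Linux (3.10.0-957.27.2.el7_lustre.pl2.x86_64) 7 (Core)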
[?25h[ 0.000000] Initializing cgroup subsys cpuset [Thu Dec 12 07:46:19 2019][ 0.000000] Initializing cgroup subsys cpu [Thu Dec 12 07:46:19 2019][ 0.000000] Initializing cgroup subsys cpuacct [Thu Dec 12 07:46:19 2019][ 0.000000] Linux version 3.10.0-957.27.2.el7_lustre.pl2.x86_64 (sthiell@oak-rbh01) (gcc version 4.8.5 20150623 (Red Hat 4.8.5-39) (GCC) ) #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 07:46:19 2019][ 0.000000] Command line: BOOT_IMAGE=/boot/vmlinuz-3.10.0-957.27.2.el7_lustre.pl2.x86_64 root=UUID=c4f754c4-e7db-49b7-baed-d6c7905c5cdc ro crashkernel=auto nomodeset console=ttyS0,115200 LANG=en_US.UTF-8 [Thu Dec 12 07:46:19 2019][ 0.000000] e820: BIOS-provided physical RAM map: [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x0000000000000000-0x000000000008efff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x000000000008f000-0x000000000008ffff] ACPI NVS [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x0000000000090000-0x000000000009ffff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x0000000000100000-0x000000004f773fff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x000000004f774000-0x000000005777cfff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x000000005777d000-0x000000006cacefff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x000000006cacf000-0x000000006efcefff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x000000006efcf000-0x000000006fdfefff] ACPI NVS [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x000000006fdff000-0x000000006fffefff] ACPI data [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x000000006ffff000-0x000000006fffffff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x0000000070000000-0x000000008fffffff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x00000000fec10000-0x00000000fec10fff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x00000000fed80000-0x00000000fed80fff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x0000000100000000-0x000000107f37ffff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x000000107f380000-0x000000107fffffff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x0000001080000000-0x000000207ff7ffff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x000000207ff80000-0x000000207fffffff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x0000002080000000-0x000000307ff7ffff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x000000307ff80000-0x000000307fffffff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x0000003080000000-0x000000407ff7ffff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] BIOS-e820: [mem 0x000000407ff80000-0x000000407fffffff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] NX (Execute Disable) protection: active [Thu Dec 12 07:46:19 2019][ 0.000000] extended physical RAM map: [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000000000000000-0x000000000008efff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x000000000008f000-0x000000000008ffff] ACPI NVS [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000000000090000-0x000000000009ffff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000000000100000-0x0000000037ac001f] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000000037ac0020-0x0000000037ad865f] usable [Thu Dec 12 07:46:19 2019][ 0.000000] 
reserve setup_data: [mem 0x0000000037ad8660-0x0000000037ad901f] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000000037ad9020-0x0000000037b0265f] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b02660-0x0000000037b0301f] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b03020-0x0000000037b0b05f] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b0b060-0x0000000037b0c01f] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b0c020-0x0000000037b3dc5f] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b3dc60-0x0000000037b3e01f] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b3e020-0x0000000037b6fc5f] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b6fc60-0x0000000037b7001f] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b70020-0x0000000037c11c5f] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000000037c11c60-0x000000004f773fff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x000000004f774000-0x000000005777cfff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x000000005777d000-0x000000006cacefff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x000000006cacf000-0x000000006efcefff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x000000006efcf000-0x000000006fdfefff] ACPI NVS [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x000000006fdff000-0x000000006fffefff] ACPI data [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x000000006ffff000-0x000000006fffffff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000000070000000-0x000000008fffffff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x00000000fec10000-0x00000000fec10fff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x00000000fed80000-0x00000000fed80fff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000000100000000-0x000000107f37ffff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x000000107f380000-0x000000107fffffff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000001080000000-0x000000207ff7ffff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x000000207ff80000-0x000000207fffffff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000002080000000-0x000000307ff7ffff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x000000307ff80000-0x000000307fffffff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x0000003080000000-0x000000407ff7ffff] usable [Thu Dec 12 07:46:19 2019][ 0.000000] reserve setup_data: [mem 0x000000407ff80000-0x000000407fffffff] reserved [Thu Dec 12 07:46:19 2019][ 0.000000] efi: EFI v2.50 by Dell Inc. 
[Thu Dec 12 07:46:19 2019][ 0.000000] efi: ACPI=0x6fffe000 ACPI 2.0=0x6fffe014 SMBIOS=0x6eab5000 SMBIOS 3.0=0x6eab3000 [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem00: type=3, attr=0xf, range=[0x0000000000000000-0x0000000000001000) (0MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem01: type=2, attr=0xf, range=[0x0000000000001000-0x0000000000002000) (0MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem02: type=7, attr=0xf, range=[0x0000000000002000-0x0000000000010000) (0MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem03: type=3, attr=0xf, range=[0x0000000000010000-0x0000000000014000) (0MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem04: type=7, attr=0xf, range=[0x0000000000014000-0x0000000000063000) (0MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem05: type=3, attr=0xf, range=[0x0000000000063000-0x000000000008f000) (0MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem06: type=10, attr=0xf, range=[0x000000000008f000-0x0000000000090000) (0MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem07: type=3, attr=0xf, range=[0x0000000000090000-0x00000000000a0000) (0MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem08: type=4, attr=0xf, range=[0x0000000000100000-0x0000000000120000) (0MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem09: type=7, attr=0xf, range=[0x0000000000120000-0x0000000000c00000) (10MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem10: type=3, attr=0xf, range=[0x0000000000c00000-0x0000000001000000) (4MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem11: type=2, attr=0xf, range=[0x0000000001000000-0x000000000267b000) (22MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem12: type=7, attr=0xf, range=[0x000000000267b000-0x0000000004000000) (25MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem13: type=4, attr=0xf, range=[0x0000000004000000-0x000000000403b000) (0MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem14: type=7, attr=0xf, range=[0x000000000403b000-0x0000000037ac0000) (826MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem15: type=2, attr=0xf, range=[0x0000000037ac0000-0x000000004edd7000) (371MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem16: type=7, attr=0xf, range=[0x000000004edd7000-0x000000004eddb000) (0MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem17: type=2, attr=0xf, range=[0x000000004eddb000-0x000000004eddd000) (0MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem18: type=1, attr=0xf, range=[0x000000004eddd000-0x000000004eefa000) (1MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem19: type=2, attr=0xf, range=[0x000000004eefa000-0x000000004f019000) (1MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem20: type=1, attr=0xf, range=[0x000000004f019000-0x000000004f128000) (1MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem21: type=3, attr=0xf, range=[0x000000004f128000-0x000000004f774000) (6MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem22: type=0, attr=0xf, range=[0x000000004f774000-0x000000005777d000) (128MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem23: type=3, attr=0xf, range=[0x000000005777d000-0x000000005796e000) (1MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem24: type=4, attr=0xf, range=[0x000000005796e000-0x000000005b4cf000) (59MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem25: type=3, attr=0xf, range=[0x000000005b4cf000-0x000000005b8cf000) (4MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem26: type=7, attr=0xf, range=[0x000000005b8cf000-0x0000000064a36000) (145MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem27: type=4, attr=0xf, range=[0x0000000064a36000-0x0000000064a43000) (0MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem28: type=7, 
attr=0xf, range=[0x0000000064a43000-0x0000000064a47000) (0MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem29: type=4, attr=0xf, range=[0x0000000064a47000-0x0000000065061000) (6MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem30: type=7, attr=0xf, range=[0x0000000065061000-0x0000000065062000) (0MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem31: type=4, attr=0xf, range=[0x0000000065062000-0x0000000065069000) (0MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem32: type=7, attr=0xf, range=[0x0000000065069000-0x000000006506a000) (0MB) [Thu Dec 12 07:46:19 2019][ 0.000000] efi: mem33: type=4, attr=0xf, range=[0x000000006506a000-0x000000006506b000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem34: type=7, attr=0xf, range=[0x000000006506b000-0x000000006506c000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem35: type=4, attr=0xf, range=[0x000000006506c000-0x0000000065076000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem36: type=7, attr=0xf, range=[0x0000000065076000-0x0000000065077000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem37: type=4, attr=0xf, range=[0x0000000065077000-0x000000006507d000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem38: type=7, attr=0xf, range=[0x000000006507d000-0x000000006507e000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem39: type=4, attr=0xf, range=[0x000000006507e000-0x0000000065083000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem40: type=7, attr=0xf, range=[0x0000000065083000-0x0000000065086000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem41: type=4, attr=0xf, range=[0x0000000065086000-0x0000000065093000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem42: type=7, attr=0xf, range=[0x0000000065093000-0x0000000065094000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem43: type=4, attr=0xf, range=[0x0000000065094000-0x000000006509f000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem44: type=7, attr=0xf, range=[0x000000006509f000-0x00000000650a0000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem45: type=4, attr=0xf, range=[0x00000000650a0000-0x00000000650a1000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem46: type=7, attr=0xf, range=[0x00000000650a1000-0x00000000650a2000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem47: type=4, attr=0xf, range=[0x00000000650a2000-0x00000000650aa000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem48: type=7, attr=0xf, range=[0x00000000650aa000-0x00000000650ab000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem49: type=4, attr=0xf, range=[0x00000000650ab000-0x00000000650ad000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem50: type=7, attr=0xf, range=[0x00000000650ad000-0x00000000650ae000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem51: type=4, attr=0xf, range=[0x00000000650ae000-0x00000000650b4000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem52: type=7, attr=0xf, range=[0x00000000650b4000-0x00000000650b5000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem53: type=4, attr=0xf, range=[0x00000000650b5000-0x00000000650d5000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem54: type=7, attr=0xf, range=[0x00000000650d5000-0x00000000650d6000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem55: type=4, attr=0xf, range=[0x00000000650d6000-0x0000000065432000) (3MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem56: type=7, attr=0xf, range=[0x0000000065432000-0x0000000065433000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem57: type=4, attr=0xf, range=[0x0000000065433000-0x000000006543b000) (0MB) [Thu Dec 12 
07:46:20 2019][ 0.000000] efi: mem58: type=7, attr=0xf, range=[0x000000006543b000-0x000000006543c000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem59: type=4, attr=0xf, range=[0x000000006543c000-0x000000006544e000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem60: type=7, attr=0xf, range=[0x000000006544e000-0x000000006544f000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem61: type=4, attr=0xf, range=[0x000000006544f000-0x0000000065463000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem62: type=7, attr=0xf, range=[0x0000000065463000-0x0000000065464000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem63: type=4, attr=0xf, range=[0x0000000065464000-0x0000000065473000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem64: type=7, attr=0xf, range=[0x0000000065473000-0x0000000065474000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem65: type=4, attr=0xf, range=[0x0000000065474000-0x00000000654c5000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem66: type=7, attr=0xf, range=[0x00000000654c5000-0x00000000654c6000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem67: type=4, attr=0xf, range=[0x00000000654c6000-0x00000000654d9000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem68: type=7, attr=0xf, range=[0x00000000654d9000-0x00000000654db000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem69: type=4, attr=0xf, range=[0x00000000654db000-0x00000000654e0000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem70: type=7, attr=0xf, range=[0x00000000654e0000-0x00000000654e1000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem71: type=4, attr=0xf, range=[0x00000000654e1000-0x00000000654fa000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem72: type=7, attr=0xf, range=[0x00000000654fa000-0x00000000654fb000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem73: type=4, attr=0xf, range=[0x00000000654fb000-0x0000000065508000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem74: type=7, attr=0xf, range=[0x0000000065508000-0x0000000065509000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem75: type=4, attr=0xf, range=[0x0000000065509000-0x000000006550b000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem76: type=7, attr=0xf, range=[0x000000006550b000-0x000000006550c000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem77: type=4, attr=0xf, range=[0x000000006550c000-0x000000006550e000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem78: type=7, attr=0xf, range=[0x000000006550e000-0x000000006550f000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem79: type=4, attr=0xf, range=[0x000000006550f000-0x0000000065513000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem80: type=7, attr=0xf, range=[0x0000000065513000-0x0000000065514000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem81: type=4, attr=0xf, range=[0x0000000065514000-0x0000000065515000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem82: type=7, attr=0xf, range=[0x0000000065515000-0x0000000065516000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem83: type=4, attr=0xf, range=[0x0000000065516000-0x0000000065522000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem84: type=7, attr=0xf, range=[0x0000000065522000-0x0000000065523000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem85: type=4, attr=0xf, range=[0x0000000065523000-0x0000000065593000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem86: type=7, attr=0xf, range=[0x0000000065593000-0x0000000065594000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem87: type=4, attr=0xf, 
range=[0x0000000065594000-0x000000006559c000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem88: type=7, attr=0xf, range=[0x000000006559c000-0x000000006559d000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem89: type=4, attr=0xf, range=[0x000000006559d000-0x00000000655ba000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem90: type=7, attr=0xf, range=[0x00000000655ba000-0x00000000655bb000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem91: type=4, attr=0xf, range=[0x00000000655bb000-0x00000000655c4000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem92: type=7, attr=0xf, range=[0x00000000655c4000-0x00000000655c5000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem93: type=4, attr=0xf, range=[0x00000000655c5000-0x00000000655ea000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem94: type=7, attr=0xf, range=[0x00000000655ea000-0x00000000655eb000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem95: type=4, attr=0xf, range=[0x00000000655eb000-0x00000000655f1000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem96: type=7, attr=0xf, range=[0x00000000655f1000-0x00000000655f2000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem97: type=4, attr=0xf, range=[0x00000000655f2000-0x000000006b8cf000) (98MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem98: type=7, attr=0xf, range=[0x000000006b8cf000-0x000000006b8d0000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem99: type=3, attr=0xf, range=[0x000000006b8d0000-0x000000006cacf000) (17MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem100: type=6, attr=0x800000000000000f, range=[0x000000006cacf000-0x000000006cbcf000) (1MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem101: type=5, attr=0x800000000000000f, range=[0x000000006cbcf000-0x000000006cdcf000) (2MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem102: type=0, attr=0xf, range=[0x000000006cdcf000-0x000000006efcf000) (34MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem103: type=10, attr=0xf, range=[0x000000006efcf000-0x000000006fdff000) (14MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem104: type=9, attr=0xf, range=[0x000000006fdff000-0x000000006ffff000) (2MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem105: type=4, attr=0xf, range=[0x000000006ffff000-0x0000000070000000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem106: type=7, attr=0xf, range=[0x0000000100000000-0x000000107f380000) (63475MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem107: type=7, attr=0xf, range=[0x0000001080000000-0x000000207ff80000) (65535MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem108: type=7, attr=0xf, range=[0x0000002080000000-0x000000307ff80000) (65535MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem109: type=7, attr=0xf, range=[0x0000003080000000-0x000000407ff80000) (65535MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem110: type=0, attr=0x9, range=[0x0000000070000000-0x0000000080000000) (256MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem111: type=11, attr=0x800000000000000f, range=[0x0000000080000000-0x0000000090000000) (256MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem112: type=11, attr=0x800000000000000f, range=[0x00000000fec10000-0x00000000fec11000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem113: type=11, attr=0x800000000000000f, range=[0x00000000fed80000-0x00000000fed81000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem114: type=0, attr=0x0, range=[0x000000107f380000-0x0000001080000000) (12MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem115: type=0, attr=0x0, range=[0x000000207ff80000-0x0000002080000000) (0MB) [Thu Dec 12 07:46:20 
2019][ 0.000000] efi: mem116: type=0, attr=0x0, range=[0x000000307ff80000-0x0000003080000000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] efi: mem117: type=0, attr=0x0, range=[0x000000407ff80000-0x0000004080000000) (0MB) [Thu Dec 12 07:46:20 2019][ 0.000000] SMBIOS 3.2.0 present. [Thu Dec 12 07:46:20 2019][ 0.000000] DMI: Dell Inc. PowerEdge R6415/07YXFK, BIOS 1.10.6 08/15/2019 [Thu Dec 12 07:46:20 2019][ 0.000000] e820: last_pfn = 0x407ff80 max_arch_pfn = 0x400000000 [Thu Dec 12 07:46:20 2019][ 0.000000] PAT configuration [0-7]: WB WC UC- UC WB WP UC- UC [Thu Dec 12 07:46:20 2019][ 0.000000] e820: last_pfn = 0x70000 max_arch_pfn = 0x400000000 [Thu Dec 12 07:46:20 2019][ 0.000000] Using GB pages for direct mapping [Thu Dec 12 07:46:20 2019][ 0.000000] RAMDISK: [mem 0x37c12000-0x38f51fff] [Thu Dec 12 07:46:20 2019][ 0.000000] Early table checksum verification disabled [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: RSDP 000000006fffe014 00024 (v02 DELL ) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: XSDT 000000006fffd0e8 000AC (v01 DELL PE_SC3 00000002 DELL 00000001) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: FACP 000000006fff0000 00114 (v06 DELL PE_SC3 00000002 DELL 00000001) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: DSDT 000000006ffdc000 1038C (v02 DELL PE_SC3 00000002 DELL 00000001) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: FACS 000000006fdd3000 00040 [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: SSDT 000000006fffc000 000D2 (v02 DELL PE_SC3 00000002 MSFT 04000000) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: BERT 000000006fffb000 00030 (v01 DELL BERT 00000001 DELL 00000001) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: HEST 000000006fffa000 006DC (v01 DELL HEST 00000001 DELL 00000001) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: SSDT 000000006fff9000 00294 (v01 DELL PE_SC3 00000001 AMD 00000001) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: SRAT 000000006fff8000 00420 (v03 DELL PE_SC3 00000001 AMD 00000001) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: MSCT 000000006fff7000 0004E (v01 DELL PE_SC3 00000000 AMD 00000001) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: SLIT 000000006fff6000 0003C (v01 DELL PE_SC3 00000001 AMD 00000001) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: CRAT 000000006fff3000 02DC0 (v01 DELL PE_SC3 00000001 AMD 00000001) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: EINJ 000000006fff2000 00150 (v01 DELL PE_SC3 00000001 AMD 00000001) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: SLIC 000000006fff1000 00024 (v01 DELL PE_SC3 00000002 DELL 00000001) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: HPET 000000006ffef000 00038 (v01 DELL PE_SC3 00000002 DELL 00000001) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: APIC 000000006ffee000 004B2 (v03 DELL PE_SC3 00000002 DELL 00000001) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: MCFG 000000006ffed000 0003C (v01 DELL PE_SC3 00000002 DELL 00000001) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: SSDT 000000006ffdb000 00629 (v02 DELL xhc_port 00000001 INTL 20170119) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: IVRS 000000006ffda000 00210 (v02 DELL PE_SC3 00000001 AMD 00000000) [Thu Dec 12 07:46:20 2019][ 0.000000] ACPI: SSDT 000000006ffd8000 01658 (v01 AMD CPMCMN 00000001 INTL 20170119) [Thu Dec 12 07:46:20 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x00 -> Node 0 [Thu Dec 12 07:46:20 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x01 -> Node 0 [Thu Dec 12 07:46:20 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x02 -> Node 0 [Thu Dec 12 07:46:20 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x03 -> Node 0 [Thu Dec 12 07:46:20 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x04 -> Node 0 [Thu Dec 12 
07:46:20 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x05 -> Node 0 [Thu Dec 12 07:46:20 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x08 -> Node 0 [Thu Dec 12 07:46:20 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x09 -> Node 0 [Thu Dec 12 07:46:20 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x0a -> Node 0 [Thu Dec 12 07:46:20 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x0b -> Node 0 [Thu Dec 12 07:46:20 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x0c -> Node 0 [Thu Dec 12 07:46:20 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x0d -> Node 0 [Thu Dec 12 07:46:20 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x10 -> Node 1 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x11 -> Node 1 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x12 -> Node 1 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x13 -> Node 1 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x14 -> Node 1 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x15 -> Node 1 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x18 -> Node 1 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x19 -> Node 1 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x1a -> Node 1 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x1b -> Node 1 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x1c -> Node 1 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x1d -> Node 1 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x20 -> Node 2 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x21 -> Node 2 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x22 -> Node 2 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x23 -> Node 2 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x24 -> Node 2 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x25 -> Node 2 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x28 -> Node 2 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x29 -> Node 2 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x2a -> Node 2 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x2b -> Node 2 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x2c -> Node 2 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x2d -> Node 2 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x30 -> Node 3 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x31 -> Node 3 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x32 -> Node 3 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x33 -> Node 3 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x34 -> Node 3 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x35 -> Node 3 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x38 -> Node 3 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x39 -> Node 3 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x3a -> Node 3 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x3b -> Node 3 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x3c -> Node 3 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x3d -> Node 3 [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: Node 0 PXM 0 [mem 0x100000000-0x107fffffff] [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: Node 1 PXM 1 [mem 0x1080000000-0x207fffffff] [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: Node 2 PXM 2 [mem 
0x2080000000-0x307fffffff] [Thu Dec 12 07:46:21 2019][ 0.000000] SRAT: Node 3 PXM 3 [mem 0x3080000000-0x407fffffff] [Thu Dec 12 07:46:21 2019][ 0.000000] NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff] [Thu Dec 12 07:46:21 2019][ 0.000000] NUMA: Node 0 [mem 0x00000000-0x7fffffff] + [mem 0x100000000-0x107fffffff] -> [mem 0x00000000-0x107fffffff] [Thu Dec 12 07:46:21 2019][ 0.000000] NODE_DATA(0) allocated [mem 0x107f359000-0x107f37ffff] [Thu Dec 12 07:46:21 2019][ 0.000000] NODE_DATA(1) allocated [mem 0x207ff59000-0x207ff7ffff] [Thu Dec 12 07:46:21 2019][ 0.000000] NODE_DATA(2) allocated [mem 0x307ff59000-0x307ff7ffff] [Thu Dec 12 07:46:21 2019][ 0.000000] NODE_DATA(3) allocated [mem 0x407ff58000-0x407ff7efff] [Thu Dec 12 07:46:21 2019][ 0.000000] Reserving 176MB of memory at 704MB for crashkernel (System RAM: 261692MB) [Thu Dec 12 07:46:21 2019][ 0.000000] Zone ranges: [Thu Dec 12 07:46:21 2019][ 0.000000] DMA [mem 0x00001000-0x00ffffff] [Thu Dec 12 07:46:21 2019][ 0.000000] DMA32 [mem 0x01000000-0xffffffff] [Thu Dec 12 07:46:21 2019][ 0.000000] Normal [mem 0x100000000-0x407ff7ffff] [Thu Dec 12 07:46:21 2019][ 0.000000] Movable zone start for each node [Thu Dec 12 07:46:21 2019][ 0.000000] Early memory node ranges [Thu Dec 12 07:46:21 2019][ 0.000000] node 0: [mem 0x00001000-0x0008efff] [Thu Dec 12 07:46:21 2019][ 0.000000] node 0: [mem 0x00090000-0x0009ffff] [Thu Dec 12 07:46:21 2019][ 0.000000] node 0: [mem 0x00100000-0x4f773fff] [Thu Dec 12 07:46:21 2019][ 0.000000] node 0: [mem 0x5777d000-0x6cacefff] [Thu Dec 12 07:46:21 2019][ 0.000000] node 0: [mem 0x6ffff000-0x6fffffff] [Thu Dec 12 07:46:21 2019][ 0.000000] node 0: [mem 0x100000000-0x107f37ffff] [Thu Dec 12 07:46:21 2019][ 0.000000] node 1: [mem 0x1080000000-0x207ff7ffff] [Thu Dec 12 07:46:21 2019][ 0.000000] node 2: [mem 0x2080000000-0x307ff7ffff] [Thu Dec 12 07:46:21 2019][ 0.000000] node 3: [mem 0x3080000000-0x407ff7ffff] [Thu Dec 12 07:46:21 2019][ 0.000000] Initmem setup node 0 [mem 0x00001000-0x107f37ffff] [Thu Dec 12 07:46:21 2019][ 0.000000] Initmem setup node 1 [mem 0x1080000000-0x207ff7ffff] [Thu Dec 12 07:46:21 2019][ 0.000000] Initmem setup node 2 [mem 0x2080000000-0x307ff7ffff] [Thu Dec 12 07:46:21 2019][ 0.000000] Initmem setup node 3 [mem 0x3080000000-0x407ff7ffff] [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: PM-Timer IO Port: 0x408 [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x00] lapic_id[0x00] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x01] lapic_id[0x10] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x02] lapic_id[0x20] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x03] lapic_id[0x30] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x04] lapic_id[0x08] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x05] lapic_id[0x18] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x06] lapic_id[0x28] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x07] lapic_id[0x38] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x08] lapic_id[0x02] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x09] lapic_id[0x12] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0a] lapic_id[0x22] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0b] lapic_id[0x32] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0c] lapic_id[0x0a] 
enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0d] lapic_id[0x1a] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0e] lapic_id[0x2a] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0f] lapic_id[0x3a] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x10] lapic_id[0x04] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x11] lapic_id[0x14] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x12] lapic_id[0x24] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x13] lapic_id[0x34] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x14] lapic_id[0x0c] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x15] lapic_id[0x1c] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x16] lapic_id[0x2c] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x17] lapic_id[0x3c] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x18] lapic_id[0x01] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x19] lapic_id[0x11] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1a] lapic_id[0x21] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1b] lapic_id[0x31] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1c] lapic_id[0x09] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1d] lapic_id[0x19] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1e] lapic_id[0x29] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1f] lapic_id[0x39] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x20] lapic_id[0x03] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x21] lapic_id[0x13] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x22] lapic_id[0x23] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x23] lapic_id[0x33] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x24] lapic_id[0x0b] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x25] lapic_id[0x1b] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x26] lapic_id[0x2b] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x27] lapic_id[0x3b] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x28] lapic_id[0x05] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x29] lapic_id[0x15] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2a] lapic_id[0x25] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2b] lapic_id[0x35] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2c] lapic_id[0x0d] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2d] lapic_id[0x1d] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2e] lapic_id[0x2d] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2f] lapic_id[0x3d] enabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x30] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x31] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x32] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x33] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x34] 
lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x35] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x36] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x37] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x38] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x39] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3a] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3b] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3c] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3d] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3e] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3f] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x40] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x41] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x42] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x43] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x44] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x45] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x46] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x47] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x48] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x49] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4a] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4b] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4c] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4d] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4e] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4f] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x50] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x51] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x52] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x53] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x54] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x55] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x56] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x57] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x58] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x59] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5a] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5b] lapic_id[0x00] disabled) [Thu Dec 12 
07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5c] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5d] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5e] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5f] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x60] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x61] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x62] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x63] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x64] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x65] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x66] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x67] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x68] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x69] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6a] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6b] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6c] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6d] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6e] lapic_id[0x00] disabled) [Thu Dec 12 07:46:21 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6f] lapic_id[0x00] disabled) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x70] lapic_id[0x00] disabled) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x71] lapic_id[0x00] disabled) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x72] lapic_id[0x00] disabled) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x73] lapic_id[0x00] disabled) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x74] lapic_id[0x00] disabled) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x75] lapic_id[0x00] disabled) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x76] lapic_id[0x00] disabled) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x77] lapic_id[0x00] disabled) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x78] lapic_id[0x00] disabled) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x79] lapic_id[0x00] disabled) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7a] lapic_id[0x00] disabled) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7b] lapic_id[0x00] disabled) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7c] lapic_id[0x00] disabled) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7d] lapic_id[0x00] disabled) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7e] lapic_id[0x00] disabled) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7f] lapic_id[0x00] disabled) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: LAPIC_NMI (acpi_id[0xff] high edge lint[0x1]) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: IOAPIC (id[0x80] address[0xfec00000] gsi_base[0]) [Thu Dec 12 07:46:22 2019][ 0.000000] IOAPIC[0]: apic_id 128, version 33, address 0xfec00000, GSI 0-23 [Thu Dec 12 07:46:22 2019][ 0.000000] 
ACPI: IOAPIC (id[0x81] address[0xfd880000] gsi_base[24]) [Thu Dec 12 07:46:22 2019][ 0.000000] IOAPIC[1]: apic_id 129, version 33, address 0xfd880000, GSI 24-55 [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: IOAPIC (id[0x82] address[0xe0900000] gsi_base[56]) [Thu Dec 12 07:46:22 2019][ 0.000000] IOAPIC[2]: apic_id 130, version 33, address 0xe0900000, GSI 56-87 [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: IOAPIC (id[0x83] address[0xc5900000] gsi_base[88]) [Thu Dec 12 07:46:22 2019][ 0.000000] IOAPIC[3]: apic_id 131, version 33, address 0xc5900000, GSI 88-119 [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: IOAPIC (id[0x84] address[0xaa900000] gsi_base[120]) [Thu Dec 12 07:46:22 2019][ 0.000000] IOAPIC[4]: apic_id 132, version 33, address 0xaa900000, GSI 120-151 [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 low level) [Thu Dec 12 07:46:22 2019][ 0.000000] Using ACPI (MADT) for SMP configuration information [Thu Dec 12 07:46:22 2019][ 0.000000] ACPI: HPET id: 0x10228201 base: 0xfed00000 [Thu Dec 12 07:46:22 2019][ 0.000000] smpboot: Allowing 128 CPUs, 80 hotplug CPUs [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x0008f000-0x0008ffff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x000a0000-0x000fffff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37ac0000-0x37ac0fff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37ad8000-0x37ad8fff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37ad9000-0x37ad9fff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b02000-0x37b02fff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b03000-0x37b03fff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b0b000-0x37b0bfff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b0c000-0x37b0cfff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b3d000-0x37b3dfff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b3e000-0x37b3efff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b6f000-0x37b6ffff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b70000-0x37b70fff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37c11000-0x37c11fff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x4f774000-0x5777cfff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x6cacf000-0x6efcefff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x6efcf000-0x6fdfefff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x6fdff000-0x6fffefff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x70000000-0x8fffffff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x90000000-0xfec0ffff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0xfec10000-0xfec10fff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0xfec11000-0xfed7ffff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0xfed80000-0xfed80fff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0xfed81000-0xffffffff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: 
Registered nosave memory: [mem 0x107f380000-0x107fffffff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x207ff80000-0x207fffffff] [Thu Dec 12 07:46:22 2019][ 0.000000] PM: Registered nosave memory: [mem 0x307ff80000-0x307fffffff] [Thu Dec 12 07:46:22 2019][ 0.000000] e820: [mem 0x90000000-0xfec0ffff] available for PCI devices [Thu Dec 12 07:46:22 2019][ 0.000000] Booting paravirtualized kernel on bare hardware [Thu Dec 12 07:46:22 2019][ 0.000000] setup_percpu: NR_CPUS:5120 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:4 [Thu Dec 12 07:46:22 2019][ 0.000000] PERCPU: Embedded 38 pages/cpu @ffff8da7fee00000 s118784 r8192 d28672 u262144 [Thu Dec 12 07:46:22 2019][ 0.000000] Built 4 zonelists in Zone order, mobility grouping on. Total pages: 65945355 [Thu Dec 12 07:46:22 2019][ 0.000000] Policy zone: Normal [Thu Dec 12 07:46:22 2019][ 0.000000] Kernel command line: BOOT_IMAGE=/boot/vmlinuz-3.10.0-957.27.2.el7_lustre.pl2.x86_64 root=UUID=c4f754c4-e7db-49b7-baed-d6c7905c5cdc ro crashkernel=auto nomodeset console=ttyS0,115200 LANG=en_US.UTF-8 [Thu Dec 12 07:46:22 2019][ 0.000000] PID hash table entries: 4096 (order: 3, 32768 bytes) [Thu Dec 12 07:46:22 2019][ 0.000000] x86/fpu: xstate_offset[2]: 0240, xstate_sizes[2]: 0100 [Thu Dec 12 07:46:22 2019][ 0.000000] xsave: enabled xstate_bv 0x7, cntxt size 0x340 using standard form [Thu Dec 12 07:46:22 2019][ 0.000000] Memory: 9561192k/270532096k available (7676k kernel code, 2559084k absent, 4706768k reserved, 6045k data, 1876k init) [Thu Dec 12 07:46:22 2019][ 0.000000] SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=4 [Thu Dec 12 07:46:22 2019][ 0.000000] Hierarchical RCU implementation. [Thu Dec 12 07:46:22 2019][ 0.000000] RCU restricting CPUs from NR_CPUS=5120 to nr_cpu_ids=128. [Thu Dec 12 07:46:22 2019][ 0.000000] NR_IRQS:327936 nr_irqs:3624 0 [Thu Dec 12 07:46:22 2019][ 0.000000] Console: colour dummy device 80x25 [Thu Dec 12 07:46:22 2019][ 0.000000] console [ttyS0] enabled [Thu Dec 12 07:46:22 2019][ 0.000000] allocated 1072693248 bytes of page_cgroup [Thu Dec 12 07:46:22 2019][ 0.000000] please try 'cgroup_disable=memory' option if you don't want memory cgroups [Thu Dec 12 07:46:22 2019][ 0.000000] Enabling automatic NUMA balancing. Configure with numa_balancing= or the kernel.numa_balancing sysctl [Thu Dec 12 07:46:22 2019][ 0.000000] tsc: Fast TSC calibration using PIT [Thu Dec 12 07:46:22 2019][ 0.000000] tsc: Detected 1996.233 MHz processor [Thu Dec 12 07:46:22 2019][ 0.000056] Calibrating delay loop (skipped), value calculated using timer frequency.. 3992.46 BogoMIPS (lpj=1996233) [Thu Dec 12 07:46:22 2019][ 0.010703] pid_max: default: 131072 minimum: 1024 [Thu Dec 12 07:46:22 2019][ 0.016321] Security Framework initialized [Thu Dec 12 07:46:22 2019][ 0.020442] SELinux: Initializing. [Thu Dec 12 07:46:22 2019][ 0.024001] Yama: becoming mindful. 
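The kernel command line above again carries crashkernel=auto, and the corresponding reservation is logged earlier in this boot ("Reserving 176MB of memory at 704MB for crashkernel"). A sketch of how to confirm the reservation and that kdump is re-armed once this boot completes, using standard CentOS 7 interfaces (values shown are not taken from this host):

    cat /proc/cmdline                   # should include crashkernel=auto
    cat /sys/kernel/kexec_crash_size    # bytes reserved for the crash kernel
    cat /sys/kernel/kexec_crash_loaded  # 1 once the kdump kernel has been loaded
    kdumpctl status                     # typically reports that kdump is operational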
[Thu Dec 12 07:46:22 2019][ 0.044033] Dentry cache hash table entries: 33554432 (order: 16, 268435456 bytes) [Thu Dec 12 07:46:22 2019][ 0.099746] Inode-cache hash table entries: 16777216 (order: 15, 134217728 bytes) [Thu Dec 12 07:46:22 2019][ 0.127267] Mount-cache hash table entries: 524288 (order: 10, 4194304 bytes) [Thu Dec 12 07:46:22 2019][ 0.134687] Mountpoint-cache hash table entries: 524288 (order: 10, 4194304 bytes) [Thu Dec 12 07:46:22 2019][ 0.143824] Initializing cgroup subsys memory [Thu Dec 12 07:46:22 2019][ 0.148217] Initializing cgroup subsys devices [Thu Dec 12 07:46:22 2019][ 0.152673] Initializing cgroup subsys freezer [Thu Dec 12 07:46:22 2019][ 0.157127] Initializing cgroup subsys net_cls [Thu Dec 12 07:46:22 2019][ 0.161583] Initializing cgroup subsys blkio [Thu Dec 12 07:46:22 2019][ 0.165865] Initializing cgroup subsys perf_event [Thu Dec 12 07:46:22 2019][ 0.170585] Initializing cgroup subsys hugetlb [Thu Dec 12 07:46:22 2019][ 0.175040] Initializing cgroup subsys pids [Thu Dec 12 07:46:22 2019][ 0.179238] Initializing cgroup subsys net_prio [Thu Dec 12 07:46:22 2019][ 0.189460] LVT offset 2 assigned for vector 0xf4 [Thu Dec 12 07:46:22 2019][ 0.194189] Last level iTLB entries: 4KB 1024, 2MB 1024, 4MB 512 [Thu Dec 12 07:46:22 2019][ 0.200209] Last level dTLB entries: 4KB 1536, 2MB 1536, 4MB 768 [Thu Dec 12 07:46:22 2019][ 0.206223] tlb_flushall_shift: 6 [Thu Dec 12 07:46:22 2019][ 0.209573] Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl and seccomp [Thu Dec 12 07:46:22 2019][ 0.219147] FEATURE SPEC_CTRL Not Present [Thu Dec 12 07:46:22 2019][ 0.223167] FEATURE IBPB_SUPPORT Present [Thu Dec 12 07:46:22 2019][ 0.227104] Spectre V2 : Enabling Indirect Branch Prediction Barrier [Thu Dec 12 07:46:22 2019][ 0.233539] Spectre V2 : Mitigation: Full retpoline [Thu Dec 12 07:46:22 2019][ 0.239383] Freeing SMP alternatives: 28k freed [Thu Dec 12 07:46:22 2019][ 0.245829] ACPI: Core revision 20130517 [Thu Dec 12 07:46:22 2019][ 0.254510] ACPI: All ACPI Tables successfully acquired [Thu Dec 12 07:46:22 2019][ 0.266430] ftrace: allocating 29216 entries in 115 pages [Thu Dec 12 07:46:23 2019][ 0.606229] Switched APIC routing to physical flat. [Thu Dec 12 07:46:23 2019][ 0.613155] ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 [Thu Dec 12 07:46:23 2019][ 0.629159] smpboot: CPU0: AMD EPYC 7401P 24-Core Processor (fam: 17, model: 01, stepping: 02) [Thu Dec 12 07:46:23 2019][ 0.713602] random: fast init done [Thu Dec 12 07:46:23 2019][ 0.741602] APIC calibration not consistent with PM-Timer: 101ms instead of 100ms [Thu Dec 12 07:46:23 2019][ 0.749079] APIC delta adjusted to PM-Timer: 623827 (636297) [Thu Dec 12 07:46:23 2019][ 0.754771] Performance Events: Fam17h core perfctr, AMD PMU driver. [Thu Dec 12 07:46:23 2019][ 0.761199] ... version: 0 [Thu Dec 12 07:46:23 2019][ 0.765210] ... bit width: 48 [Thu Dec 12 07:46:23 2019][ 0.769309] ... generic registers: 6 [Thu Dec 12 07:46:23 2019][ 0.773322] ... value mask: 0000ffffffffffff [Thu Dec 12 07:46:23 2019][ 0.778635] ... max period: 00007fffffffffff [Thu Dec 12 07:46:23 2019][ 0.783946] ... fixed-purpose events: 0 [Thu Dec 12 07:46:23 2019][ 0.787959] ... event mask: 000000000000003f [Thu Dec 12 07:46:23 2019][ 0.796304] NMI watchdog: enabled on all CPUs, permanently consumes one hw-PMU counter. 
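The Spectre V2 and Speculative Store Bypass lines above describe the mitigations this kernel applies on the EPYC 7401P. A sketch of how to cross-check them from userspace, assuming the sysfs vulnerabilities interface present in this kernel generation (exact wording varies by kernel build):

    grep . /sys/devices/system/cpu/vulnerabilities/*
    lscpu | grep -E 'Model name|Socket|NUMA node'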
[Thu Dec 12 07:46:23 2019][ 0.804389] smpboot: Booting Node 1, Processors #1 OK [Thu Dec 12 07:46:23 2019][ 0.817609] smpboot: Booting Node 2, Processors #2 OK [Thu Dec 12 07:46:23 2019][ 0.830811] smpboot: Booting Node 3, Processors #3 OK [Thu Dec 12 07:46:23 2019][ 0.844002] smpboot: Booting Node 0, Processors #4 OK [Thu Dec 12 07:46:23 2019][ 0.857183] smpboot: Booting Node 1, Processors #5 OK [Thu Dec 12 07:46:23 2019][ 0.870362] smpboot: Booting Node 2, Processors #6 OK [Thu Dec 12 07:46:23 2019][ 0.883543] smpboot: Booting Node 3, Processors #7 OK [Thu Dec 12 07:46:23 2019][ 0.896726] smpboot: Booting Node 0, Processors #8 OK [Thu Dec 12 07:46:23 2019][ 0.910119] smpboot: Booting Node 1, Processors #9 OK [Thu Dec 12 07:46:23 2019][ 0.923315] smpboot: Booting Node 2, Processors #10 OK [Thu Dec 12 07:46:23 2019][ 0.936593] smpboot: Booting Node 3, Processors #11 OK [Thu Dec 12 07:46:23 2019][ 0.949864] smpboot: Booting Node 0, Processors #12 OK [Thu Dec 12 07:46:23 2019][ 0.963135] smpboot: Booting Node 1, Processors #13 OK [Thu Dec 12 07:46:23 2019][ 0.976407] smpboot: Booting Node 2, Processors #14 OK [Thu Dec 12 07:46:23 2019][ 0.989677] smpboot: Booting Node 3, Processors #15 OK [Thu Dec 12 07:46:23 2019][ 1.002949] smpboot: Booting Node 0, Processors #16 OK [Thu Dec 12 07:46:23 2019][ 1.016328] smpboot: Booting Node 1, Processors #17 OK [Thu Dec 12 07:46:23 2019][ 1.029605] smpboot: Booting Node 2, Processors #18 OK [Thu Dec 12 07:46:23 2019][ 1.042895] smpboot: Booting Node 3, Processors #19 OK [Thu Dec 12 07:46:23 2019][ 1.056163] smpboot: Booting Node 0, Processors #20 OK [Thu Dec 12 07:46:23 2019][ 1.069428] smpboot: Booting Node 1, Processors #21 OK [Thu Dec 12 07:46:23 2019][ 1.082706] smpboot: Booting Node 2, Processors #22 OK [Thu Dec 12 07:46:23 2019][ 1.095978] smpboot: Booting Node 3, Processors #23 OK [Thu Dec 12 07:46:23 2019][ 1.109244] smpboot: Booting Node 0, Processors #24 OK [Thu Dec 12 07:46:23 2019][ 1.122982] smpboot: Booting Node 1, Processors #25 OK [Thu Dec 12 07:46:23 2019][ 1.136217] smpboot: Booting Node 2, Processors #26 OK [Thu Dec 12 07:46:23 2019][ 1.149448] smpboot: Booting Node 3, Processors #27 OK [Thu Dec 12 07:46:23 2019][ 1.162683] smpboot: Booting Node 0, Processors #28 OK [Thu Dec 12 07:46:23 2019][ 1.175910] smpboot: Booting Node 1, Processors #29 OK [Thu Dec 12 07:46:23 2019][ 1.189144] smpboot: Booting Node 2, Processors #30 OK [Thu Dec 12 07:46:23 2019][ 1.202385] smpboot: Booting Node 3, Processors #31 OK [Thu Dec 12 07:46:23 2019][ 1.215611] smpboot: Booting Node 0, Processors #32 OK [Thu Dec 12 07:46:23 2019][ 1.228943] smpboot: Booting Node 1, Processors #33 OK [Thu Dec 12 07:46:23 2019][ 1.242185] smpboot: Booting Node 2, Processors #34 OK [Thu Dec 12 07:46:23 2019][ 1.255520] smpboot: Booting Node 3, Processors #35 OK [Thu Dec 12 07:46:23 2019][ 1.268747] smpboot: Booting Node 0, Processors #36 OK [Thu Dec 12 07:46:23 2019][ 1.281982] smpboot: Booting Node 1, Processors #37 OK [Thu Dec 12 07:46:24 2019][ 1.295217] smpboot: Booting Node 2, Processors #38 OK [Thu Dec 12 07:46:24 2019][ 1.308465] smpboot: Booting Node 3, Processors #39 OK [Thu Dec 12 07:46:24 2019][ 1.321701] smpboot: Booting Node 0, Processors #40 OK [Thu Dec 12 07:46:24 2019][ 1.335032] smpboot: Booting Node 1, Processors #41 OK [Thu Dec 12 07:46:24 2019][ 1.348273] smpboot: Booting Node 2, Processors #42 OK [Thu Dec 12 07:46:24 2019][ 1.361610] smpboot: Booting Node 3, Processors #43 OK [Thu Dec 12 07:46:24 2019][ 1.374856] smpboot: Booting Node 0, Processors #44 OK 
[Thu Dec 12 07:46:24 2019][ 1.388081] smpboot: Booting Node 1, Processors #45 OK [Thu Dec 12 07:46:24 2019][ 1.401416] smpboot: Booting Node 2, Processors #46 OK [Thu Dec 12 07:46:24 2019][ 1.414746] smpboot: Booting Node 3, Processors #47 [Thu Dec 12 07:46:24 2019][ 1.427447] Brought up 48 CPUs [Thu Dec 12 07:46:24 2019][ 1.430705] smpboot: Max logical packages: 3 [Thu Dec 12 07:46:24 2019][ 1.434981] smpboot: Total of 48 processors activated (191638.36 BogoMIPS) [Thu Dec 12 07:46:24 2019][ 1.723439] node 0 initialised, 15462980 pages in 274ms [Thu Dec 12 07:46:24 2019][ 1.732348] node 2 initialised, 15989367 pages in 279ms [Thu Dec 12 07:46:24 2019][ 1.732362] node 3 initialised, 15989248 pages in 279ms [Thu Dec 12 07:46:24 2019][ 1.735104] node 1 initialised, 15984664 pages in 282ms [Thu Dec 12 07:46:24 2019][ 1.748572] devtmpfs: initialized [Thu Dec 12 07:46:24 2019][ 1.774442] EVM: security.selinux [Thu Dec 12 07:46:24 2019][ 1.777763] EVM: security.ima [Thu Dec 12 07:46:24 2019][ 1.780735] EVM: security.capability [Thu Dec 12 07:46:24 2019][ 1.784416] PM: Registering ACPI NVS region [mem 0x0008f000-0x0008ffff] (4096 bytes) [Thu Dec 12 07:46:24 2019][ 1.792162] PM: Registering ACPI NVS region [mem 0x6efcf000-0x6fdfefff] (14876672 bytes) [Thu Dec 12 07:46:24 2019][ 1.801813] atomic64 test passed for x86-64 platform with CX8 and with SSE [Thu Dec 12 07:46:24 2019][ 1.808688] pinctrl core: initialized pinctrl subsystem [Thu Dec 12 07:46:24 2019][ 1.814022] RTC time: 15:46:24, date: 12/12/19 [Thu Dec 12 07:46:24 2019][ 1.818626] NET: Registered protocol family 16 [Thu Dec 12 07:46:24 2019][ 1.823433] ACPI FADT declares the system doesn't support PCIe ASPM, so disable it [Thu Dec 12 07:46:24 2019][ 1.831006] ACPI: bus type PCI registered [Thu Dec 12 07:46:24 2019][ 1.835018] acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 [Thu Dec 12 07:46:24 2019][ 1.841603] PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0x80000000-0x8fffffff] (base 0x80000000) [Thu Dec 12 07:46:24 2019][ 1.850904] PCI: MMCONFIG at [mem 0x80000000-0x8fffffff] reserved in E820 [Thu Dec 12 07:46:24 2019][ 1.857695] PCI: Using configuration type 1 for base access [Thu Dec 12 07:46:24 2019][ 1.863281] PCI: Dell System detected, enabling pci=bfsort. [Thu Dec 12 07:46:24 2019][ 1.878192] ACPI: Added _OSI(Module Device) [Thu Dec 12 07:46:24 2019][ 1.882383] ACPI: Added _OSI(Processor Device) [Thu Dec 12 07:46:24 2019][ 1.886825] ACPI: Added _OSI(3.0 _SCP Extensions) [Thu Dec 12 07:46:24 2019][ 1.891531] ACPI: Added _OSI(Processor Aggregator Device) [Thu Dec 12 07:46:24 2019][ 1.896931] ACPI: Added _OSI(Linux-Dell-Video) [Thu Dec 12 07:46:24 2019][ 1.903177] ACPI: Executed 2 blocks of module-level executable AML code [Thu Dec 12 07:46:24 2019][ 1.915232] ACPI: Interpreter enabled [Thu Dec 12 07:46:24 2019][ 1.918909] ACPI: (supports S0 S5) [Thu Dec 12 07:46:24 2019][ 1.922316] ACPI: Using IOAPIC for interrupt routing [Thu Dec 12 07:46:24 2019][ 1.927492] HEST: Table parsing has been initialized. 
[Thu Dec 12 07:46:24 2019][ 1.932547] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug [Thu Dec 12 07:46:24 2019][ 1.941701] ACPI: Enabled 1 GPEs in block 00 to 1F [Thu Dec 12 07:46:24 2019][ 1.953361] ACPI: PCI Interrupt Link [LNKA] (IRQs 4 5 7 10 11 14 15) *0 [Thu Dec 12 07:46:24 2019][ 1.960269] ACPI: PCI Interrupt Link [LNKB] (IRQs 4 5 7 10 11 14 15) *0 [Thu Dec 12 07:46:24 2019][ 1.967175] ACPI: PCI Interrupt Link [LNKC] (IRQs 4 5 7 10 11 14 15) *0 [Thu Dec 12 07:46:24 2019][ 1.974081] ACPI: PCI Interrupt Link [LNKD] (IRQs 4 5 7 10 11 14 15) *0 [Thu Dec 12 07:46:24 2019][ 1.980992] ACPI: PCI Interrupt Link [LNKE] (IRQs 4 5 7 10 11 14 15) *0 [Thu Dec 12 07:46:24 2019][ 1.987897] ACPI: PCI Interrupt Link [LNKF] (IRQs 4 5 7 10 11 14 15) *0 [Thu Dec 12 07:46:24 2019][ 1.994805] ACPI: PCI Interrupt Link [LNKG] (IRQs 4 5 7 10 11 14 15) *0 [Thu Dec 12 07:46:24 2019][ 2.001711] ACPI: PCI Interrupt Link [LNKH] (IRQs 4 5 7 10 11 14 15) *0 [Thu Dec 12 07:46:24 2019][ 2.008764] ACPI: PCI Root Bridge [PC00] (domain 0000 [bus 00-3f]) [Thu Dec 12 07:46:24 2019][ 2.014952] acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI] [Thu Dec 12 07:46:24 2019][ 2.023170] acpi PNP0A08:00: PCIe AER handled by firmware [Thu Dec 12 07:46:24 2019][ 2.028613] acpi PNP0A08:00: _OSC: platform does not support [SHPCHotplug] [Thu Dec 12 07:46:24 2019][ 2.035561] acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] [Thu Dec 12 07:46:24 2019][ 2.043217] acpi PNP0A08:00: FADT indicates ASPM is unsupported, using BIOS configuration [Thu Dec 12 07:46:24 2019][ 2.051679] PCI host bridge to bus 0000:00 [Thu Dec 12 07:46:24 2019][ 2.055777] pci_bus 0000:00: root bus resource [io 0x0000-0x03af window] [Thu Dec 12 07:46:24 2019][ 2.062561] pci_bus 0000:00: root bus resource [io 0x03e0-0x0cf7 window] [Thu Dec 12 07:46:24 2019][ 2.069348] pci_bus 0000:00: root bus resource [mem 0x000c0000-0x000c3fff window] [Thu Dec 12 07:46:24 2019][ 2.076827] pci_bus 0000:00: root bus resource [mem 0x000c4000-0x000c7fff window] [Thu Dec 12 07:46:24 2019][ 2.084307] pci_bus 0000:00: root bus resource [mem 0x000c8000-0x000cbfff window] [Thu Dec 12 07:46:24 2019][ 2.091788] pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000cffff window] [Thu Dec 12 07:46:24 2019][ 2.099265] pci_bus 0000:00: root bus resource [mem 0x000d0000-0x000d3fff window] [Thu Dec 12 07:46:24 2019][ 2.106744] pci_bus 0000:00: root bus resource [mem 0x000d4000-0x000d7fff window] [Thu Dec 12 07:46:24 2019][ 2.114224] pci_bus 0000:00: root bus resource [mem 0x000d8000-0x000dbfff window] [Thu Dec 12 07:46:24 2019][ 2.121703] pci_bus 0000:00: root bus resource [mem 0x000dc000-0x000dffff window] [Thu Dec 12 07:46:24 2019][ 2.129184] pci_bus 0000:00: root bus resource [mem 0x000e0000-0x000e3fff window] [Thu Dec 12 07:46:24 2019][ 2.136664] pci_bus 0000:00: root bus resource [mem 0x000e4000-0x000e7fff window] [Thu Dec 12 07:46:24 2019][ 2.144142] pci_bus 0000:00: root bus resource [mem 0x000e8000-0x000ebfff window] [Thu Dec 12 07:46:24 2019][ 2.151621] pci_bus 0000:00: root bus resource [mem 0x000ec000-0x000effff window] [Thu Dec 12 07:46:24 2019][ 2.159101] pci_bus 0000:00: root bus resource [mem 0x000f0000-0x000fffff window] [Thu Dec 12 07:46:24 2019][ 2.166579] pci_bus 0000:00: root bus resource [io 0x0d00-0x3fff window] [Thu Dec 12 07:46:24 2019][ 2.173367] pci_bus 0000:00: root bus resource [mem 0xe1000000-0xfebfffff window] [Thu Dec 12 07:46:24 2019][ 2.180846] pci_bus 0000:00: root bus 
resource [mem 0x10000000000-0x2bf3fffffff window] [Thu Dec 12 07:46:24 2019][ 2.188846] pci_bus 0000:00: root bus resource [bus 00-3f] [Thu Dec 12 07:46:24 2019][ 2.201672] pci 0000:00:03.1: PCI bridge to [bus 01] [Thu Dec 12 07:46:24 2019][ 2.207052] pci 0000:00:07.1: PCI bridge to [bus 02] [Thu Dec 12 07:46:24 2019][ 2.212834] pci 0000:00:08.1: PCI bridge to [bus 03] [Thu Dec 12 07:46:24 2019][ 2.218196] ACPI: PCI Root Bridge [PC01] (domain 0000 [bus 40-7f]) [Thu Dec 12 07:46:24 2019][ 2.224381] acpi PNP0A08:01: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI] [Thu Dec 12 07:46:24 2019][ 2.232591] acpi PNP0A08:01: PCIe AER handled by firmware [Thu Dec 12 07:46:24 2019][ 2.238035] acpi PNP0A08:01: _OSC: platform does not support [SHPCHotplug] [Thu Dec 12 07:46:24 2019][ 2.244981] acpi PNP0A08:01: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] [Thu Dec 12 07:46:24 2019][ 2.252632] acpi PNP0A08:01: FADT indicates ASPM is unsupported, using BIOS configuration [Thu Dec 12 07:46:24 2019][ 2.261046] PCI host bridge to bus 0000:40 [Thu Dec 12 07:46:24 2019][ 2.265150] pci_bus 0000:40: root bus resource [io 0x4000-0x7fff window] [Thu Dec 12 07:46:24 2019][ 2.271934] pci_bus 0000:40: root bus resource [mem 0xc6000000-0xe0ffffff window] [Thu Dec 12 07:46:24 2019][ 2.279412] pci_bus 0000:40: root bus resource [mem 0x2bf40000000-0x47e7fffffff window] [Thu Dec 12 07:46:24 2019][ 2.287414] pci_bus 0000:40: root bus resource [bus 40-7f] [Thu Dec 12 07:46:24 2019][ 2.295306] pci 0000:40:07.1: PCI bridge to [bus 41] [Thu Dec 12 07:46:24 2019][ 2.300626] pci 0000:40:08.1: PCI bridge to [bus 42] [Thu Dec 12 07:46:25 2019][ 2.305790] ACPI: PCI Root Bridge [PC02] (domain 0000 [bus 80-bf]) [Thu Dec 12 07:46:25 2019][ 2.311978] acpi PNP0A08:02: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI] [Thu Dec 12 07:46:25 2019][ 2.320184] acpi PNP0A08:02: PCIe AER handled by firmware [Thu Dec 12 07:46:25 2019][ 2.325619] acpi PNP0A08:02: _OSC: platform does not support [SHPCHotplug] [Thu Dec 12 07:46:25 2019][ 2.332568] acpi PNP0A08:02: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] [Thu Dec 12 07:46:25 2019][ 2.340218] acpi PNP0A08:02: FADT indicates ASPM is unsupported, using BIOS configuration [Thu Dec 12 07:46:25 2019][ 2.348656] PCI host bridge to bus 0000:80 [Thu Dec 12 07:46:25 2019][ 2.352762] pci_bus 0000:80: root bus resource [io 0x03b0-0x03df window] [Thu Dec 12 07:46:25 2019][ 2.359546] pci_bus 0000:80: root bus resource [mem 0x000a0000-0x000bffff window] [Thu Dec 12 07:46:25 2019][ 2.367027] pci_bus 0000:80: root bus resource [io 0x8000-0xbfff window] [Thu Dec 12 07:46:25 2019][ 2.373811] pci_bus 0000:80: root bus resource [mem 0xab000000-0xc5ffffff window] [Thu Dec 12 07:46:25 2019][ 2.381291] pci_bus 0000:80: root bus resource [mem 0x47e80000000-0x63dbfffffff window] [Thu Dec 12 07:46:25 2019][ 2.389290] pci_bus 0000:80: root bus resource [bus 80-bf] [Thu Dec 12 07:46:25 2019][ 2.398291] pci 0000:80:01.1: PCI bridge to [bus 81] [Thu Dec 12 07:46:25 2019][ 2.405823] pci 0000:80:01.2: PCI bridge to [bus 82-83] [Thu Dec 12 07:46:25 2019][ 2.411290] pci 0000:82:00.0: PCI bridge to [bus 83] [Thu Dec 12 07:46:25 2019][ 2.418821] pci 0000:80:03.1: PCI bridge to [bus 84] [Thu Dec 12 07:46:25 2019][ 2.424111] pci 0000:80:07.1: PCI bridge to [bus 85] [Thu Dec 12 07:46:25 2019][ 2.430040] pci 0000:80:08.1: PCI bridge to [bus 86] [Thu Dec 12 07:46:25 2019][ 2.435205] ACPI: PCI Root Bridge [PC03] (domain 0000 [bus c0-ff]) [Thu Dec 12 07:46:25 2019][ 2.441386] acpi PNP0A08:03: 
_OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI] [Thu Dec 12 07:46:25 2019][ 2.449596] acpi PNP0A08:03: PCIe AER handled by firmware [Thu Dec 12 07:46:25 2019][ 2.455032] acpi PNP0A08:03: _OSC: platform does not support [SHPCHotplug] [Thu Dec 12 07:46:25 2019][ 2.461979] acpi PNP0A08:03: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] [Thu Dec 12 07:46:25 2019][ 2.469630] acpi PNP0A08:03: FADT indicates ASPM is unsupported, using BIOS configuration [Thu Dec 12 07:46:25 2019][ 2.477955] acpi PNP0A08:03: host bridge window [mem 0x63dc0000000-0xffffffffffff window] ([0x80000000000-0xffffffffffff] ignored, not CPU addressable) [Thu Dec 12 07:46:25 2019][ 2.491596] PCI host bridge to bus 0000:c0 [Thu Dec 12 07:46:25 2019][ 2.495701] pci_bus 0000:c0: root bus resource [io 0xc000-0xffff window] [Thu Dec 12 07:46:25 2019][ 2.502487] pci_bus 0000:c0: root bus resource [mem 0x90000000-0xaaffffff window] [Thu Dec 12 07:46:25 2019][ 2.509966] pci_bus 0000:c0: root bus resource [mem 0x63dc0000000-0x7ffffffffff window] [Thu Dec 12 07:46:25 2019][ 2.517966] pci_bus 0000:c0: root bus resource [bus c0-ff] [Thu Dec 12 07:46:25 2019][ 2.525498] pci 0000:c0:01.1: PCI bridge to [bus c1] [Thu Dec 12 07:46:25 2019][ 2.531132] pci 0000:c0:07.1: PCI bridge to [bus c2] [Thu Dec 12 07:46:25 2019][ 2.536446] pci 0000:c0:08.1: PCI bridge to [bus c3] [Thu Dec 12 07:46:25 2019][ 2.543611] vgaarb: device added: PCI:0000:83:00.0,decodes=io+mem,owns=io+mem,locks=none [Thu Dec 12 07:46:25 2019][ 2.551700] vgaarb: loaded [Thu Dec 12 07:46:25 2019][ 2.554417] vgaarb: bridge control possible 0000:83:00.0 [Thu Dec 12 07:46:25 2019][ 2.559843] SCSI subsystem initialized [Thu Dec 12 07:46:25 2019][ 2.563621] ACPI: bus type USB registered [Thu Dec 12 07:46:25 2019][ 2.567649] usbcore: registered new interface driver usbfs [Thu Dec 12 07:46:25 2019][ 2.573144] usbcore: registered new interface driver hub [Thu Dec 12 07:46:25 2019][ 2.578680] usbcore: registered new device driver usb [Thu Dec 12 07:46:25 2019][ 2.584060] EDAC MC: Ver: 3.0.0 [Thu Dec 12 07:46:25 2019][ 2.587458] PCI: Using ACPI for IRQ routing [Thu Dec 12 07:46:25 2019][ 2.611040] NetLabel: Initializing [Thu Dec 12 07:46:25 2019][ 2.614443] NetLabel: domain hash size = 128 [Thu Dec 12 07:46:25 2019][ 2.618802] NetLabel: protocols = UNLABELED CIPSOv4 [Thu Dec 12 07:46:25 2019][ 2.623783] NetLabel: unlabeled traffic allowed by default [Thu Dec 12 07:46:25 2019][ 2.629554] hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 [Thu Dec 12 07:46:25 2019][ 2.634540] hpet0: 3 comparators, 32-bit 14.318180 MHz counter [Thu Dec 12 07:46:25 2019][ 2.642549] Switched to clocksource hpet [Thu Dec 12 07:46:25 2019][ 2.651201] pnp: PnP ACPI init [Thu Dec 12 07:46:25 2019][ 2.654277] ACPI: bus type PNP registered [Thu Dec 12 07:46:25 2019][ 2.658470] system 00:00: [mem 0x80000000-0x8fffffff] has been reserved [Thu Dec 12 07:46:25 2019][ 2.665673] pnp: PnP ACPI: found 4 devices [Thu Dec 12 07:46:25 2019][ 2.669779] ACPI: bus type PNP unregistered [Thu Dec 12 07:46:25 2019][ 2.681244] pci 0000:01:00.0: can't claim BAR 6 [mem 0xfff00000-0xffffffff pref]: no compatible bridge window [Thu Dec 12 07:46:25 2019][ 2.691164] pci 0000:81:00.0: can't claim BAR 6 [mem 0xfffc0000-0xffffffff pref]: no compatible bridge window [Thu Dec 12 07:46:25 2019][ 2.701075] pci 0000:81:00.1: can't claim BAR 6 [mem 0xfffc0000-0xffffffff pref]: no compatible bridge window [Thu Dec 12 07:46:25 2019][ 2.710993] pci 0000:84:00.0: can't claim BAR 6 [mem 0xfffc0000-0xffffffff pref]: no compatible bridge 
window [Thu Dec 12 07:46:25 2019][ 2.720908] pci 0000:c1:00.0: can't claim BAR 6 [mem 0xfff00000-0xffffffff pref]: no compatible bridge window [Thu Dec 12 07:46:25 2019][ 2.730845] pci 0000:00:03.1: BAR 14: assigned [mem 0xe1000000-0xe10fffff] [Thu Dec 12 07:46:25 2019][ 2.737729] pci 0000:01:00.0: BAR 6: assigned [mem 0xe1000000-0xe10fffff pref] [Thu Dec 12 07:46:25 2019][ 2.744958] pci 0000:00:03.1: PCI bridge to [bus 01] [Thu Dec 12 07:46:25 2019][ 2.749932] pci 0000:00:03.1: bridge window [mem 0xe1000000-0xe10fffff] [Thu Dec 12 07:46:25 2019][ 2.756727] pci 0000:00:03.1: bridge window [mem 0xe2000000-0xe3ffffff 64bit pref] [Thu Dec 12 07:46:25 2019][ 2.764476] pci 0000:00:07.1: PCI bridge to [bus 02] [Thu Dec 12 07:46:25 2019][ 2.769450] pci 0000:00:07.1: bridge window [mem 0xf7200000-0xf74fffff] [Thu Dec 12 07:46:25 2019][ 2.776247] pci 0000:00:08.1: PCI bridge to [bus 03] [Thu Dec 12 07:46:25 2019][ 2.781228] pci 0000:00:08.1: bridge window [mem 0xf7000000-0xf71fffff] [Thu Dec 12 07:46:25 2019][ 2.788075] pci 0000:40:07.1: PCI bridge to [bus 41] [Thu Dec 12 07:46:25 2019][ 2.793049] pci 0000:40:07.1: bridge window [mem 0xdb200000-0xdb4fffff] [Thu Dec 12 07:46:25 2019][ 2.799848] pci 0000:40:08.1: PCI bridge to [bus 42] [Thu Dec 12 07:46:25 2019][ 2.804827] pci 0000:40:08.1: bridge window [mem 0xdb000000-0xdb1fffff] [Thu Dec 12 07:46:25 2019][ 2.811665] pci 0000:80:01.1: BAR 14: assigned [mem 0xac300000-0xac3fffff] [Thu Dec 12 07:46:25 2019][ 2.818546] pci 0000:81:00.0: BAR 6: assigned [mem 0xac300000-0xac33ffff pref] [Thu Dec 12 07:46:25 2019][ 2.825773] pci 0000:81:00.1: BAR 6: assigned [mem 0xac340000-0xac37ffff pref] [Thu Dec 12 07:46:25 2019][ 2.833001] pci 0000:80:01.1: PCI bridge to [bus 81] [Thu Dec 12 07:46:25 2019][ 2.837979] pci 0000:80:01.1: bridge window [mem 0xac300000-0xac3fffff] [Thu Dec 12 07:46:25 2019][ 2.844771] pci 0000:80:01.1: bridge window [mem 0xac200000-0xac2fffff 64bit pref] [Thu Dec 12 07:46:25 2019][ 2.852521] pci 0000:82:00.0: PCI bridge to [bus 83] [Thu Dec 12 07:46:25 2019][ 2.857496] pci 0000:82:00.0: bridge window [mem 0xc0000000-0xc08fffff] [Thu Dec 12 07:46:25 2019][ 2.864292] pci 0000:82:00.0: bridge window [mem 0xab000000-0xabffffff 64bit pref] [Thu Dec 12 07:46:25 2019][ 2.872040] pci 0000:80:01.2: PCI bridge to [bus 82-83] [Thu Dec 12 07:46:25 2019][ 2.877273] pci 0000:80:01.2: bridge window [mem 0xc0000000-0xc08fffff] [Thu Dec 12 07:46:25 2019][ 2.884065] pci 0000:80:01.2: bridge window [mem 0xab000000-0xabffffff 64bit pref] [Thu Dec 12 07:46:25 2019][ 2.891818] pci 0000:84:00.0: BAR 6: no space for [mem size 0x00040000 pref] [Thu Dec 12 07:46:25 2019][ 2.898869] pci 0000:84:00.0: BAR 6: failed to assign [mem size 0x00040000 pref] [Thu Dec 12 07:46:25 2019][ 2.906270] pci 0000:80:03.1: PCI bridge to [bus 84] [Thu Dec 12 07:46:25 2019][ 2.911245] pci 0000:80:03.1: bridge window [io 0x8000-0x8fff] [Thu Dec 12 07:46:25 2019][ 2.917346] pci 0000:80:03.1: bridge window [mem 0xc0d00000-0xc0dfffff] [Thu Dec 12 07:46:25 2019][ 2.924143] pci 0000:80:03.1: bridge window [mem 0xac000000-0xac1fffff 64bit pref] [Thu Dec 12 07:46:25 2019][ 2.931892] pci 0000:80:07.1: PCI bridge to [bus 85] [Thu Dec 12 07:46:25 2019][ 2.936864] pci 0000:80:07.1: bridge window [mem 0xc0b00000-0xc0cfffff] [Thu Dec 12 07:46:25 2019][ 2.943664] pci 0000:80:08.1: PCI bridge to [bus 86] [Thu Dec 12 07:46:25 2019][ 2.948642] pci 0000:80:08.1: bridge window [mem 0xc0900000-0xc0afffff] [Thu Dec 12 07:46:25 2019][ 2.955480] pci 0000:c1:00.0: BAR 6: no space for [mem size 0x00100000 
pref] [Thu Dec 12 07:46:25 2019][ 2.962535] pci 0000:c1:00.0: BAR 6: failed to assign [mem size 0x00100000 pref] [Thu Dec 12 07:46:25 2019][ 2.969938] pci 0000:c0:01.1: PCI bridge to [bus c1] [Thu Dec 12 07:46:25 2019][ 2.974911] pci 0000:c0:01.1: bridge window [io 0xc000-0xcfff] [Thu Dec 12 07:46:25 2019][ 2.981015] pci 0000:c0:01.1: bridge window [mem 0xa5400000-0xa55fffff] [Thu Dec 12 07:46:25 2019][ 2.987812] pci 0000:c0:07.1: PCI bridge to [bus c2] [Thu Dec 12 07:46:25 2019][ 2.992792] pci 0000:c0:07.1: bridge window [mem 0xa5200000-0xa53fffff] [Thu Dec 12 07:46:25 2019][ 2.999589] pci 0000:c0:08.1: PCI bridge to [bus c3] [Thu Dec 12 07:46:25 2019][ 3.004570] pci 0000:c0:08.1: bridge window [mem 0xa5000000-0xa51fffff] [Thu Dec 12 07:46:25 2019][ 3.011458] NET: Registered protocol family 2 [Thu Dec 12 07:46:25 2019][ 3.016491] TCP established hash table entries: 524288 (order: 10, 4194304 bytes) [Thu Dec 12 07:46:25 2019][ 3.024635] TCP bind hash table entries: 65536 (order: 8, 1048576 bytes) [Thu Dec 12 07:46:25 2019][ 3.031453] TCP: Hash tables configured (established 524288 bind 65536) [Thu Dec 12 07:46:25 2019][ 3.038092] TCP: reno registered [Thu Dec 12 07:46:25 2019][ 3.041439] UDP hash table entries: 65536 (order: 9, 2097152 bytes) [Thu Dec 12 07:46:25 2019][ 3.048043] UDP-Lite hash table entries: 65536 (order: 9, 2097152 bytes) [Thu Dec 12 07:46:25 2019][ 3.055234] NET: Registered protocol family 1 [Thu Dec 12 07:46:25 2019][ 3.060143] Unpacking initramfs... [Thu Dec 12 07:46:26 2019][ 3.331590] Freeing initrd memory: 19712k freed [Thu Dec 12 07:46:26 2019][ 3.338381] AMD-Vi: IOMMU performance counters supported [Thu Dec 12 07:46:26 2019][ 3.343788] AMD-Vi: IOMMU performance counters supported [Thu Dec 12 07:46:26 2019][ 3.349141] AMD-Vi: IOMMU performance counters supported [Thu Dec 12 07:46:26 2019][ 3.354496] AMD-Vi: IOMMU performance counters supported [Thu Dec 12 07:46:26 2019][ 3.361129] iommu: Adding device 0000:00:01.0 to group 0 [Thu Dec 12 07:46:26 2019][ 3.367130] iommu: Adding device 0000:00:02.0 to group 1 [Thu Dec 12 07:46:26 2019][ 3.373151] iommu: Adding device 0000:00:03.0 to group 2 [Thu Dec 12 07:46:26 2019][ 3.379260] iommu: Adding device 0000:00:03.1 to group 3 [Thu Dec 12 07:46:26 2019][ 3.385261] iommu: Adding device 0000:00:04.0 to group 4 [Thu Dec 12 07:46:26 2019][ 3.391298] iommu: Adding device 0000:00:07.0 to group 5 [Thu Dec 12 07:46:26 2019][ 3.397275] iommu: Adding device 0000:00:07.1 to group 6 [Thu Dec 12 07:46:26 2019][ 3.403253] iommu: Adding device 0000:00:08.0 to group 7 [Thu Dec 12 07:46:26 2019][ 3.409251] iommu: Adding device 0000:00:08.1 to group 8 [Thu Dec 12 07:46:26 2019][ 3.415276] iommu: Adding device 0000:00:14.0 to group 9 [Thu Dec 12 07:46:26 2019][ 3.420615] iommu: Adding device 0000:00:14.3 to group 9 [Thu Dec 12 07:46:26 2019][ 3.426713] iommu: Adding device 0000:00:18.0 to group 10 [Thu Dec 12 07:46:26 2019][ 3.432141] iommu: Adding device 0000:00:18.1 to group 10 [Thu Dec 12 07:46:26 2019][ 3.437567] iommu: Adding device 0000:00:18.2 to group 10 [Thu Dec 12 07:46:26 2019][ 3.442993] iommu: Adding device 0000:00:18.3 to group 10 [Thu Dec 12 07:46:26 2019][ 3.448418] iommu: Adding device 0000:00:18.4 to group 10 [Thu Dec 12 07:46:26 2019][ 3.453843] iommu: Adding device 0000:00:18.5 to group 10 [Thu Dec 12 07:46:26 2019][ 3.459275] iommu: Adding device 0000:00:18.6 to group 10 [Thu Dec 12 07:46:26 2019][ 3.464704] iommu: Adding device 0000:00:18.7 to group 10 [Thu Dec 12 07:46:26 2019][ 3.470867] iommu: Adding device 
0000:00:19.0 to group 11 [Thu Dec 12 07:46:26 2019][ 3.476293] iommu: Adding device 0000:00:19.1 to group 11 [Thu Dec 12 07:46:26 2019][ 3.481717] iommu: Adding device 0000:00:19.2 to group 11 [Thu Dec 12 07:46:26 2019][ 3.487142] iommu: Adding device 0000:00:19.3 to group 11 [Thu Dec 12 07:46:26 2019][ 3.492568] iommu: Adding device 0000:00:19.4 to group 11 [Thu Dec 12 07:46:26 2019][ 3.497992] iommu: Adding device 0000:00:19.5 to group 11 [Thu Dec 12 07:46:26 2019][ 3.503418] iommu: Adding device 0000:00:19.6 to group 11 [Thu Dec 12 07:46:26 2019][ 3.508843] iommu: Adding device 0000:00:19.7 to group 11 [Thu Dec 12 07:46:26 2019][ 3.514989] iommu: Adding device 0000:00:1a.0 to group 12 [Thu Dec 12 07:46:26 2019][ 3.520416] iommu: Adding device 0000:00:1a.1 to group 12 [Thu Dec 12 07:46:26 2019][ 3.525839] iommu: Adding device 0000:00:1a.2 to group 12 [Thu Dec 12 07:46:26 2019][ 3.531264] iommu: Adding device 0000:00:1a.3 to group 12 [Thu Dec 12 07:46:26 2019][ 3.536692] iommu: Adding device 0000:00:1a.4 to group 12 [Thu Dec 12 07:46:26 2019][ 3.542117] iommu: Adding device 0000:00:1a.5 to group 12 [Thu Dec 12 07:46:26 2019][ 3.547553] iommu: Adding device 0000:00:1a.6 to group 12 [Thu Dec 12 07:46:26 2019][ 3.552983] iommu: Adding device 0000:00:1a.7 to group 12 [Thu Dec 12 07:46:26 2019][ 3.559166] iommu: Adding device 0000:00:1b.0 to group 13 [Thu Dec 12 07:46:26 2019][ 3.564589] iommu: Adding device 0000:00:1b.1 to group 13 [Thu Dec 12 07:46:26 2019][ 3.570017] iommu: Adding device 0000:00:1b.2 to group 13 [Thu Dec 12 07:46:26 2019][ 3.575450] iommu: Adding device 0000:00:1b.3 to group 13 [Thu Dec 12 07:46:26 2019][ 3.580873] iommu: Adding device 0000:00:1b.4 to group 13 [Thu Dec 12 07:46:26 2019][ 3.586301] iommu: Adding device 0000:00:1b.5 to group 13 [Thu Dec 12 07:46:26 2019][ 3.591726] iommu: Adding device 0000:00:1b.6 to group 13 [Thu Dec 12 07:46:26 2019][ 3.597150] iommu: Adding device 0000:00:1b.7 to group 13 [Thu Dec 12 07:46:26 2019][ 3.603316] iommu: Adding device 0000:01:00.0 to group 14 [Thu Dec 12 07:46:26 2019][ 3.609439] iommu: Adding device 0000:02:00.0 to group 15 [Thu Dec 12 07:46:26 2019][ 3.615516] iommu: Adding device 0000:02:00.2 to group 16 [Thu Dec 12 07:46:26 2019][ 3.621618] iommu: Adding device 0000:02:00.3 to group 17 [Thu Dec 12 07:46:26 2019][ 3.627700] iommu: Adding device 0000:03:00.0 to group 18 [Thu Dec 12 07:46:26 2019][ 3.633793] iommu: Adding device 0000:03:00.1 to group 19 [Thu Dec 12 07:46:26 2019][ 3.639926] iommu: Adding device 0000:40:01.0 to group 20 [Thu Dec 12 07:46:26 2019][ 3.646038] iommu: Adding device 0000:40:02.0 to group 21 [Thu Dec 12 07:46:26 2019][ 3.652136] iommu: Adding device 0000:40:03.0 to group 22 [Thu Dec 12 07:46:26 2019][ 3.658251] iommu: Adding device 0000:40:04.0 to group 23 [Thu Dec 12 07:46:26 2019][ 3.664324] iommu: Adding device 0000:40:07.0 to group 24 [Thu Dec 12 07:46:26 2019][ 3.670323] iommu: Adding device 0000:40:07.1 to group 25 [Thu Dec 12 07:46:26 2019][ 3.676365] iommu: Adding device 0000:40:08.0 to group 26 [Thu Dec 12 07:46:26 2019][ 3.682385] iommu: Adding device 0000:40:08.1 to group 27 [Thu Dec 12 07:46:26 2019][ 3.688410] iommu: Adding device 0000:41:00.0 to group 28 [Thu Dec 12 07:46:26 2019][ 3.694447] iommu: Adding device 0000:41:00.2 to group 29 [Thu Dec 12 07:46:26 2019][ 3.700494] iommu: Adding device 0000:41:00.3 to group 30 [Thu Dec 12 07:46:26 2019][ 3.706504] iommu: Adding device 0000:42:00.0 to group 31 [Thu Dec 12 07:46:26 2019][ 3.712550] iommu: Adding device 0000:42:00.1 to group 32 
[Thu Dec 12 07:46:26 2019][ 3.718602] iommu: Adding device 0000:80:01.0 to group 33 [Thu Dec 12 07:46:26 2019][ 3.724630] iommu: Adding device 0000:80:01.1 to group 34 [Thu Dec 12 07:46:26 2019][ 3.730799] iommu: Adding device 0000:80:01.2 to group 35 [Thu Dec 12 07:46:26 2019][ 3.736838] iommu: Adding device 0000:80:02.0 to group 36 [Thu Dec 12 07:46:26 2019][ 3.742937] iommu: Adding device 0000:80:03.0 to group 37 [Thu Dec 12 07:46:26 2019][ 3.749020] iommu: Adding device 0000:80:03.1 to group 38 [Thu Dec 12 07:46:26 2019][ 3.755031] iommu: Adding device 0000:80:04.0 to group 39 [Thu Dec 12 07:46:26 2019][ 3.761064] iommu: Adding device 0000:80:07.0 to group 40 [Thu Dec 12 07:46:26 2019][ 3.767110] iommu: Adding device 0000:80:07.1 to group 41 [Thu Dec 12 07:46:26 2019][ 3.773166] iommu: Adding device 0000:80:08.0 to group 42 [Thu Dec 12 07:46:26 2019][ 3.779231] iommu: Adding device 0000:80:08.1 to group 43 [Thu Dec 12 07:46:26 2019][ 3.785267] iommu: Adding device 0000:81:00.0 to group 44 [Thu Dec 12 07:46:26 2019][ 3.790717] iommu: Adding device 0000:81:00.1 to group 44 [Thu Dec 12 07:46:26 2019][ 3.796762] iommu: Adding device 0000:82:00.0 to group 45 [Thu Dec 12 07:46:26 2019][ 3.802177] iommu: Adding device 0000:83:00.0 to group 45 [Thu Dec 12 07:46:26 2019][ 3.808205] iommu: Adding device 0000:84:00.0 to group 46 [Thu Dec 12 07:46:26 2019][ 3.814209] iommu: Adding device 0000:85:00.0 to group 47 [Thu Dec 12 07:46:26 2019][ 3.820260] iommu: Adding device 0000:85:00.2 to group 48 [Thu Dec 12 07:46:26 2019][ 3.826296] iommu: Adding device 0000:86:00.0 to group 49 [Thu Dec 12 07:46:26 2019][ 3.832338] iommu: Adding device 0000:86:00.1 to group 50 [Thu Dec 12 07:46:26 2019][ 3.838386] iommu: Adding device 0000:86:00.2 to group 51 [Thu Dec 12 07:46:26 2019][ 3.844440] iommu: Adding device 0000:c0:01.0 to group 52 [Thu Dec 12 07:46:26 2019][ 3.850501] iommu: Adding device 0000:c0:01.1 to group 53 [Thu Dec 12 07:46:26 2019][ 3.856507] iommu: Adding device 0000:c0:02.0 to group 54 [Thu Dec 12 07:46:26 2019][ 3.862600] iommu: Adding device 0000:c0:03.0 to group 55 [Thu Dec 12 07:46:26 2019][ 3.868643] iommu: Adding device 0000:c0:04.0 to group 56 [Thu Dec 12 07:46:26 2019][ 3.874665] iommu: Adding device 0000:c0:07.0 to group 57 [Thu Dec 12 07:46:26 2019][ 3.880716] iommu: Adding device 0000:c0:07.1 to group 58 [Thu Dec 12 07:46:26 2019][ 3.886782] iommu: Adding device 0000:c0:08.0 to group 59 [Thu Dec 12 07:46:26 2019][ 3.892813] iommu: Adding device 0000:c0:08.1 to group 60 [Thu Dec 12 07:46:26 2019][ 3.901222] iommu: Adding device 0000:c1:00.0 to group 61 [Thu Dec 12 07:46:26 2019][ 3.907267] iommu: Adding device 0000:c2:00.0 to group 62 [Thu Dec 12 07:46:26 2019][ 3.913329] iommu: Adding device 0000:c2:00.2 to group 63 [Thu Dec 12 07:46:26 2019][ 3.919379] iommu: Adding device 0000:c3:00.0 to group 64 [Thu Dec 12 07:46:26 2019][ 3.925372] iommu: Adding device 0000:c3:00.1 to group 65 [Thu Dec 12 07:46:26 2019][ 3.930972] AMD-Vi: Found IOMMU at 0000:00:00.2 cap 0x40 [Thu Dec 12 07:46:26 2019][ 3.936296] AMD-Vi: Extended features (0xf77ef22294ada): [Thu Dec 12 07:46:26 2019][ 3.941616] PPR NX GT IA GA PC GA_vAPIC [Thu Dec 12 07:46:26 2019][ 3.945758] AMD-Vi: Found IOMMU at 0000:40:00.2 cap 0x40 [Thu Dec 12 07:46:26 2019][ 3.951081] AMD-Vi: Extended features (0xf77ef22294ada): [Thu Dec 12 07:46:26 2019][ 3.956404] PPR NX GT IA GA PC GA_vAPIC [Thu Dec 12 07:46:26 2019][ 3.960536] AMD-Vi: Found IOMMU at 0000:80:00.2 cap 0x40 [Thu Dec 12 07:46:26 2019][ 3.965866] AMD-Vi: Extended features 
(0xf77ef22294ada): [Thu Dec 12 07:46:26 2019][ 3.971187] PPR NX GT IA GA PC GA_vAPIC [Thu Dec 12 07:46:26 2019][ 3.975322] AMD-Vi: Found IOMMU at 0000:c0:00.2 cap 0x40 [Thu Dec 12 07:46:26 2019][ 3.980643] AMD-Vi: Extended features (0xf77ef22294ada): [Thu Dec 12 07:46:26 2019][ 3.985965] PPR NX GT IA GA PC GA_vAPIC [Thu Dec 12 07:46:26 2019][ 3.990096] AMD-Vi: Interrupt remapping enabled [Thu Dec 12 07:46:26 2019][ 3.994639] AMD-Vi: virtual APIC enabled [Thu Dec 12 07:46:26 2019][ 3.998956] AMD-Vi: Lazy IO/TLB flushing enabled [Thu Dec 12 07:46:26 2019][ 4.005282] perf: AMD NB counters detected [Thu Dec 12 07:46:26 2019][ 4.009431] perf: AMD LLC counters detected [Thu Dec 12 07:46:26 2019][ 4.019571] sha1_ssse3: Using SHA-NI optimized SHA-1 implementation [Thu Dec 12 07:46:26 2019][ 4.025921] sha256_ssse3: Using SHA-256-NI optimized SHA-256 implementation [Thu Dec 12 07:46:26 2019][ 4.034500] futex hash table entries: 32768 (order: 9, 2097152 bytes) [Thu Dec 12 07:46:26 2019][ 4.041132] Initialise system trusted keyring [Thu Dec 12 07:46:26 2019][ 4.045530] audit: initializing netlink socket (disabled) [Thu Dec 12 07:46:26 2019][ 4.050948] type=2000 audit(1576165582.201:1): initialized [Thu Dec 12 07:46:26 2019][ 4.081801] HugeTLB registered 1 GB page size, pre-allocated 0 pages [Thu Dec 12 07:46:26 2019][ 4.088161] HugeTLB registered 2 MB page size, pre-allocated 0 pages [Thu Dec 12 07:46:26 2019][ 4.095806] zpool: loaded [Thu Dec 12 07:46:26 2019][ 4.098439] zbud: loaded [Thu Dec 12 07:46:26 2019][ 4.101348] VFS: Disk quotas dquot_6.6.0 [Thu Dec 12 07:46:26 2019][ 4.105377] Dquot-cache hash table entries: 512 (order 0, 4096 bytes) [Thu Dec 12 07:46:26 2019][ 4.112182] msgmni has been set to 32768 [Thu Dec 12 07:46:26 2019][ 4.116212] Key type big_key registered [Thu Dec 12 07:46:26 2019][ 4.122457] NET: Registered protocol family 38 [Thu Dec 12 07:46:26 2019][ 4.126916] Key type asymmetric registered [Thu Dec 12 07:46:26 2019][ 4.131021] Asymmetric key parser 'x509' registered [Thu Dec 12 07:46:26 2019][ 4.135956] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 248) [Thu Dec 12 07:46:26 2019][ 4.143509] io scheduler noop registered [Thu Dec 12 07:46:26 2019][ 4.147442] io scheduler deadline registered (default) [Thu Dec 12 07:46:26 2019][ 4.152625] io scheduler cfq registered [Thu Dec 12 07:46:26 2019][ 4.156473] io scheduler mq-deadline registered [Thu Dec 12 07:46:26 2019][ 4.161013] io scheduler kyber registered [Thu Dec 12 07:46:26 2019][ 4.172140] pcieport 0000:00:03.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:26 2019][ 4.179112] pci 0000:01:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:26 2019][ 4.185656] pcieport 0000:00:07.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:26 2019][ 4.192620] pci 0000:02:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:26 2019][ 4.199155] pci 0000:02:00.2: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:26 2019][ 4.205691] pci 0000:02:00.3: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:26 2019][ 4.212239] pcieport 0000:00:08.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:26 2019][ 4.219201] pci 0000:03:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:26 2019][ 4.225737] pci 0000:03:00.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:26 2019][ 4.232292] pcieport 0000:40:07.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:26 2019][ 4.239257] pci 0000:41:00.0: Signaling PME through PCIe PME 
interrupt [Thu Dec 12 07:46:26 2019][ 4.245790] pci 0000:41:00.2: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:26 2019][ 4.252327] pci 0000:41:00.3: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:26 2019][ 4.258876] pcieport 0000:40:08.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:26 2019][ 4.265838] pci 0000:42:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:26 2019][ 4.272371] pci 0000:42:00.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:26 2019][ 4.278925] pcieport 0000:80:01.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:26 2019][ 4.285892] pci 0000:81:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.292427] pci 0000:81:00.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.298979] pcieport 0000:80:01.2: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.305946] pci 0000:82:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.312481] pci 0000:83:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.319031] pcieport 0000:80:03.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.325994] pci 0000:84:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.332556] pcieport 0000:80:07.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.339522] pci 0000:85:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.346058] pci 0000:85:00.2: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.352610] pcieport 0000:80:08.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.359577] pci 0000:86:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.366112] pci 0000:86:00.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.372645] pci 0000:86:00.2: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.379199] pcieport 0000:c0:01.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.386167] pci 0000:c1:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.392716] pcieport 0000:c0:07.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.399686] pci 0000:c2:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.406220] pci 0000:c2:00.2: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.412771] pcieport 0000:c0:08.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.419742] pci 0000:c3:00.0: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.426275] pci 0000:c3:00.1: Signaling PME through PCIe PME interrupt [Thu Dec 12 07:46:27 2019][ 4.432833] pci_hotplug: PCI Hot Plug PCI Core version: 0.5 [Thu Dec 12 07:46:27 2019][ 4.438414] pciehp: PCI Express Hot Plug Controller Driver version: 0.4 [Thu Dec 12 07:46:27 2019][ 4.445085] shpchp: Standard Hot Plug PCI Controller Driver version: 0.4 [Thu Dec 12 07:46:27 2019][ 4.451898] efifb: probing for efifb [Thu Dec 12 07:46:27 2019][ 4.455501] efifb: framebuffer at 0xab000000, mapped to 0xffffb74ad9800000, using 3072k, total 3072k [Thu Dec 12 07:46:27 2019][ 4.464639] efifb: mode is 1024x768x32, linelength=4096, pages=1 [Thu Dec 12 07:46:27 2019][ 4.470653] efifb: scrolling: redraw [Thu Dec 12 07:46:27 2019][ 4.474242] efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 [Thu Dec 12 07:46:27 2019][ 4.495537] Console: switching to colour 
frame buffer device 128x48 [Thu Dec 12 07:46:27 2019][ 4.517279] fb0: EFI VGA frame buffer device [Thu Dec 12 07:46:27 2019][ 4.521656] input: Power Button as /devices/LNXSYSTM:00/device:00/PNP0C0C:00/input/input0 [Thu Dec 12 07:46:27 2019][ 4.529839] ACPI: Power Button [PWRB] [Thu Dec 12 07:46:27 2019][ 4.533569] input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input1 [Thu Dec 12 07:46:27 2019][ 4.540975] ACPI: Power Button [PWRF] [Thu Dec 12 07:46:27 2019][ 4.545844] GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC. [Thu Dec 12 07:46:27 2019][ 4.553330] Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled [Thu Dec 12 07:46:27 2019][ 4.580532] 00:02: ttyS1 at I/O 0x2f8 (irq = 3) is a 16550A [Thu Dec 12 07:46:27 2019][ 4.607071] 00:03: ttyS0 at I/O 0x3f8 (irq = 4) is a 16550A [Thu Dec 12 07:46:27 2019][ 4.613126] Non-volatile memory driver v1.3 [Thu Dec 12 07:46:27 2019][ 4.617356] Linux agpgart interface v0.103 [Thu Dec 12 07:46:27 2019][ 4.623126] crash memory driver: version 1.1 [Thu Dec 12 07:46:27 2019][ 4.627644] rdac: device handler registered [Thu Dec 12 07:46:27 2019][ 4.631887] hp_sw: device handler registered [Thu Dec 12 07:46:27 2019][ 4.636172] emc: device handler registered [Thu Dec 12 07:46:27 2019][ 4.640428] alua: device handler registered [Thu Dec 12 07:46:27 2019][ 4.644665] libphy: Fixed MDIO Bus: probed [Thu Dec 12 07:46:27 2019][ 4.648827] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [Thu Dec 12 07:46:27 2019][ 4.655362] ehci-pci: EHCI PCI platform driver [Thu Dec 12 07:46:27 2019][ 4.659829] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [Thu Dec 12 07:46:27 2019][ 4.666021] ohci-pci: OHCI PCI platform driver [Thu Dec 12 07:46:27 2019][ 4.670485] uhci_hcd: USB Universal Host Controller Interface driver [Thu Dec 12 07:46:27 2019][ 4.676955] xhci_hcd 0000:02:00.3: xHCI Host Controller [Thu Dec 12 07:46:27 2019][ 4.682248] xhci_hcd 0000:02:00.3: new USB bus registered, assigned bus number 1 [Thu Dec 12 07:46:27 2019][ 4.689760] xhci_hcd 0000:02:00.3: hcc params 0x0270f665 hci version 0x100 quirks 0x00000410 [Thu Dec 12 07:46:27 2019][ 4.698500] usb usb1: New USB device found, idVendor=1d6b, idProduct=0002 [Thu Dec 12 07:46:27 2019][ 4.705295] usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [Thu Dec 12 07:46:27 2019][ 4.712527] usb usb1: Product: xHCI Host Controller [Thu Dec 12 07:46:27 2019][ 4.717413] usb usb1: Manufacturer: Linux 3.10.0-957.27.2.el7_lustre.pl2.x86_64 xhci-hcd [Thu Dec 12 07:46:27 2019][ 4.725505] usb usb1: SerialNumber: 0000:02:00.3 [Thu Dec 12 07:46:27 2019][ 4.730241] hub 1-0:1.0: USB hub found [Thu Dec 12 07:46:27 2019][ 4.734005] hub 1-0:1.0: 2 ports detected [Thu Dec 12 07:46:27 2019][ 4.738250] xhci_hcd 0000:02:00.3: xHCI Host Controller [Thu Dec 12 07:46:27 2019][ 4.743532] xhci_hcd 0000:02:00.3: new USB bus registered, assigned bus number 2 [Thu Dec 12 07:46:27 2019][ 4.750947] usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
[Thu Dec 12 07:46:27 2019][ 4.759050] usb usb2: New USB device found, idVendor=1d6b, idProduct=0003 [Thu Dec 12 07:46:27 2019][ 4.765842] usb usb2: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [Thu Dec 12 07:46:27 2019][ 4.773071] usb usb2: Product: xHCI Host Controller [Thu Dec 12 07:46:27 2019][ 4.777959] usb usb2: Manufacturer: Linux 3.10.0-957.27.2.el7_lustre.pl2.x86_64 xhci-hcd [Thu Dec 12 07:46:27 2019][ 4.786052] usb usb2: SerialNumber: 0000:02:00.3 [Thu Dec 12 07:46:27 2019][ 4.790772] hub 2-0:1.0: USB hub found [Thu Dec 12 07:46:27 2019][ 4.794541] hub 2-0:1.0: 2 ports detected [Thu Dec 12 07:46:27 2019][ 4.798852] xhci_hcd 0000:41:00.3: xHCI Host Controller [Thu Dec 12 07:46:27 2019][ 4.804162] xhci_hcd 0000:41:00.3: new USB bus registered, assigned bus number 3 [Thu Dec 12 07:46:27 2019][ 4.811669] xhci_hcd 0000:41:00.3: hcc params 0x0270f665 hci version 0x100 quirks 0x00000410 [Thu Dec 12 07:46:27 2019][ 4.820440] usb usb3: New USB device found, idVendor=1d6b, idProduct=0002 [Thu Dec 12 07:46:27 2019][ 4.827239] usb usb3: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [Thu Dec 12 07:46:27 2019][ 4.834467] usb usb3: Product: xHCI Host Controller [Thu Dec 12 07:46:27 2019][ 4.839355] usb usb3: Manufacturer: Linux 3.10.0-957.27.2.el7_lustre.pl2.x86_64 xhci-hcd [Thu Dec 12 07:46:27 2019][ 4.847449] usb usb3: SerialNumber: 0000:41:00.3 [Thu Dec 12 07:46:27 2019][ 4.852179] hub 3-0:1.0: USB hub found [Thu Dec 12 07:46:27 2019][ 4.855944] hub 3-0:1.0: 2 ports detected [Thu Dec 12 07:46:27 2019][ 4.860193] xhci_hcd 0000:41:00.3: xHCI Host Controller [Thu Dec 12 07:46:27 2019][ 4.865471] xhci_hcd 0000:41:00.3: new USB bus registered, assigned bus number 4 [Thu Dec 12 07:46:27 2019][ 4.872915] usb usb4: We don't know the algorithms for LPM for this host, disabling LPM. [Thu Dec 12 07:46:27 2019][ 4.881028] usb usb4: New USB device found, idVendor=1d6b, idProduct=0003 [Thu Dec 12 07:46:27 2019][ 4.887827] usb usb4: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [Thu Dec 12 07:46:27 2019][ 4.895055] usb usb4: Product: xHCI Host Controller [Thu Dec 12 07:46:27 2019][ 4.899943] usb usb4: Manufacturer: Linux 3.10.0-957.27.2.el7_lustre.pl2.x86_64 xhci-hcd [Thu Dec 12 07:46:27 2019][ 4.908036] usb usb4: SerialNumber: 0000:41:00.3 [Thu Dec 12 07:46:27 2019][ 4.912752] hub 4-0:1.0: USB hub found [Thu Dec 12 07:46:27 2019][ 4.916516] hub 4-0:1.0: 2 ports detected [Thu Dec 12 07:46:27 2019][ 4.920786] usbcore: registered new interface driver usbserial_generic [Thu Dec 12 07:46:27 2019][ 4.927327] usbserial: USB Serial support registered for generic [Thu Dec 12 07:46:27 2019][ 4.933379] i8042: PNP: No PS/2 controller found. Probing ports directly. 
[Thu Dec 12 07:46:27 2019][ 5.171570] usb 3-1: new high-speed USB device number 2 using xhci_hcd [Thu Dec 12 07:46:28 2019][ 5.301546] usb 3-1: New USB device found, idVendor=1604, idProduct=10c0 [Thu Dec 12 07:46:28 2019][ 5.308250] usb 3-1: New USB device strings: Mfr=0, Product=0, SerialNumber=0 [Thu Dec 12 07:46:28 2019][ 5.321060] hub 3-1:1.0: USB hub found [Thu Dec 12 07:46:28 2019][ 5.325044] hub 3-1:1.0: 4 ports detected [Thu Dec 12 07:46:28 2019][ 5.971267] i8042: No controller found [Thu Dec 12 07:46:28 2019][ 5.975033] sched: RT throttling activated [Thu Dec 12 07:46:28 2019][ 5.975042] tsc: Refined TSC clocksource calibration: 1996.249 MHz [Thu Dec 12 07:46:28 2019][ 5.975181] mousedev: PS/2 mouse device common for all mice [Thu Dec 12 07:46:28 2019][ 5.975406] rtc_cmos 00:01: RTC can wake from S4 [Thu Dec 12 07:46:28 2019][ 5.975791] rtc_cmos 00:01: rtc core: registered rtc_cmos as rtc0 [Thu Dec 12 07:46:28 2019][ 5.975892] rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram, hpet irqs [Thu Dec 12 07:46:28 2019][ 5.975948] cpuidle: using governor menu [Thu Dec 12 07:46:28 2019][ 5.976172] EFI Variables Facility v0.08 2004-May-17 [Thu Dec 12 07:46:28 2019][ 5.997170] hidraw: raw HID events driver (C) Jiri Kosina [Thu Dec 12 07:46:28 2019][ 5.997273] usbcore: registered new interface driver usbhid [Thu Dec 12 07:46:28 2019][ 5.997273] usbhid: USB HID core driver [Thu Dec 12 07:46:28 2019][ 5.997360] drop_monitor: Initializing network drop monitor service [Thu Dec 12 07:46:28 2019][ 5.997497] TCP: cubic registered [Thu Dec 12 07:46:28 2019][ 5.997501] Initializing XFRM netlink socket [Thu Dec 12 07:46:28 2019][ 5.997716] NET: Registered protocol family 10 [Thu Dec 12 07:46:28 2019][ 5.998187] NET: Registered protocol family 17 [Thu Dec 12 07:46:28 2019][ 5.998191] mpls_gso: MPLS GSO support [Thu Dec 12 07:46:28 2019][ 5.999264] mce: Using 23 MCE banks [Thu Dec 12 07:46:28 2019][ 5.999310] microcode: CPU0: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 5.999321] microcode: CPU1: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 5.999334] microcode: CPU2: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 5.999345] microcode: CPU3: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 5.999358] microcode: CPU4: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 5.999370] microcode: CPU5: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 5.999385] microcode: CPU6: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 5.999401] microcode: CPU7: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 5.999409] microcode: CPU8: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 5.999418] microcode: CPU9: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 5.999426] microcode: CPU10: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 5.999437] microcode: CPU11: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 5.999447] microcode: CPU12: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 5.999458] microcode: CPU13: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 5.999469] microcode: CPU14: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 5.999480] microcode: CPU15: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003705] microcode: CPU16: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003713] microcode: CPU17: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003723] microcode: CPU18: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003734] microcode: CPU19: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003744] microcode: CPU20: patch_level=0x08001250 
[Thu Dec 12 07:46:28 2019][ 6.003754] microcode: CPU21: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003765] microcode: CPU22: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003776] microcode: CPU23: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003786] microcode: CPU24: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003797] microcode: CPU25: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003805] microcode: CPU26: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003816] microcode: CPU27: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003826] microcode: CPU28: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003837] microcode: CPU29: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003844] microcode: CPU30: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003853] microcode: CPU31: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003860] microcode: CPU32: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003867] microcode: CPU33: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003874] microcode: CPU34: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003883] microcode: CPU35: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003893] microcode: CPU36: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003901] microcode: CPU37: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003909] microcode: CPU38: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003917] microcode: CPU39: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003925] microcode: CPU40: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003933] microcode: CPU41: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003941] microcode: CPU42: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003950] microcode: CPU43: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003957] microcode: CPU44: patch_level=0x08001250 [Thu Dec 12 07:46:28 2019][ 6.003966] microcode: CPU45: patch_level=0x08001250 [Thu Dec 12 07:46:29 2019][ 6.003974] microcode: CPU46: patch_level=0x08001250 [Thu Dec 12 07:46:29 2019][ 6.003982] microcode: CPU47: patch_level=0x08001250 [Thu Dec 12 07:46:29 2019][ 6.004031] microcode: Microcode Update Driver: v2.01 , Peter Oruba [Thu Dec 12 07:46:29 2019][ 6.004191] Loading compiled-in X.509 certificates [Thu Dec 12 07:46:29 2019][ 6.004218] Loaded X.509 cert 'CentOS Linux kpatch signing key: ea0413152cde1d98ebdca3fe6f0230904c9ef717' [Thu Dec 12 07:46:29 2019][ 6.004235] Loaded X.509 cert 'CentOS Linux Driver update signing key: 7f421ee0ab69461574bb358861dbe77762a4201b' [Thu Dec 12 07:46:29 2019][ 6.004580] usb 3-1.1: new high-speed USB device number 3 using xhci_hcd [Thu Dec 12 07:46:29 2019][ 6.004619] Loaded X.509 cert 'CentOS Linux kernel signing key: 468656045a39b52ff2152c315f6198c3e658f24d' [Thu Dec 12 07:46:29 2019][ 6.004633] registered taskstats version 1 [Thu Dec 12 07:46:29 2019][ 6.006719] Key type trusted registered [Thu Dec 12 07:46:29 2019][ 6.008226] Key type encrypted registered [Thu Dec 12 07:46:29 2019][ 6.008271] IMA: No TPM chip found, activating TPM-bypass! 
(rc=-19) [Thu Dec 12 07:46:29 2019][ 6.009989] Magic number: 15:764:790 [Thu Dec 12 07:46:29 2019][ 6.010131] memory memory1725: hash matches [Thu Dec 12 07:46:29 2019][ 6.010151] memory memory1279: hash matches [Thu Dec 12 07:46:29 2019][ 6.010167] memory memory938: hash matches [Thu Dec 12 07:46:29 2019][ 6.017543] rtc_cmos 00:01: setting system clock to 2019-12-12 15:46:28 UTC (1576165588) [Thu Dec 12 07:46:29 2019][ 6.401577] Switched to clocksource tsc [Thu Dec 12 07:46:29 2019][ 6.406593] Freeing unused kernel memory: 1876k freed [Thu Dec 12 07:46:29 2019][ 6.411884] Write protecting the kernel read-only data: 12288k [Thu Dec 12 07:46:29 2019][ 6.413551] usb 3-1.1: New USB device found, idVendor=1604, idProduct=10c0 [Thu Dec 12 07:46:29 2019][ 6.413553] usb 3-1.1: New USB device strings: Mfr=0, Product=0, SerialNumber=0 [Thu Dec 12 07:46:29 2019][ 6.433293] Freeing unused kernel memory: 504k freed [Thu Dec 12 07:46:29 2019][ 6.439653] Freeing unused kernel memory: 596k freed [Thu Dec 12 07:46:29 2019][ 6.441069] hub 3-1.1:1.0: USB hub found [Thu Dec 12 07:46:29 2019][ 6.441424] hub 3-1.1:1.0: 4 ports detected [Thu Dec 12 07:46:29 2019][ 6.499899] systemd[1]: systemd 219 running in system mode. (+PAM +AUDIT +SELINUX +IMA -APPARMOR +SMACK +SYSVINIT +UTMP +LIBCRYPTSETUP +GCRYPT +GNUTLS +ACL +XZ +LZ4 -SECCOMP +BLKID +ELFUTILS +KMOD +IDN) [Thu Dec 12 07:46:29 2019][ 6.505569] usb 3-1.4: new high-speed USB device number 4 using xhci_hcd [Thu Dec 12 07:46:29 2019][ 6.517722] usb 2-1: new SuperSpeed USB device number 2 using xhci_hcd [Thu Dec 12 07:46:29 2019][ 6.531840] usb 2-1: New USB device found, idVendor=0424, idProduct=5744 [Thu Dec 12 07:46:29 2019][ 6.532217] systemd[1]: Detected architecture x86-64. [Thu Dec 12 07:46:29 2019][ 6.532220] systemd[1]: Running in initial RAM disk. [Thu Dec 12 07:46:29 2019][ 6.548564] usb 2-1: New USB device strings: Mfr=2, Product=3, SerialNumber=0 [Thu Dec 12 07:46:29 2019][ 6.548565] usb 2-1: Product: USB5734 [Thu Dec 12 07:46:29 2019][ 6.548567] usb 2-1: Manufacturer: Microchip Tech [Thu Dec 12 07:46:29 2019][ 6.559115] hub 2-1:1.0: USB hub found [Thu Dec 12 07:46:29 2019][ 6.559464] hub 2-1:1.0: 4 ports detected [Thu Dec 12 07:46:29 2019] [Thu Dec 12 07:46:29 2019]Welcome to CentOS Linux 7 (Core) dracut-033-554.el7 (Initramfs)! [Thu Dec 12 07:46:29 2019] [Thu Dec 12 07:46:29 2019][ 6.579552] usb 3-1.4: New USB device found, idVendor=1604, idProduct=10c0 [Thu Dec 12 07:46:29 2019][ 6.586425] usb 3-1.4: New USB device strings: Mfr=0, Product=0, SerialNumber=0 [Thu Dec 12 07:46:29 2019][ 6.593876] systemd[1]: Set hostname to . [Thu Dec 12 07:46:29 2019][ 6.601072] hub 3-1.4:1.0: USB hub found [Thu Dec 12 07:46:29 2019][ 6.605177] hub 3-1.4:1.0: 4 ports detected [Thu Dec 12 07:46:29 2019][ 6.629314] systemd[1]: Reached target Swap. [Thu Dec 12 07:46:29 2019][ OK ] Reached target Swap. [Thu Dec 12 07:46:29 2019][ 6.637650] systemd[1]: Reached target Local File Systems. [Thu Dec 12 07:46:29 2019][ 6.641572] usb 1-1: new high-speed USB device number 2 using xhci_hcd [Thu Dec 12 07:46:29 2019][ OK ] Reached target Local File Systems. [Thu Dec 12 07:46:29 2019][ 6.655838] systemd[1]: Created slice Root Slice. [Thu Dec 12 07:46:29 2019][ OK ] Created slice Root Slice. [Thu Dec 12 07:46:29 2019][ 6.666662] systemd[1]: Listening on udev Kernel Socket. [Thu Dec 12 07:46:29 2019][ OK ] Listening on udev Kernel Socket. [Thu Dec 12 07:46:29 2019][ 6.677686] systemd[1]: Created slice System Slice. [Thu Dec 12 07:46:29 2019][ OK ] Created slice System Slice. 
[Thu Dec 12 07:46:29 2019][ 6.688653] systemd[1]: Listening on udev Control Socket. [Thu Dec 12 07:46:29 2019][ OK ] Listening on udev Control Socket. [Thu Dec 12 07:46:29 2019][ 6.699636] systemd[1]: Reached target Slices. [Thu Dec 12 07:46:29 2019][ OK ] Reached target Slices. [Thu Dec 12 07:46:29 2019][ 6.708655] systemd[1]: Listening on Journal Socket. [Thu Dec 12 07:46:29 2019][ OK ] Listening on Journal Socket. [Thu Dec 12 07:46:29 2019][ 6.720157] systemd[1]: Starting Create list of required static device nodes for the current kernel... [Thu Dec 12 07:46:29 2019] Starting Create list of required st... nodes for the current kernel... [Thu Dec 12 07:46:29 2019][ 6.738020] systemd[1]: Starting Journal Service... [Thu Dec 12 07:46:29 2019] Starting Journal Service... [Thu Dec 12 07:46:29 2019][ 6.746628] systemd[1]: Reached target Sockets. [Thu Dec 12 07:46:29 2019][ OK ] Reached target Sockets. [Thu Dec 12 07:46:29 2019][ 6.756122] systemd[1]: Starting dracut cmdline hook... [Thu Dec 12 07:46:29 2019] Starting dracut cmdline hook... [Thu Dec 12 07:46:29 2019][ 6.767026] systemd[1]: Starting Setup Virtual Console... [Thu Dec 12 07:46:29 2019] Starting Setup Virtual Console... [Thu Dec 12 07:46:29 2019][ 6.775469] usb 1-1: New USB device found, idVendor=0424, idProduct=2744 [Thu Dec 12 07:46:29 2019][ 6.775471] usb 1-1: New USB device strings: Mfr=1, Product=2, SerialNumber=0 [Thu Dec 12 07:46:29 2019][ 6.775472] usb 1-1: Product: USB2734 [Thu Dec 12 07:46:29 2019][ 6.775474] usb 1-1: Manufacturer: Microchip Tech [Thu Dec 12 07:46:29 2019][ 6.776623] systemd[1]: Reached target Timers. [Thu Dec 12 07:46:29 2019][ 6.799127] hub 1-1:1.0: USB hub found [Thu Dec 12 07:46:29 2019][ 6.799466] hub 1-1:1.0: 4 ports detected [Thu Dec 12 07:46:29 2019][ OK ] Reached target Timers. [Thu Dec 12 07:46:29 2019][ 6.831173] systemd[1]: Starting Apply Kernel Variables... [Thu Dec 12 07:46:29 2019] Starting Apply Kernel Variables... [Thu Dec 12 07:46:29 2019][ 6.841890] systemd[1]: Started Journal Service. [Thu Dec 12 07:46:29 2019][ OK ] Started Journal Service. [Thu Dec 12 07:46:29 2019][ OK ] Started Create list of required sta...ce nodes for the current kernel. [Thu Dec 12 07:46:29 2019][ OK ] Started dracut cmdline hook. [Thu Dec 12 07:46:29 2019][ OK ] Started Setup Virtual Console. [Thu Dec 12 07:46:29 2019][ OK ] Started Apply Kernel Variables. [Thu Dec 12 07:46:29 2019] Starting dracut pre-udev hook... [Thu Dec 12 07:46:29 2019] Starting Create Static Device Nodes in /dev... [Thu Dec 12 07:46:29 2019][ OK ] Started Create Static Device Nodes in /dev. [Thu Dec 12 07:46:29 2019][ OK ] Started dracut pre-udev hook. [Thu Dec 12 07:46:29 2019] Starting udev Kernel Device Manager... [Thu Dec 12 07:46:29 2019][ OK ] Started udev Kernel Device Manager. [Thu Dec 12 07:46:29 2019] Starting udev Coldplug all Devices... [Thu Dec 12 07:46:29 2019] Mounting Configuration File System... [Thu Dec 12 07:46:29 2019][ OK ] Mounted Configuration File System. [Thu Dec 12 07:46:29 2019][ 6.981368] pps_core: LinuxPPS API ver. 1 registered [Thu Dec 12 07:46:29 2019][ 6.986345] pps_core: Software ver. 
5.3.6 - Copyright 2005-2007 Rodolfo Giometti [Thu Dec 12 07:46:29 2019][ 6.999422] megasas: 07.705.02.00-rh1 [Thu Dec 12 07:46:29 2019][ 7.003412] megaraid_sas 0000:c1:00.0: FW now in Ready state [Thu Dec 12 07:46:29 2019][ 7.010289] megaraid_sas 0000:c1:00.0: 64 bit DMA mask and 32 bit consistent mask [Thu Dec 12 07:46:29 2019][ 7.019390] PTP clock support registered [Thu Dec 12 07:46:29 2019][ OK ] Started udev Coldplug all Devices. [Thu Dec 12 07:46:29 2019][ 7.021484] megaraid_sas 0000:c1:00.0: firmware supports msix : (96) [Thu Dec 12 07:46:29 2019][ 7.021486] megaraid_sas 0000:c1:00.0: current msix/online cpus : (48/48) [Thu Dec 12 07:46:29 2019][ 7.021487] megaraid_sas 0000:c1:00.0: RDPQ mode : (disabled) [Thu Dec 12 07:46:29 2019][ 7.021490] megaraid_sas 0000:c1:00.0: Current firmware supports maximum commands: 928 LDIO threshold: 237 [Thu Dec 12 07:46:29 2019][ 7.021818] megaraid_sas 0000:c1:00.0: Configured max firmware commands: 927 [Thu Dec 12 07:46:29 2019][ 7.024216] megaraid_sas 0000:c1:00.0: FW supports sync cache : No [Thu Dec 12 07:46:29 2019][ 7.067697] mlx_compat: loading out-of-tree module taints kernel. [Thu Dec 12 07:46:29 2019][ 7.067835] mpt3sas: loading out-of-tree module taints kernel. [Thu Dec 12 07:46:29 2019][ 7.079837] mpt3sas: module verification failed: signature and/or required key missing - tainting kernel [Thu Dec 12 07:46:29 2019] Starting Show Plymouth Boot Screen... [Thu Dec 12 07:46:29 2019][ 7.096477] Compat-mlnx-ofed backport release: 1c4bf42 [Thu Dec 12 07:46:29 2019][ 7.101630] Backport based on mlnx_ofed/mlnx-ofa_kernel-4.0.git 1c4bf42 [Thu Dec 12 07:46:29 2019][ 7.101631] compat.git: mlnx_ofed/mlnx-ofa_kernel-4.0.git [Thu Dec 12 07:46:29 2019][ 7.115395] tg3.c:v3.137 (May 11, 2014) [Thu Dec 12 07:46:29 2019][ 7.130487] mpt3sas version 31.00.00.00 loaded [Thu Dec 12 07:46:29 2019][ 7.135671] mpt3sas_cm0: 63 BIT PCI BUS DMA ADDRESSING SUPPORTED, total mem (263564420 kB) [Thu Dec 12 07:46:29 2019][ 7.136317] tg3 0000:81:00.0 eth0: Tigon3 [partno(BCM95720) rev 5720000] (PCI Express) MAC address 4c:d9:8f:7d:ad:d7 [Thu Dec 12 07:46:29 2019][ 7.136319] tg3 0000:81:00.0 eth0: attached PHY is 5720C (10/100/1000Base-T Ethernet) (WireSpeed[1], EEE[1]) [Thu Dec 12 07:46:29 2019][ 7.136321] tg3 0000:81:00.0 eth0: RXcsums[1] LinkChgREG[0] MIirq[0] ASF[1] TSOcap[1] [Thu Dec 12 07:46:29 2019][ 7.136323] tg3 0000:81:00.0 eth0: dma_rwctrl[00000001] dma_mask[64-bit] [Thu Dec 12 07:46:29 2019] Starting dracut initqueue hook... [Thu Dec 12 07:46:29 2019][ 7.156722] tg3 0000:81:00.1 eth1: Tigon3 [partno(BCM95720) rev 5720000] (PCI Express) MAC address 4c:d9:8f:7d:ad:d8 [Thu Dec 12 07:46:29 2019][ 7.156725] tg3 0000:81:00.1 eth1: attached PHY is 5720C (10/100/1000Base-T Ethernet) (WireSpeed[1], EEE[1]) [Thu Dec 12 07:46:29 2019][ 7.156726] tg3 0000:81:00.1 eth1: RXcsums[1] LinkChgREG[0] MIirq[0] ASF[1] TSOcap[1] [Thu Dec 12 07:46:29 2019][ 7.156728] tg3 0000:81:00.1 eth1: dma_rwctrl[00000001] dma_mask[64-bit] [Thu Dec 12 07:46:29 2019][ OK ] Reached target System Initialization. [Thu Dec 12 07:46:29 2019][ OK ] Started Show Plymouth Boot Screen. [Thu Dec 12 07:46:29 2019][ 7.229780] ahci 0000:86:00.2: AHCI 0001.0301 32 slots 1 ports 6 Gbps 0x1 impl SATA mode [Thu Dec 12 07:46:29 2019][ 7.239054] ahci 0000:86:00.2: flags: 64bit ncq sntf ilck pm led clo only pmp fbs pio slum part [Thu Dec 12 07:46:29 2019][ OK ] Started Forward Password Requests to Plymouth Directory Watch.
[Thu Dec 12 07:46:29 2019][ 7.250890] scsi host2: ahci [Thu Dec 12 07:46:29 2019][ 7.253185] mlx5_core 0000:01:00.0: firmware version: 20.26.1040 [Thu Dec 12 07:46:29 2019][ 7.253214] mlx5_core 0000:01:00.0: 126.016 Gb/s available PCIe bandwidth, limited by 8 GT/s x16 link at 0000:00:03.1 (capable of 252.048 Gb/s with 16 GT/s x16 link) [Thu Dec 12 07:46:29 2019][ OK ] Reached target Paths. [Thu Dec 12 07:46:30 2019][ 7.278593] ata1: SATA max UDMA/133 abar m4096@0xc0a02000 port 0xc0a02100 irq 120 [Thu Dec 12 07:46:30 2019][ OK ] Reached target Basic System. [Thu Dec 12 07:46:30 2019][ 7.339575] mpt3sas_cm0: IOC Number : 0 [Thu Dec 12 07:46:30 2019][ 7.344930] mpt3sas0-msix0: PCI-MSI-X enabled: IRQ 137 [Thu Dec 12 07:46:30 2019][ 7.350067] mpt3sas0-msix1: PCI-MSI-X enabled: IRQ 138 [Thu Dec 12 07:46:30 2019][ 7.355210] mpt3sas0-msix2: PCI-MSI-X enabled: IRQ 139 [Thu Dec 12 07:46:30 2019][ 7.360346] mpt3sas0-msix3: PCI-MSI-X enabled: IRQ 140 [Thu Dec 12 07:46:30 2019][ 7.365486] mpt3sas0-msix4: PCI-MSI-X enabled: IRQ 141 [Thu Dec 12 07:46:30 2019][ 7.370625] mpt3sas0-msix5: PCI-MSI-X enabled: IRQ 142 [Thu Dec 12 07:46:30 2019][ 7.375765] mpt3sas0-msix6: PCI-MSI-X enabled: IRQ 143 [Thu Dec 12 07:46:30 2019][ 7.380909] mpt3sas0-msix7: PCI-MSI-X enabled: IRQ 144 [Thu Dec 12 07:46:30 2019][ 7.380910] mpt3sas0-msix8: PCI-MSI-X enabled: IRQ 145 [Thu Dec 12 07:46:30 2019][ 7.380910] mpt3sas0-msix9: PCI-MSI-X enabled: IRQ 146 [Thu Dec 12 07:46:30 2019][ 7.380910] mpt3sas0-msix10: PCI-MSI-X enabled: IRQ 147 [Thu Dec 12 07:46:30 2019][ 7.380911] mpt3sas0-msix11: PCI-MSI-X enabled: IRQ 148 [Thu Dec 12 07:46:30 2019][ 7.380912] mpt3sas0-msix12: PCI-MSI-X enabled: IRQ 149 [Thu Dec 12 07:46:30 2019][ 7.380915] mpt3sas0-msix13: PCI-MSI-X enabled: IRQ 150 [Thu Dec 12 07:46:30 2019][ 7.380916] mpt3sas0-msix14: PCI-MSI-X enabled: IRQ 151 [Thu Dec 12 07:46:30 2019][ 7.380916] mpt3sas0-msix15: PCI-MSI-X enabled: IRQ 152 [Thu Dec 12 07:46:30 2019][ 7.380917] mpt3sas0-msix16: PCI-MSI-X enabled: IRQ 153 [Thu Dec 12 07:46:30 2019][ 7.380917] mpt3sas0-msix17: PCI-MSI-X enabled: IRQ 154 [Thu Dec 12 07:46:30 2019][ 7.380918] mpt3sas0-msix18: PCI-MSI-X enabled: IRQ 155 [Thu Dec 12 07:46:30 2019][ 7.380918] mpt3sas0-msix19: PCI-MSI-X enabled: IRQ 156 [Thu Dec 12 07:46:30 2019][ 7.380919] mpt3sas0-msix20: PCI-MSI-X enabled: IRQ 157 [Thu Dec 12 07:46:30 2019][ 7.380919] mpt3sas0-msix21: PCI-MSI-X enabled: IRQ 158 [Thu Dec 12 07:46:30 2019][ 7.380920] mpt3sas0-msix22: PCI-MSI-X enabled: IRQ 159 [Thu Dec 12 07:46:30 2019][ 7.380920] mpt3sas0-msix23: PCI-MSI-X enabled: IRQ 160 [Thu Dec 12 07:46:30 2019][ 7.380920] mpt3sas0-msix24: PCI-MSI-X enabled: IRQ 161 [Thu Dec 12 07:46:30 2019][ 7.380921] mpt3sas0-msix25: PCI-MSI-X enabled: IRQ 162 [Thu Dec 12 07:46:30 2019][ 7.380921] mpt3sas0-msix26: PCI-MSI-X enabled: IRQ 163 [Thu Dec 12 07:46:30 2019][ 7.380922] mpt3sas0-msix27: PCI-MSI-X enabled: IRQ 164 [Thu Dec 12 07:46:30 2019][ 7.380922] mpt3sas0-msix28: PCI-MSI-X enabled: IRQ 165 [Thu Dec 12 07:46:30 2019][ 7.380923] mpt3sas0-msix29: PCI-MSI-X enabled: IRQ 166 [Thu Dec 12 07:46:30 2019][ 7.380923] mpt3sas0-msix30: PCI-MSI-X enabled: IRQ 167 [Thu Dec 12 07:46:30 2019][ 7.380924] mpt3sas0-msix31: PCI-MSI-X enabled: IRQ 168 [Thu Dec 12 07:46:30 2019][ 7.380924] mpt3sas0-msix32: PCI-MSI-X enabled: IRQ 169 [Thu Dec 12 07:46:30 2019][ 7.380925] mpt3sas0-msix33: PCI-MSI-X enabled: IRQ 170 [Thu Dec 12 07:46:30 2019][ 7.380925] mpt3sas0-msix34: PCI-MSI-X enabled: IRQ 171 [Thu Dec 12 07:46:30 2019][ 7.380926] mpt3sas0-msix35:
PCI-MSI-X enabled: IRQ 172 [Thu Dec 12 07:46:30 2019][ 7.380926] mpt3sas0-msix36: PCI-MSI-X enabled: IRQ 173 [Thu Dec 12 07:46:30 2019][ 7.380926] mpt3sas0-msix37: PCI-MSI-X enabled: IRQ 174 [Thu Dec 12 07:46:30 2019][ 7.380927] mpt3sas0-msix38: PCI-MSI-X enabled: IRQ 175 [Thu Dec 12 07:46:30 2019][ 7.380927] mpt3sas0-msix39: PCI-MSI-X enabled: IRQ 176 [Thu Dec 12 07:46:30 2019][ 7.380928] mpt3sas0-msix40: PCI-MSI-X enabled: IRQ 177 [Thu Dec 12 07:46:30 2019][ 7.380928] mpt3sas0-msix41: PCI-MSI-X enabled: IRQ 178 [Thu Dec 12 07:46:30 2019][ 7.380929] mpt3sas0-msix42: PCI-MSI-X enabled: IRQ 179 [Thu Dec 12 07:46:30 2019][ 7.380929] mpt3sas0-msix43: PCI-MSI-X enabled: IRQ 180 [Thu Dec 12 07:46:30 2019][ 7.380930] mpt3sas0-msix44: PCI-MSI-X enabled: IRQ 181 [Thu Dec 12 07:46:30 2019][ 7.380930] mpt3sas0-msix45: PCI-MSI-X enabled: IRQ 182 [Thu Dec 12 07:46:30 2019][ 7.380931] mpt3sas0-msix46: PCI-MSI-X enabled: IRQ 183 [Thu Dec 12 07:46:30 2019][ 7.380931] mpt3sas0-msix47: PCI-MSI-X enabled: IRQ 184 [Thu Dec 12 07:46:30 2019][ 7.380933] mpt3sas_cm0: iomem(0x00000000ac000000), mapped(0xffffb74ada200000), size(1048576) [Thu Dec 12 07:46:30 2019][ 7.380933] mpt3sas_cm0: ioport(0x0000000000008000), size(256) [Thu Dec 12 07:46:30 2019][ 7.381578] megaraid_sas 0000:c1:00.0: Init cmd return status SUCCESS for SCSI host 0 [Thu Dec 12 07:46:30 2019][ 7.402576] megaraid_sas 0000:c1:00.0: firmware type : Legacy(64 VD) firmware [Thu Dec 12 07:46:30 2019][ 7.402578] megaraid_sas 0000:c1:00.0: controller type : iMR(0MB) [Thu Dec 12 07:46:30 2019][ 7.402579] megaraid_sas 0000:c1:00.0: Online Controller Reset(OCR) : Enabled [Thu Dec 12 07:46:30 2019][ 7.402580] megaraid_sas 0000:c1:00.0: Secure JBOD support : No [Thu Dec 12 07:46:30 2019][ 7.402581] megaraid_sas 0000:c1:00.0: NVMe passthru support : No [Thu Dec 12 07:46:30 2019][ 7.424093] megaraid_sas 0000:c1:00.0: INIT adapter done [Thu Dec 12 07:46:30 2019][ 7.424096] megaraid_sas 0000:c1:00.0: Jbod map is not supported megasas_setup_jbod_map 5146 [Thu Dec 12 07:46:30 2019][ 7.450189] megaraid_sas 0000:c1:00.0: pci id : (0x1000)/(0x005f)/(0x1028)/(0x1f4b) [Thu Dec 12 07:46:30 2019][ 7.450190] megaraid_sas 0000:c1:00.0: unevenspan support : yes [Thu Dec 12 07:46:30 2019][ 7.450191] megaraid_sas 0000:c1:00.0: firmware crash dump : no [Thu Dec 12 07:46:30 2019][ 7.450192] megaraid_sas 0000:c1:00.0: jbod sync map : no [Thu Dec 12 07:46:30 2019][ 7.450197] scsi host0: Avago SAS based MegaRAID driver [Thu Dec 12 07:46:30 2019][ 7.470315] scsi 0:2:0:0: Direct-Access DELL PERC H330 Mini 4.30 PQ: 0 ANSI: 5 [Thu Dec 12 07:46:30 2019][ 7.472571] mpt3sas_cm0: IOC Number : 0 [Thu Dec 12 07:46:30 2019][ 7.472574] mpt3sas_cm0: sending message unit reset !! [Thu Dec 12 07:46:30 2019][ 7.475571] mpt3sas_cm0: message unit reset: SUCCESS [Thu Dec 12 07:46:30 2019][ 7.512082] mlx5_core 0000:01:00.0: Port module event: module 0, Cable plugged [Thu Dec 12 07:46:30 2019][ 7.512331] mlx5_core 0000:01:00.0: mlx5_pcie_event:303:(pid 316): PCIe slot advertised sufficient power (27W). 
[Thu Dec 12 07:46:30 2019][ 7.519741] mlx5_core 0000:01:00.0: mlx5_fw_tracer_start:776:(pid 481): FWTracer: Ownership granted and active [Thu Dec 12 07:46:30 2019][ 7.591590] ata1: SATA link down (SStatus 0 SControl 300) [Thu Dec 12 07:46:30 2019][ 7.656150] mpt3sas_cm0: Allocated physical memory: size(38831 kB) [Thu Dec 12 07:46:30 2019][ 7.656151] mpt3sas_cm0: Current Controller Queue Depth(7564), Max Controller Queue Depth(7680) [Thu Dec 12 07:46:30 2019][ 7.656152] mpt3sas_cm0: Scatter Gather Elements per IO(128) [Thu Dec 12 07:46:30 2019][ 7.782603] mlx5_ib: Mellanox Connect-IB Infiniband driver v4.7-1.0.0 [Thu Dec 12 07:46:30 2019][ 7.806178] mpt3sas_cm0: FW Package Version(12.00.00.00) [Thu Dec 12 07:46:30 2019][ 7.811815] mpt3sas_cm0: SAS3616: FWVersion(12.00.00.00), ChipRevision(0x02), BiosVersion(09.21.00.00) [Thu Dec 12 07:46:30 2019][ 7.821135] mpt3sas_cm0: Protocol=(Initiator,Target,NVMe), Capabilities=(TLR,EEDP,Diag Trace Buffer,Task Set Full,NCQ) [Thu Dec 12 07:46:30 2019][ 7.832180] mpt3sas 0000:84:00.0: Enabled Extended Tags as Controller Supports [Thu Dec 12 07:46:30 2019][ 7.839420] mpt3sas_cm0: : host protection capabilities enabled DIF1 DIF2 DIF3 [Thu Dec 12 07:46:30 2019][ 7.846735] scsi host1: Fusion MPT SAS Host [Thu Dec 12 07:46:30 2019][ 7.851162] mpt3sas_cm0: registering trace buffer support [Thu Dec 12 07:46:30 2019][ 7.851469] sd 0:2:0:0: [sda] 467664896 512-byte logical blocks: (239 GB/223 GiB) [Thu Dec 12 07:46:30 2019][ 7.851625] sd 0:2:0:0: [sda] Write Protect is off [Thu Dec 12 07:46:30 2019][ 7.851666] sd 0:2:0:0: [sda] Write cache: disabled, read cache: disabled, supports DPO and FUA [Thu Dec 12 07:46:30 2019][ 7.853904] sda: sda1 sda2 sda3 [Thu Dec 12 07:46:30 2019][ 7.854268] sd 0:2:0:0: [sda] Attached SCSI disk [Thu Dec 12 07:46:30 2019][ 7.889722] mpt3sas_cm0: Trace buffer memory 2048 KB allocated [Thu Dec 12 07:46:30 2019][ 7.895557] mpt3sas_cm0: sending port enable !! 
[Thu Dec 12 07:46:30 2019][ 7.900747] mpt3sas_cm0: hba_port entry: ffff8dc7f6f41100, port: 255 is added to hba_port list [Thu Dec 12 07:46:30 2019][ 7.912147] mpt3sas_cm0: host_add: handle(0x0001), sas_addr(0x500605b00e718b40), phys(21) [Thu Dec 12 07:46:30 2019][ 7.922722] mpt3sas_cm0: detecting: handle(0x0011), sas_address(0x300705b00e718b40), phy(16) [Thu Dec 12 07:46:30 2019][ 7.931160] mpt3sas_cm0: REPORT_LUNS: handle(0x0011), retries(0) [Thu Dec 12 07:46:30 2019][ 7.937197] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0011), lun(0) [Thu Dec 12 07:46:30 2019][ 7.943644] scsi 1:0:0:0: Enclosure LSI VirtualSES 03 PQ: 0 ANSI: 7 [Thu Dec 12 07:46:30 2019][ 7.951774] scsi 1:0:0:0: set ignore_delay_remove for handle(0x0011) [Thu Dec 12 07:46:30 2019][ 7.958127] scsi 1:0:0:0: SES: handle(0x0011), sas_addr(0x300705b00e718b40), phy(16), device_name(0x300705b00e718b40) [Thu Dec 12 07:46:30 2019][ 7.968726] scsi 1:0:0:0: enclosure logical id(0x300605b00e118b40), slot(16) [Thu Dec 12 07:46:30 2019][ 7.975857] scsi 1:0:0:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:30 2019][ 7.976357] random: crng init done [Thu Dec 12 07:46:30 2019][ 7.985983] scsi 1:0:0:0: serial_number(300605B00E118B40) [Thu Dec 12 07:46:30 2019][ 7.991385] scsi 1:0:0:0: qdepth(1), tagged(0), simple(0), ordered(0), scsi_level(8), cmd_que(0) [Thu Dec 12 07:46:30 2019][ 8.000183] mpt3sas_cm0: log_info(0x31200206): originator(PL), code(0x20), sub_code(0x0206) [Thu Dec 12 07:46:30 2019][ 8.028309] mpt3sas_cm0: expander_add: handle(0x0058), parent(0x0001), sas_addr(0x5000ccab040371fd), phys(49) [Thu Dec 12 07:46:30 2019][ 8.048115] mpt3sas_cm0: detecting: handle(0x005c), sas_address(0x5000ccab040371fc), phy(48) [Thu Dec 12 07:46:30 2019][ 8.056565] mpt3sas_cm0: REPORT_LUNS: handle(0x005c), retries(0) [Thu Dec 12 07:46:30 2019][ 8.063058] mpt3sas_cm0: TEST_UNIT_READY: handle(0x005c), lun(0) [Thu Dec 12 07:46:30 2019][ 8.070212] scsi 1:0:1:0: Enclosure HGST H4060-J 2033 PQ: 0 ANSI: 6 [Thu Dec 12 07:46:30 2019][ 8.078561] scsi 1:0:1:0: set ignore_delay_remove for handle(0x005c) [Thu Dec 12 07:46:30 2019][ 8.084911] scsi 1:0:1:0: SES: handle(0x005c), sas_addr(0x5000ccab040371fc), phy(48), device_name(0x0000000000000000) [Thu Dec 12 07:46:30 2019][ 8.095511] scsi 1:0:1:0: enclosure logical id(0x5000ccab04037180), slot(60) [Thu Dec 12 07:46:30 2019][ 8.102643] scsi 1:0:1:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:30 2019][ 8.109360] scsi 1:0:1:0: serial_number(USWSJ03918EZ0028 ) [Thu Dec 12 07:46:30 2019][ 8.115109] scsi 1:0:1:0: qdepth(1), tagged(0), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:30 2019][ 8.145244] mpt3sas_cm0: expander_add: handle(0x005a), parent(0x0058), sas_addr(0x5000ccab040371f9), phys(68) [Thu Dec 12 07:46:30 2019][ 8.165175] mpt3sas_cm0: detecting: handle(0x005d), sas_address(0x5000cca2525f2a26), phy(0) [Thu Dec 12 07:46:30 2019][ 8.173544] mpt3sas_cm0: REPORT_LUNS: handle(0x005d), retries(0) [Thu Dec 12 07:46:30 2019][ 8.179681] mpt3sas_cm0: TEST_UNIT_READY: handle(0x005d), lun(0) [Thu Dec 12 07:46:30 2019][ 8.186338] scsi 1:0:2:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:30 2019][ 8.194562] scsi 1:0:2:0: SSP: handle(0x005d), sas_addr(0x5000cca2525f2a26), phy(0), device_name(0x5000cca2525f2a27) [Thu Dec 12 07:46:30 2019][ 8.205076] scsi 1:0:2:0: enclosure logical id(0x5000ccab04037180), slot(0) [Thu Dec 12 07:46:30 2019][ 8.212121] scsi 1:0:2:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:30 2019][ 
8.218840] scsi 1:0:2:0: serial_number( 7SHPAG1W) [Thu Dec 12 07:46:30 2019][ 8.224239] scsi 1:0:2:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:30 2019][ 8.249761] mpt3sas_cm0: detecting: handle(0x005e), sas_address(0x5000cca2525e977e), phy(1) [Thu Dec 12 07:46:30 2019][ 8.258112] mpt3sas_cm0: REPORT_LUNS: handle(0x005e), retries(0) [Thu Dec 12 07:46:30 2019][ 8.264247] mpt3sas_cm0: TEST_UNIT_READY: handle(0x005e), lun(0) [Thu Dec 12 07:46:30 2019][ 8.270901] scsi 1:0:3:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:30 2019][ 8.279114] scsi 1:0:3:0: SSP: handle(0x005e), sas_addr(0x5000cca2525e977e), phy(1), device_name(0x5000cca2525e977f) [Thu Dec 12 07:46:30 2019][ 8.289629] scsi 1:0:3:0: enclosure logical id(0x5000ccab04037180), slot(2) [Thu Dec 12 07:46:30 2019][ 8.296674] scsi 1:0:3:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:30 2019][ 8.303390] scsi 1:0:3:0: serial_number( 7SHP0P8W) [Thu Dec 12 07:46:31 2019][ 8.308793] scsi 1:0:3:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:31 2019][ 8.328758] mpt3sas_cm0: detecting: handle(0x005f), sas_address(0x5000cca2525ed2be), phy(2) [Thu Dec 12 07:46:31 2019][ 8.337110] mpt3sas_cm0: REPORT_LUNS: handle(0x005f), retries(0) [Thu Dec 12 07:46:31 2019][ 8.343272] mpt3sas_cm0: TEST_UNIT_READY: handle(0x005f), lun(0) [Thu Dec 12 07:46:31 2019][ 8.349907] scsi 1:0:4:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:31 2019][ 8.358118] scsi 1:0:4:0: SSP: handle(0x005f), sas_addr(0x5000cca2525ed2be), phy(2), device_name(0x5000cca2525ed2bf) [Thu Dec 12 07:46:31 2019][ 8.368625] scsi 1:0:4:0: enclosure logical id(0x5000ccab04037180), slot(11) [Thu Dec 12 07:46:31 2019][ 8.375760] scsi 1:0:4:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:31 2019][ 8.382477] scsi 1:0:4:0: serial_number( 7SHP4MLW) [Thu Dec 12 07:46:31 2019][ 8.387877] scsi 1:0:4:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:31 2019][ 8.410759] mpt3sas_cm0: detecting: handle(0x0060), sas_address(0x5000cca2525ec04a), phy(3) [Thu Dec 12 07:46:31 2019][ 8.419106] mpt3sas_cm0: REPORT_LUNS: handle(0x0060), retries(0) [Thu Dec 12 07:46:31 2019][ 8.425273] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0060), lun(0) [Thu Dec 12 07:46:31 2019][ 8.431959] scsi 1:0:5:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:31 2019][ 8.440179] scsi 1:0:5:0: SSP: handle(0x0060), sas_addr(0x5000cca2525ec04a), phy(3), device_name(0x5000cca2525ec04b) [Thu Dec 12 07:46:31 2019][ 8.450693] scsi 1:0:5:0: enclosure logical id(0x5000ccab04037180), slot(12) [Thu Dec 12 07:46:31 2019][ 8.457826] scsi 1:0:5:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:31 2019][ 8.464543] scsi 1:0:5:0: serial_number( 7SHP3DHW) [Thu Dec 12 07:46:31 2019][ 8.469944] scsi 1:0:5:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:31 2019][ 8.489762] mpt3sas_cm0: detecting: handle(0x0061), sas_address(0x5000cca2525ff612), phy(4) [Thu Dec 12 07:46:31 2019][ 8.498111] mpt3sas_cm0: REPORT_LUNS: handle(0x0061), retries(0) [Thu Dec 12 07:46:31 2019][ 8.504253] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0061), lun(0) [Thu Dec 12 07:46:31 2019][ 8.510916] scsi 1:0:6:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:31 2019][ 8.519142] scsi 1:0:6:0: SSP: handle(0x0061), sas_addr(0x5000cca2525ff612), phy(4), 
device_name(0x5000cca2525ff613) [Thu Dec 12 07:46:31 2019][ 8.529656] scsi 1:0:6:0: enclosure logical id(0x5000ccab04037180), slot(13) [Thu Dec 12 07:46:31 2019][ 8.536787] scsi 1:0:6:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:31 2019][ 8.543503] scsi 1:0:6:0: serial_number( 7SHPT11W) [Thu Dec 12 07:46:31 2019][ 8.548907] scsi 1:0:6:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:31 2019][ 8.568767] mpt3sas_cm0: detecting: handle(0x0062), sas_address(0x5000cca2526016ee), phy(5) [Thu Dec 12 07:46:31 2019][ 8.577128] mpt3sas_cm0: REPORT_LUNS: handle(0x0062), retries(0) [Thu Dec 12 07:46:31 2019][ 8.583271] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0062), lun(0) [Thu Dec 12 07:46:31 2019][ 8.589913] scsi 1:0:7:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:31 2019][ 8.598126] scsi 1:0:7:0: SSP: handle(0x0062), sas_addr(0x5000cca2526016ee), phy(5), device_name(0x5000cca2526016ef) [Thu Dec 12 07:46:31 2019][ 8.608633] scsi 1:0:7:0: enclosure logical id(0x5000ccab04037180), slot(14) [Thu Dec 12 07:46:31 2019][ 8.615768] scsi 1:0:7:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:31 2019][ 8.622484] scsi 1:0:7:0: serial_number( 7SHPV6WW) [Thu Dec 12 07:46:31 2019][ 8.627884] scsi 1:0:7:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:31 2019][ 8.647762] mpt3sas_cm0: detecting: handle(0x0063), sas_address(0x5000cca2525f4872), phy(6) [Thu Dec 12 07:46:31 2019][ 8.656114] mpt3sas_cm0: REPORT_LUNS: handle(0x0063), retries(0) [Thu Dec 12 07:46:31 2019][ 8.662247] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0063), lun(0) [Thu Dec 12 07:46:31 2019][ 8.668866] scsi 1:0:8:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:31 2019][ 8.677080] scsi 1:0:8:0: SSP: handle(0x0063), sas_addr(0x5000cca2525f4872), phy(6), device_name(0x5000cca2525f4873) [Thu Dec 12 07:46:31 2019][ 8.687588] scsi 1:0:8:0: enclosure logical id(0x5000ccab04037180), slot(15) [Thu Dec 12 07:46:31 2019][ 8.694720] scsi 1:0:8:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:31 2019][ 8.701440] scsi 1:0:8:0: serial_number( 7SHPDGLW) [Thu Dec 12 07:46:31 2019][ 8.706840] scsi 1:0:8:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:31 2019][ 8.726760] mpt3sas_cm0: detecting: handle(0x0064), sas_address(0x5000cca2525f568e), phy(7) [Thu Dec 12 07:46:31 2019][ 8.735114] mpt3sas_cm0: REPORT_LUNS: handle(0x0064), retries(0) [Thu Dec 12 07:46:31 2019][ 8.741286] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0064), lun(0) [Thu Dec 12 07:46:31 2019][ 8.747955] scsi 1:0:9:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:31 2019][ 8.756167] scsi 1:0:9:0: SSP: handle(0x0064), sas_addr(0x5000cca2525f568e), phy(7), device_name(0x5000cca2525f568f) [Thu Dec 12 07:46:31 2019][ 8.766682] scsi 1:0:9:0: enclosure logical id(0x5000ccab04037180), slot(16) [Thu Dec 12 07:46:31 2019][ 8.773813] scsi 1:0:9:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:31 2019][ 8.780532] scsi 1:0:9:0: serial_number( 7SHPEDRW) [Thu Dec 12 07:46:31 2019][ 8.785933] scsi 1:0:9:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:31 2019][ 8.805766] mpt3sas_cm0: detecting: handle(0x0065), sas_address(0x5000cca2525f6c26), phy(8) [Thu Dec 12 07:46:31 2019][ 8.814120] mpt3sas_cm0: REPORT_LUNS: handle(0x0065), retries(0) [Thu Dec 12 07:46:31 2019][ 8.820296] mpt3sas_cm0: 
TEST_UNIT_READY: handle(0x0065), lun(0) [Thu Dec 12 07:46:31 2019][ 8.826957] scsi 1:0:10:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:31 2019][ 8.835261] scsi 1:0:10:0: SSP: handle(0x0065), sas_addr(0x5000cca2525f6c26), phy(8), device_name(0x5000cca2525f6c27) [Thu Dec 12 07:46:31 2019][ 8.845861] scsi 1:0:10:0: enclosure logical id(0x5000ccab04037180), slot(17) [Thu Dec 12 07:46:31 2019][ 8.853080] scsi 1:0:10:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:31 2019][ 8.859885] scsi 1:0:10:0: serial_number( 7SHPGV9W) [Thu Dec 12 07:46:31 2019][ 8.865373] scsi 1:0:10:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:31 2019][ 8.887768] mpt3sas_cm0: detecting: handle(0x0066), sas_address(0x5000cca2525ed402), phy(9) [Thu Dec 12 07:46:31 2019][ 8.896116] mpt3sas_cm0: REPORT_LUNS: handle(0x0066), retries(0) [Thu Dec 12 07:46:31 2019][ 8.902257] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0066), lun(0) [Thu Dec 12 07:46:31 2019][ 8.908900] scsi 1:0:11:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:31 2019][ 8.917203] scsi 1:0:11:0: SSP: handle(0x0066), sas_addr(0x5000cca2525ed402), phy(9), device_name(0x5000cca2525ed403) [Thu Dec 12 07:46:31 2019][ 8.927805] scsi 1:0:11:0: enclosure logical id(0x5000ccab04037180), slot(18) [Thu Dec 12 07:46:31 2019][ 8.935025] scsi 1:0:11:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:31 2019][ 8.941829] scsi 1:0:11:0: serial_number( 7SHP4R6W) [Thu Dec 12 07:46:31 2019][ 8.947316] scsi 1:0:11:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:31 2019][ 8.969763] mpt3sas_cm0: detecting: handle(0x0067), sas_address(0x5000cca2525e0406), phy(10) [Thu Dec 12 07:46:31 2019][ 8.978198] mpt3sas_cm0: REPORT_LUNS: handle(0x0067), retries(0) [Thu Dec 12 07:46:31 2019][ 8.984328] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0067), lun(0) [Thu Dec 12 07:46:31 2019][ 8.990964] scsi 1:0:12:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:31 2019][ 8.999257] scsi 1:0:12:0: SSP: handle(0x0067), sas_addr(0x5000cca2525e0406), phy(10), device_name(0x5000cca2525e0407) [Thu Dec 12 07:46:31 2019][ 9.009941] scsi 1:0:12:0: enclosure logical id(0x5000ccab04037180), slot(19) [Thu Dec 12 07:46:31 2019][ 9.017158] scsi 1:0:12:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:31 2019][ 9.023965] scsi 1:0:12:0: serial_number( 7SHNPVUW) [Thu Dec 12 07:46:31 2019][ 9.029453] scsi 1:0:12:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:31 2019][ 9.055847] mpt3sas_cm0: detecting: handle(0x0068), sas_address(0x5000cca2525ea9e6), phy(11) [Thu Dec 12 07:46:31 2019][ 9.064294] mpt3sas_cm0: REPORT_LUNS: handle(0x0068), retries(0) [Thu Dec 12 07:46:31 2019][ 9.070460] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0068), lun(0) [Thu Dec 12 07:46:31 2019][ 9.077117] scsi 1:0:13:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:31 2019][ 9.085417] scsi 1:0:13:0: SSP: handle(0x0068), sas_addr(0x5000cca2525ea9e6), phy(11), device_name(0x5000cca2525ea9e7) [Thu Dec 12 07:46:31 2019][ 9.096104] scsi 1:0:13:0: enclosure logical id(0x5000ccab04037180), slot(20) [Thu Dec 12 07:46:31 2019][ 9.103325] scsi 1:0:13:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:31 2019][ 9.110130] scsi 1:0:13:0: serial_number( 7SHP1X8W) [Thu Dec 12 07:46:31 2019][ 9.115615] scsi 1:0:13:0: qdepth(254), tagged(1), simple(0), ordered(0), 
scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:31 2019][ 9.135779] mpt3sas_cm0: detecting: handle(0x0069), sas_address(0x5000cca2525f1d3a), phy(12) [Thu Dec 12 07:46:31 2019][ 9.144222] mpt3sas_cm0: REPORT_LUNS: handle(0x0069), retries(0) [Thu Dec 12 07:46:31 2019][ 9.150370] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0069), lun(0) [Thu Dec 12 07:46:31 2019][ 9.157047] scsi 1:0:14:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:31 2019][ 9.165367] scsi 1:0:14:0: SSP: handle(0x0069), sas_addr(0x5000cca2525f1d3a), phy(12), device_name(0x5000cca2525f1d3b) [Thu Dec 12 07:46:31 2019][ 9.176056] scsi 1:0:14:0: enclosure logical id(0x5000ccab04037180), slot(21) [Thu Dec 12 07:46:31 2019][ 9.183276] scsi 1:0:14:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:31 2019][ 9.190080] scsi 1:0:14:0: serial_number( 7SHP9LBW) [Thu Dec 12 07:46:31 2019][ 9.195566] scsi 1:0:14:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:31 2019][ 9.215773] mpt3sas_cm0: detecting: handle(0x006a), sas_address(0x5000cca2525ea49a), phy(13) [Thu Dec 12 07:46:31 2019][ 9.224213] mpt3sas_cm0: REPORT_LUNS: handle(0x006a), retries(0) [Thu Dec 12 07:46:31 2019][ 9.230380] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006a), lun(0) [Thu Dec 12 07:46:31 2019][ 9.237013] scsi 1:0:15:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:31 2019][ 9.245317] scsi 1:0:15:0: SSP: handle(0x006a), sas_addr(0x5000cca2525ea49a), phy(13), device_name(0x5000cca2525ea49b) [Thu Dec 12 07:46:31 2019][ 9.256006] scsi 1:0:15:0: enclosure logical id(0x5000ccab04037180), slot(22) [Thu Dec 12 07:46:31 2019][ 9.263225] scsi 1:0:15:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:31 2019][ 9.270030] scsi 1:0:15:0: serial_number( 7SHP1KAW) [Thu Dec 12 07:46:31 2019][ 9.275518] scsi 1:0:15:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:32 2019][ 9.297771] mpt3sas_cm0: detecting: handle(0x006b), sas_address(0x5000cca2525fba06), phy(14) [Thu Dec 12 07:46:32 2019][ 9.306211] mpt3sas_cm0: REPORT_LUNS: handle(0x006b), retries(0) [Thu Dec 12 07:46:32 2019][ 9.312352] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006b), lun(0) [Thu Dec 12 07:46:32 2019][ 9.318995] scsi 1:0:16:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:32 2019][ 9.327295] scsi 1:0:16:0: SSP: handle(0x006b), sas_addr(0x5000cca2525fba06), phy(14), device_name(0x5000cca2525fba07) [Thu Dec 12 07:46:32 2019][ 9.337986] scsi 1:0:16:0: enclosure logical id(0x5000ccab04037180), slot(23) [Thu Dec 12 07:46:32 2019][ 9.345205] scsi 1:0:16:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:32 2019][ 9.352011] scsi 1:0:16:0: serial_number( 7SHPN12W) [Thu Dec 12 07:46:32 2019][ 9.357497] scsi 1:0:16:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:32 2019][ 9.385769] mpt3sas_cm0: detecting: handle(0x006c), sas_address(0x5000cca2525e121e), phy(15) [Thu Dec 12 07:46:32 2019][ 9.394211] mpt3sas_cm0: REPORT_LUNS: handle(0x006c), retries(0) [Thu Dec 12 07:46:32 2019][ 9.400381] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006c), lun(0) [Thu Dec 12 07:46:32 2019][ 9.407047] scsi 1:0:17:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:32 2019][ 9.415354] scsi 1:0:17:0: SSP: handle(0x006c), sas_addr(0x5000cca2525e121e), phy(15), device_name(0x5000cca2525e121f) [Thu Dec 12 07:46:32 2019][ 9.426039] scsi 1:0:17:0: enclosure logical id(0x5000ccab04037180), 
slot(24) [Thu Dec 12 07:46:32 2019][ 9.433257] scsi 1:0:17:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:32 2019][ 9.440063] scsi 1:0:17:0: serial_number( 7SHNRTXW) [Thu Dec 12 07:46:32 2019][ 9.445551] scsi 1:0:17:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:32 2019][ 9.465773] mpt3sas_cm0: detecting: handle(0x006d), sas_address(0x5000cca2525e98f6), phy(16) [Thu Dec 12 07:46:32 2019][ 9.474216] mpt3sas_cm0: REPORT_LUNS: handle(0x006d), retries(0) [Thu Dec 12 07:46:32 2019][ 9.480387] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006d), lun(0) [Thu Dec 12 07:46:32 2019][ 9.487052] scsi 1:0:18:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:32 2019][ 9.495350] scsi 1:0:18:0: SSP: handle(0x006d), sas_addr(0x5000cca2525e98f6), phy(16), device_name(0x5000cca2525e98f7) [Thu Dec 12 07:46:32 2019][ 9.506034] scsi 1:0:18:0: enclosure logical id(0x5000ccab04037180), slot(25) [Thu Dec 12 07:46:32 2019][ 9.513252] scsi 1:0:18:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:32 2019][ 9.520059] scsi 1:0:18:0: serial_number( 7SHP0T9W) [Thu Dec 12 07:46:32 2019][ 9.525546] scsi 1:0:18:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:32 2019][ 9.545773] mpt3sas_cm0: detecting: handle(0x006e), sas_address(0x5000cca2525f8176), phy(17) [Thu Dec 12 07:46:32 2019][ 9.554232] mpt3sas_cm0: REPORT_LUNS: handle(0x006e), retries(0) [Thu Dec 12 07:46:32 2019][ 9.560374] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006e), lun(0) [Thu Dec 12 07:46:32 2019][ 9.567030] scsi 1:0:19:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:32 2019][ 9.575349] scsi 1:0:19:0: SSP: handle(0x006e), sas_addr(0x5000cca2525f8176), phy(17), device_name(0x5000cca2525f8177) [Thu Dec 12 07:46:32 2019][ 9.586037] scsi 1:0:19:0: enclosure logical id(0x5000ccab04037180), slot(26) [Thu Dec 12 07:46:32 2019][ 9.593255] scsi 1:0:19:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:32 2019][ 9.600062] scsi 1:0:19:0: serial_number( 7SHPJ89W) [Thu Dec 12 07:46:32 2019][ 9.605549] scsi 1:0:19:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:32 2019][ 9.625773] mpt3sas_cm0: detecting: handle(0x006f), sas_address(0x5000cca2525fb01e), phy(18) [Thu Dec 12 07:46:32 2019][ 9.634212] mpt3sas_cm0: REPORT_LUNS: handle(0x006f), retries(0) [Thu Dec 12 07:46:32 2019][ 9.640361] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006f), lun(0) [Thu Dec 12 07:46:32 2019][ 9.647019] scsi 1:0:20:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:32 2019][ 9.655327] scsi 1:0:20:0: SSP: handle(0x006f), sas_addr(0x5000cca2525fb01e), phy(18), device_name(0x5000cca2525fb01f) [Thu Dec 12 07:46:32 2019][ 9.666013] scsi 1:0:20:0: enclosure logical id(0x5000ccab04037180), slot(27) [Thu Dec 12 07:46:32 2019][ 9.673232] scsi 1:0:20:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:32 2019][ 9.680038] scsi 1:0:20:0: serial_number( 7SHPMBMW) [Thu Dec 12 07:46:32 2019][ 9.685526] scsi 1:0:20:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:32 2019][ 9.707776] mpt3sas_cm0: detecting: handle(0x0070), sas_address(0x5000cca2525ed54a), phy(19) [Thu Dec 12 07:46:32 2019][ 9.716217] mpt3sas_cm0: REPORT_LUNS: handle(0x0070), retries(0) [Thu Dec 12 07:46:32 2019][ 9.722386] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0070), lun(0) [Thu Dec 12 07:46:32 2019][ 9.729038] scsi 1:0:21:0: 
Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:32 2019][ 9.737337] scsi 1:0:21:0: SSP: handle(0x0070), sas_addr(0x5000cca2525ed54a), phy(19), device_name(0x5000cca2525ed54b) [Thu Dec 12 07:46:32 2019][ 9.748026] scsi 1:0:21:0: enclosure logical id(0x5000ccab04037180), slot(28) [Thu Dec 12 07:46:32 2019][ 9.755246] scsi 1:0:21:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:32 2019][ 9.762050] scsi 1:0:21:0: serial_number( 7SHP4TVW) [Thu Dec 12 07:46:32 2019][ 9.767538] scsi 1:0:21:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:32 2019][ 9.789772] mpt3sas_cm0: detecting: handle(0x0071), sas_address(0x5000cca2525fa036), phy(20) [Thu Dec 12 07:46:32 2019][ 9.798211] mpt3sas_cm0: REPORT_LUNS: handle(0x0071), retries(0) [Thu Dec 12 07:46:32 2019][ 9.804377] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0071), lun(0) [Thu Dec 12 07:46:32 2019][ 9.811024] scsi 1:0:22:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:32 2019][ 9.819325] scsi 1:0:22:0: SSP: handle(0x0071), sas_addr(0x5000cca2525fa036), phy(20), device_name(0x5000cca2525fa037) [Thu Dec 12 07:46:32 2019][ 9.830013] scsi 1:0:22:0: enclosure logical id(0x5000ccab04037180), slot(29) [Thu Dec 12 07:46:32 2019][ 9.837232] scsi 1:0:22:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:32 2019][ 9.844038] scsi 1:0:22:0: serial_number( 7SHPL9TW) [Thu Dec 12 07:46:32 2019][ 9.849525] scsi 1:0:22:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:32 2019][ 9.871780] mpt3sas_cm0: detecting: handle(0x0072), sas_address(0x5000cca2525fb942), phy(21) [Thu Dec 12 07:46:32 2019][ 9.880219] mpt3sas_cm0: REPORT_LUNS: handle(0x0072), retries(0) [Thu Dec 12 07:46:32 2019][ 9.886357] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0072), lun(0) [Thu Dec 12 07:46:32 2019][ 9.892995] scsi 1:0:23:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:32 2019][ 9.901295] scsi 1:0:23:0: SSP: handle(0x0072), sas_addr(0x5000cca2525fb942), phy(21), device_name(0x5000cca2525fb943) [Thu Dec 12 07:46:32 2019][ 9.911985] scsi 1:0:23:0: enclosure logical id(0x5000ccab04037180), slot(30) [Thu Dec 12 07:46:32 2019][ 9.919203] scsi 1:0:23:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:32 2019][ 9.926010] scsi 1:0:23:0: serial_number( 7SHPMZHW) [Thu Dec 12 07:46:32 2019][ 9.931495] scsi 1:0:23:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:32 2019][ 9.954366] mpt3sas_cm0: detecting: handle(0x0073), sas_address(0x5000cca2525e22e6), phy(22) [Thu Dec 12 07:46:32 2019][ 9.962803] mpt3sas_cm0: REPORT_LUNS: handle(0x0073), retries(0) [Thu Dec 12 07:46:32 2019][ 9.968953] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0073), lun(0) [Thu Dec 12 07:46:32 2019][ 9.975561] scsi 1:0:24:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:32 2019][ 9.983866] scsi 1:0:24:0: SSP: handle(0x0073), sas_addr(0x5000cca2525e22e6), phy(22), device_name(0x5000cca2525e22e7) [Thu Dec 12 07:46:32 2019][ 9.994552] scsi 1:0:24:0: enclosure logical id(0x5000ccab04037180), slot(31) [Thu Dec 12 07:46:32 2019][ 10.001770] scsi 1:0:24:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:32 2019][ 10.008578] scsi 1:0:24:0: serial_number( 7SHNSXKW) [Thu Dec 12 07:46:32 2019][ 10.014066] scsi 1:0:24:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:32 2019][ 10.033776] mpt3sas_cm0: detecting: 
handle(0x0074), sas_address(0x5000cca2525fb5be), phy(23) [Thu Dec 12 07:46:32 2019][ 10.042216] mpt3sas_cm0: REPORT_LUNS: handle(0x0074), retries(0) [Thu Dec 12 07:46:32 2019][ 10.048392] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0074), lun(0) [Thu Dec 12 07:46:32 2019][ 10.055051] scsi 1:0:25:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:32 2019][ 10.063358] scsi 1:0:25:0: SSP: handle(0x0074), sas_addr(0x5000cca2525fb5be), phy(23), device_name(0x5000cca2525fb5bf) [Thu Dec 12 07:46:32 2019][ 10.074043] scsi 1:0:25:0: enclosure logical id(0x5000ccab04037180), slot(32) [Thu Dec 12 07:46:32 2019][ 10.081261] scsi 1:0:25:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:32 2019][ 10.088070] scsi 1:0:25:0: serial_number( 7SHPMS7W) [Thu Dec 12 07:46:32 2019][ 10.093555] scsi 1:0:25:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:32 2019][ 10.113778] mpt3sas_cm0: detecting: handle(0x0075), sas_address(0x5000cca2525eb77e), phy(24) [Thu Dec 12 07:46:32 2019][ 10.122218] mpt3sas_cm0: REPORT_LUNS: handle(0x0075), retries(0) [Thu Dec 12 07:46:32 2019][ 10.128388] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0075), lun(0) [Thu Dec 12 07:46:32 2019][ 10.135050] scsi 1:0:26:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:32 2019][ 10.143349] scsi 1:0:26:0: SSP: handle(0x0075), sas_addr(0x5000cca2525eb77e), phy(24), device_name(0x5000cca2525eb77f) [Thu Dec 12 07:46:32 2019][ 10.154038] scsi 1:0:26:0: enclosure logical id(0x5000ccab04037180), slot(33) [Thu Dec 12 07:46:32 2019][ 10.161256] scsi 1:0:26:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:32 2019][ 10.168064] scsi 1:0:26:0: serial_number( 7SHP2UAW) [Thu Dec 12 07:46:32 2019][ 10.173550] scsi 1:0:26:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:32 2019][ 10.193778] mpt3sas_cm0: detecting: handle(0x0076), sas_address(0x5000cca2525e113a), phy(25) [Thu Dec 12 07:46:32 2019][ 10.202212] mpt3sas_cm0: REPORT_LUNS: handle(0x0076), retries(0) [Thu Dec 12 07:46:32 2019][ 10.208367] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0076), lun(0) [Thu Dec 12 07:46:32 2019][ 10.215084] scsi 1:0:27:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:32 2019][ 10.223406] scsi 1:0:27:0: SSP: handle(0x0076), sas_addr(0x5000cca2525e113a), phy(25), device_name(0x5000cca2525e113b) [Thu Dec 12 07:46:32 2019][ 10.234093] scsi 1:0:27:0: enclosure logical id(0x5000ccab04037180), slot(34) [Thu Dec 12 07:46:32 2019][ 10.241312] scsi 1:0:27:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:32 2019][ 10.248119] scsi 1:0:27:0: serial_number( 7SHNRS2W) [Thu Dec 12 07:46:32 2019][ 10.253603] scsi 1:0:27:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:32 2019][ 10.279779] mpt3sas_cm0: detecting: handle(0x0077), sas_address(0x5000cca2526014fa), phy(26) [Thu Dec 12 07:46:33 2019][ 10.288222] mpt3sas_cm0: REPORT_LUNS: handle(0x0077), retries(0) [Thu Dec 12 07:46:33 2019][ 10.294390] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0077), lun(0) [Thu Dec 12 07:46:33 2019][ 10.301035] scsi 1:0:28:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:33 2019][ 10.309332] scsi 1:0:28:0: SSP: handle(0x0077), sas_addr(0x5000cca2526014fa), phy(26), device_name(0x5000cca2526014fb) [Thu Dec 12 07:46:33 2019][ 10.320015] scsi 1:0:28:0: enclosure logical id(0x5000ccab04037180), slot(35) [Thu Dec 12 07:46:33 2019][ 10.327233] scsi 
1:0:28:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:33 2019][ 10.334040] scsi 1:0:28:0: serial_number( 7SHPV2VW) [Thu Dec 12 07:46:33 2019][ 10.339528] scsi 1:0:28:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:33 2019][ 10.361778] mpt3sas_cm0: detecting: handle(0x0078), sas_address(0x5000cca252598786), phy(27) [Thu Dec 12 07:46:33 2019][ 10.370220] mpt3sas_cm0: REPORT_LUNS: handle(0x0078), retries(0) [Thu Dec 12 07:46:33 2019][ 10.376390] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0078), lun(0) [Thu Dec 12 07:46:33 2019][ 10.383035] scsi 1:0:29:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:33 2019][ 10.391339] scsi 1:0:29:0: SSP: handle(0x0078), sas_addr(0x5000cca252598786), phy(27), device_name(0x5000cca252598787) [Thu Dec 12 07:46:33 2019][ 10.402027] scsi 1:0:29:0: enclosure logical id(0x5000ccab04037180), slot(36) [Thu Dec 12 07:46:33 2019][ 10.409246] scsi 1:0:29:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:33 2019][ 10.416053] scsi 1:0:29:0: serial_number( 7SHL7BRW) [Thu Dec 12 07:46:33 2019][ 10.421539] scsi 1:0:29:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:33 2019][ 10.443778] mpt3sas_cm0: detecting: handle(0x0079), sas_address(0x5000cca2525f5366), phy(28) [Thu Dec 12 07:46:33 2019][ 10.452213] mpt3sas_cm0: REPORT_LUNS: handle(0x0079), retries(0) [Thu Dec 12 07:46:33 2019][ 10.458377] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0079), lun(0) [Thu Dec 12 07:46:33 2019][ 10.465020] scsi 1:0:30:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:33 2019][ 10.473318] scsi 1:0:30:0: SSP: handle(0x0079), sas_addr(0x5000cca2525f5366), phy(28), device_name(0x5000cca2525f5367) [Thu Dec 12 07:46:33 2019][ 10.484007] scsi 1:0:30:0: enclosure logical id(0x5000ccab04037180), slot(37) [Thu Dec 12 07:46:33 2019][ 10.491227] scsi 1:0:30:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:33 2019][ 10.498033] scsi 1:0:30:0: serial_number( 7SHPE66W) [Thu Dec 12 07:46:33 2019][ 10.503518] scsi 1:0:30:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:33 2019][ 10.525780] mpt3sas_cm0: detecting: handle(0x007a), sas_address(0x5000cca2525e263e), phy(29) [Thu Dec 12 07:46:33 2019][ 10.534220] mpt3sas_cm0: REPORT_LUNS: handle(0x007a), retries(0) [Thu Dec 12 07:46:33 2019][ 10.540360] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007a), lun(0) [Thu Dec 12 07:46:33 2019][ 10.546984] scsi 1:0:31:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:33 2019][ 10.555299] scsi 1:0:31:0: SSP: handle(0x007a), sas_addr(0x5000cca2525e263e), phy(29), device_name(0x5000cca2525e263f) [Thu Dec 12 07:46:33 2019][ 10.565985] scsi 1:0:31:0: enclosure logical id(0x5000ccab04037180), slot(38) [Thu Dec 12 07:46:33 2019][ 10.573204] scsi 1:0:31:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:33 2019][ 10.580011] scsi 1:0:31:0: serial_number( 7SHNT4GW) [Thu Dec 12 07:46:33 2019][ 10.585497] scsi 1:0:31:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:33 2019][ 10.607779] mpt3sas_cm0: detecting: handle(0x007b), sas_address(0x5000cca2525f6082), phy(30) [Thu Dec 12 07:46:33 2019][ 10.616216] mpt3sas_cm0: REPORT_LUNS: handle(0x007b), retries(0) [Thu Dec 12 07:46:33 2019][ 10.622364] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007b), lun(0) [Thu Dec 12 07:46:33 2019][ 10.628980] scsi 1:0:32:0: Direct-Access HGST 
HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:33 2019][ 10.637295] scsi 1:0:32:0: SSP: handle(0x007b), sas_addr(0x5000cca2525f6082), phy(30), device_name(0x5000cca2525f6083) [Thu Dec 12 07:46:33 2019][ 10.647981] scsi 1:0:32:0: enclosure logical id(0x5000ccab04037180), slot(39) [Thu Dec 12 07:46:33 2019][ 10.655201] scsi 1:0:32:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:33 2019][ 10.662008] scsi 1:0:32:0: serial_number( 7SHPG28W) [Thu Dec 12 07:46:33 2019][ 10.667493] scsi 1:0:32:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:33 2019][ 10.689781] mpt3sas_cm0: detecting: handle(0x007c), sas_address(0x5000cca2525ec83e), phy(31) [Thu Dec 12 07:46:33 2019][ 10.698219] mpt3sas_cm0: REPORT_LUNS: handle(0x007c), retries(0) [Thu Dec 12 07:46:33 2019][ 10.704389] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007c), lun(0) [Thu Dec 12 07:46:33 2019][ 10.711017] scsi 1:0:33:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:33 2019][ 10.719319] scsi 1:0:33:0: SSP: handle(0x007c), sas_addr(0x5000cca2525ec83e), phy(31), device_name(0x5000cca2525ec83f) [Thu Dec 12 07:46:33 2019][ 10.730004] scsi 1:0:33:0: enclosure logical id(0x5000ccab04037180), slot(40) [Thu Dec 12 07:46:33 2019][ 10.737222] scsi 1:0:33:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:33 2019][ 10.744027] scsi 1:0:33:0: serial_number( 7SHP3XXW) [Thu Dec 12 07:46:33 2019][ 10.749517] scsi 1:0:33:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:33 2019][ 10.771786] mpt3sas_cm0: detecting: handle(0x007d), sas_address(0x5000cca2525ec01a), phy(32) [Thu Dec 12 07:46:33 2019][ 10.780226] mpt3sas_cm0: REPORT_LUNS: handle(0x007d), retries(0) [Thu Dec 12 07:46:33 2019][ 10.786369] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007d), lun(0) [Thu Dec 12 07:46:33 2019][ 10.793019] scsi 1:0:34:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:33 2019][ 10.801327] scsi 1:0:34:0: SSP: handle(0x007d), sas_addr(0x5000cca2525ec01a), phy(32), device_name(0x5000cca2525ec01b) [Thu Dec 12 07:46:33 2019][ 10.812016] scsi 1:0:34:0: enclosure logical id(0x5000ccab04037180), slot(41) [Thu Dec 12 07:46:33 2019][ 10.819235] scsi 1:0:34:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:33 2019][ 10.826040] scsi 1:0:34:0: serial_number( 7SHP3D3W) [Thu Dec 12 07:46:33 2019][ 10.831528] scsi 1:0:34:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:33 2019][ 10.853786] mpt3sas_cm0: detecting: handle(0x007e), sas_address(0x5000cca2525ec55a), phy(33) [Thu Dec 12 07:46:33 2019][ 10.862221] mpt3sas_cm0: REPORT_LUNS: handle(0x007e), retries(0) [Thu Dec 12 07:46:33 2019][ 10.868373] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007e), lun(0) [Thu Dec 12 07:46:33 2019][ 10.875030] scsi 1:0:35:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:33 2019][ 10.883339] scsi 1:0:35:0: SSP: handle(0x007e), sas_addr(0x5000cca2525ec55a), phy(33), device_name(0x5000cca2525ec55b) [Thu Dec 12 07:46:33 2019][ 10.894021] scsi 1:0:35:0: enclosure logical id(0x5000ccab04037180), slot(42) [Thu Dec 12 07:46:33 2019][ 10.901240] scsi 1:0:35:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:33 2019][ 10.908048] scsi 1:0:35:0: serial_number( 7SHP3RYW) [Thu Dec 12 07:46:33 2019][ 10.913535] scsi 1:0:35:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:33 2019][ 10.935780] mpt3sas_cm0: 
detecting: handle(0x007f), sas_address(0x5000cca2525fd4a2), phy(34) [Thu Dec 12 07:46:33 2019][ 10.944215] mpt3sas_cm0: REPORT_LUNS: handle(0x007f), retries(0) [Thu Dec 12 07:46:33 2019][ 10.950348] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007f), lun(0) [Thu Dec 12 07:46:33 2019][ 10.956941] scsi 1:0:36:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:33 2019][ 10.965241] scsi 1:0:36:0: SSP: handle(0x007f), sas_addr(0x5000cca2525fd4a2), phy(34), device_name(0x5000cca2525fd4a3) [Thu Dec 12 07:46:33 2019][ 10.975922] scsi 1:0:36:0: enclosure logical id(0x5000ccab04037180), slot(43) [Thu Dec 12 07:46:33 2019][ 10.983140] scsi 1:0:36:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:33 2019][ 10.989949] scsi 1:0:36:0: serial_number( 7SHPPU0W) [Thu Dec 12 07:46:33 2019][ 10.995435] scsi 1:0:36:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:33 2019][ 11.017783] mpt3sas_cm0: detecting: handle(0x0080), sas_address(0x5000cca2525eb5f6), phy(35) [Thu Dec 12 07:46:33 2019][ 11.026223] mpt3sas_cm0: REPORT_LUNS: handle(0x0080), retries(0) [Thu Dec 12 07:46:33 2019][ 11.032386] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0080), lun(0) [Thu Dec 12 07:46:33 2019][ 11.039004] scsi 1:0:37:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:33 2019][ 11.047312] scsi 1:0:37:0: SSP: handle(0x0080), sas_addr(0x5000cca2525eb5f6), phy(35), device_name(0x5000cca2525eb5f7) [Thu Dec 12 07:46:33 2019][ 11.057997] scsi 1:0:37:0: enclosure logical id(0x5000ccab04037180), slot(44) [Thu Dec 12 07:46:33 2019][ 11.065216] scsi 1:0:37:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:33 2019][ 11.072024] scsi 1:0:37:0: serial_number( 7SHP2R5W) [Thu Dec 12 07:46:33 2019][ 11.077509] scsi 1:0:37:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:33 2019][ 11.099789] mpt3sas_cm0: detecting: handle(0x0081), sas_address(0x5000cca2525ebeb2), phy(36) [Thu Dec 12 07:46:33 2019][ 11.108227] mpt3sas_cm0: REPORT_LUNS: handle(0x0081), retries(0) [Thu Dec 12 07:46:33 2019][ 11.114367] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0081), lun(0) [Thu Dec 12 07:46:33 2019][ 11.120995] scsi 1:0:38:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:33 2019][ 11.129297] scsi 1:0:38:0: SSP: handle(0x0081), sas_addr(0x5000cca2525ebeb2), phy(36), device_name(0x5000cca2525ebeb3) [Thu Dec 12 07:46:33 2019][ 11.139984] scsi 1:0:38:0: enclosure logical id(0x5000ccab04037180), slot(45) [Thu Dec 12 07:46:33 2019][ 11.147203] scsi 1:0:38:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:33 2019][ 11.154010] scsi 1:0:38:0: serial_number( 7SHP396W) [Thu Dec 12 07:46:33 2019][ 11.159496] scsi 1:0:38:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:33 2019][ 11.181788] mpt3sas_cm0: detecting: handle(0x0082), sas_address(0x5000cca2525f291a), phy(37) [Thu Dec 12 07:46:33 2019][ 11.190231] mpt3sas_cm0: REPORT_LUNS: handle(0x0082), retries(0) [Thu Dec 12 07:46:33 2019][ 11.196396] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0082), lun(0) [Thu Dec 12 07:46:33 2019][ 11.202977] scsi 1:0:39:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:33 2019][ 11.211272] scsi 1:0:39:0: SSP: handle(0x0082), sas_addr(0x5000cca2525f291a), phy(37), device_name(0x5000cca2525f291b) [Thu Dec 12 07:46:33 2019][ 11.221954] scsi 1:0:39:0: enclosure logical id(0x5000ccab04037180), slot(46) [Thu Dec 12 07:46:33 2019][ 
11.229172] scsi 1:0:39:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:33 2019][ 11.235979] scsi 1:0:39:0: serial_number( 7SHPABWW) [Thu Dec 12 07:46:33 2019][ 11.241467] scsi 1:0:39:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:33 2019][ 11.263793] mpt3sas_cm0: detecting: handle(0x0083), sas_address(0x5000cca252602c0e), phy(38) [Thu Dec 12 07:46:33 2019][ 11.272235] mpt3sas_cm0: REPORT_LUNS: handle(0x0083), retries(0) [Thu Dec 12 07:46:33 2019][ 11.278368] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0083), lun(0) [Thu Dec 12 07:46:33 2019][ 11.284987] scsi 1:0:40:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:33 2019][ 11.293291] scsi 1:0:40:0: SSP: handle(0x0083), sas_addr(0x5000cca252602c0e), phy(38), device_name(0x5000cca252602c0f) [Thu Dec 12 07:46:34 2019][ 11.303977] scsi 1:0:40:0: enclosure logical id(0x5000ccab04037180), slot(47) [Thu Dec 12 07:46:34 2019][ 11.311195] scsi 1:0:40:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:34 2019][ 11.318002] scsi 1:0:40:0: serial_number( 7SHPWMHW) [Thu Dec 12 07:46:34 2019][ 11.323490] scsi 1:0:40:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:34 2019][ 11.345790] mpt3sas_cm0: detecting: handle(0x0084), sas_address(0x5000cca2525e7cfe), phy(39) [Thu Dec 12 07:46:34 2019][ 11.354232] mpt3sas_cm0: REPORT_LUNS: handle(0x0084), retries(0) [Thu Dec 12 07:46:34 2019][ 11.360404] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0084), lun(0) [Thu Dec 12 07:46:34 2019][ 11.367054] scsi 1:0:41:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:34 2019][ 11.375357] scsi 1:0:41:0: SSP: handle(0x0084), sas_addr(0x5000cca2525e7cfe), phy(39), device_name(0x5000cca2525e7cff) [Thu Dec 12 07:46:34 2019][ 11.386043] scsi 1:0:41:0: enclosure logical id(0x5000ccab04037180), slot(48) [Thu Dec 12 07:46:34 2019][ 11.393262] scsi 1:0:41:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:34 2019][ 11.400066] scsi 1:0:41:0: serial_number( 7SHNYXKW) [Thu Dec 12 07:46:34 2019][ 11.405556] scsi 1:0:41:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:34 2019][ 11.425788] mpt3sas_cm0: detecting: handle(0x0085), sas_address(0x5000cca2525f6a32), phy(40) [Thu Dec 12 07:46:34 2019][ 11.434226] mpt3sas_cm0: REPORT_LUNS: handle(0x0085), retries(0) [Thu Dec 12 07:46:34 2019][ 11.440391] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0085), lun(0) [Thu Dec 12 07:46:34 2019][ 11.447004] scsi 1:0:42:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:34 2019][ 11.455309] scsi 1:0:42:0: SSP: handle(0x0085), sas_addr(0x5000cca2525f6a32), phy(40), device_name(0x5000cca2525f6a33) [Thu Dec 12 07:46:34 2019][ 11.465992] scsi 1:0:42:0: enclosure logical id(0x5000ccab04037180), slot(49) [Thu Dec 12 07:46:34 2019][ 11.473211] scsi 1:0:42:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:34 2019][ 11.480018] scsi 1:0:42:0: serial_number( 7SHPGR8W) [Thu Dec 12 07:46:34 2019][ 11.485505] scsi 1:0:42:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:34 2019][ 11.507789] mpt3sas_cm0: detecting: handle(0x0086), sas_address(0x5000cca2525f7f26), phy(41) [Thu Dec 12 07:46:34 2019][ 11.516234] mpt3sas_cm0: REPORT_LUNS: handle(0x0086), retries(0) [Thu Dec 12 07:46:34 2019][ 11.522371] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0086), lun(0) [Thu Dec 12 07:46:34 2019][ 11.528987] scsi 1:0:43:0: 
Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:34 2019][ 11.537290] scsi 1:0:43:0: SSP: handle(0x0086), sas_addr(0x5000cca2525f7f26), phy(41), device_name(0x5000cca2525f7f27) [Thu Dec 12 07:46:34 2019][ 11.547980] scsi 1:0:43:0: enclosure logical id(0x5000ccab04037180), slot(50) [Thu Dec 12 07:46:34 2019][ 11.555200] scsi 1:0:43:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:34 2019][ 11.562005] scsi 1:0:43:0: serial_number( 7SHPJ3JW) [Thu Dec 12 07:46:34 2019][ 11.567493] scsi 1:0:43:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:34 2019][ 11.596796] mpt3sas_cm0: detecting: handle(0x0087), sas_address(0x5000cca2525eb4b2), phy(42) [Thu Dec 12 07:46:34 2019][ 11.605237] mpt3sas_cm0: REPORT_LUNS: handle(0x0087), retries(0) [Thu Dec 12 07:46:34 2019][ 11.611366] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0087), lun(0) [Thu Dec 12 07:46:34 2019][ 11.617985] scsi 1:0:44:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:34 2019][ 11.626281] scsi 1:0:44:0: SSP: handle(0x0087), sas_addr(0x5000cca2525eb4b2), phy(42), device_name(0x5000cca2525eb4b3) [Thu Dec 12 07:46:34 2019][ 11.636970] scsi 1:0:44:0: enclosure logical id(0x5000ccab04037180), slot(51) [Thu Dec 12 07:46:34 2019][ 11.644190] scsi 1:0:44:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:34 2019][ 11.650994] scsi 1:0:44:0: serial_number( 7SHP2MKW) [Thu Dec 12 07:46:34 2019][ 11.656481] scsi 1:0:44:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:34 2019][ 11.678806] mpt3sas_cm0: detecting: handle(0x0088), sas_address(0x5000cca2525e1f9e), phy(43) [Thu Dec 12 07:46:34 2019][ 11.687244] mpt3sas_cm0: REPORT_LUNS: handle(0x0088), retries(0) [Thu Dec 12 07:46:34 2019][ 11.693384] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0088), lun(0) [Thu Dec 12 07:46:34 2019][ 11.699994] scsi 1:0:45:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:34 2019][ 11.708293] scsi 1:0:45:0: SSP: handle(0x0088), sas_addr(0x5000cca2525e1f9e), phy(43), device_name(0x5000cca2525e1f9f) [Thu Dec 12 07:46:34 2019][ 11.718975] scsi 1:0:45:0: enclosure logical id(0x5000ccab04037180), slot(52) [Thu Dec 12 07:46:34 2019][ 11.726194] scsi 1:0:45:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:34 2019][ 11.732999] scsi 1:0:45:0: serial_number( 7SHNSPTW) [Thu Dec 12 07:46:34 2019][ 11.738488] scsi 1:0:45:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:34 2019][ 11.768383] mpt3sas_cm0: detecting: handle(0x0089), sas_address(0x5000cca2525e52fe), phy(44) [Thu Dec 12 07:46:34 2019][ 11.776824] mpt3sas_cm0: REPORT_LUNS: handle(0x0089), retries(0) [Thu Dec 12 07:46:34 2019][ 11.782968] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0089), lun(0) [Thu Dec 12 07:46:34 2019][ 11.789586] scsi 1:0:46:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:34 2019][ 11.797910] scsi 1:0:46:0: SSP: handle(0x0089), sas_addr(0x5000cca2525e52fe), phy(44), device_name(0x5000cca2525e52ff) [Thu Dec 12 07:46:34 2019][ 11.808598] scsi 1:0:46:0: enclosure logical id(0x5000ccab04037180), slot(53) [Thu Dec 12 07:46:34 2019][ 11.815816] scsi 1:0:46:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:34 2019][ 11.822626] scsi 1:0:46:0: serial_number( 7SHNW3VW) [Thu Dec 12 07:46:34 2019][ 11.828119] scsi 1:0:46:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:34 2019][ 
11.850791] mpt3sas_cm0: detecting: handle(0x008a), sas_address(0x5000cca2525f4e72), phy(45) [Thu Dec 12 07:46:34 2019][ 11.859226] mpt3sas_cm0: REPORT_LUNS: handle(0x008a), retries(0) [Thu Dec 12 07:46:34 2019][ 11.865369] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008a), lun(0) [Thu Dec 12 07:46:34 2019][ 11.879536] scsi 1:0:47:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:34 2019][ 11.887817] scsi 1:0:47:0: SSP: handle(0x008a), sas_addr(0x5000cca2525f4e72), phy(45), device_name(0x5000cca2525f4e73) [Thu Dec 12 07:46:34 2019][ 11.898506] scsi 1:0:47:0: enclosure logical id(0x5000ccab04037180), slot(54) [Thu Dec 12 07:46:34 2019][ 11.905726] scsi 1:0:47:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:34 2019][ 11.912516] scsi 1:0:47:0: serial_number( 7SHPDVZW) [Thu Dec 12 07:46:34 2019][ 11.918000] scsi 1:0:47:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:34 2019][ 11.937792] mpt3sas_cm0: detecting: handle(0x008b), sas_address(0x5000cca2525fd49a), phy(46) [Thu Dec 12 07:46:34 2019][ 11.946231] mpt3sas_cm0: REPORT_LUNS: handle(0x008b), retries(0) [Thu Dec 12 07:46:34 2019][ 11.952371] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008b), lun(0) [Thu Dec 12 07:46:34 2019][ 11.958975] scsi 1:0:48:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:34 2019][ 11.967275] scsi 1:0:48:0: SSP: handle(0x008b), sas_addr(0x5000cca2525fd49a), phy(46), device_name(0x5000cca2525fd49b) [Thu Dec 12 07:46:34 2019][ 11.977963] scsi 1:0:48:0: enclosure logical id(0x5000ccab04037180), slot(55) [Thu Dec 12 07:46:34 2019][ 11.985183] scsi 1:0:48:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:34 2019][ 11.991989] scsi 1:0:48:0: serial_number( 7SHPPTYW) [Thu Dec 12 07:46:34 2019][ 11.997475] scsi 1:0:48:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:34 2019][ 12.019792] mpt3sas_cm0: detecting: handle(0x008c), sas_address(0x5000cca2525e787a), phy(47) [Thu Dec 12 07:46:34 2019][ 12.028228] mpt3sas_cm0: REPORT_LUNS: handle(0x008c), retries(0) [Thu Dec 12 07:46:34 2019][ 12.034398] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008c), lun(0) [Thu Dec 12 07:46:34 2019][ 12.041025] scsi 1:0:49:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:34 2019][ 12.049327] scsi 1:0:49:0: SSP: handle(0x008c), sas_addr(0x5000cca2525e787a), phy(47), device_name(0x5000cca2525e787b) [Thu Dec 12 07:46:34 2019][ 12.060011] scsi 1:0:49:0: enclosure logical id(0x5000ccab04037180), slot(56) [Thu Dec 12 07:46:34 2019][ 12.067230] scsi 1:0:49:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:34 2019][ 12.074039] scsi 1:0:49:0: serial_number( 7SHNYM7W) [Thu Dec 12 07:46:34 2019][ 12.079523] scsi 1:0:49:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:34 2019][ 12.101796] mpt3sas_cm0: detecting: handle(0x008d), sas_address(0x5000cca2525ca19a), phy(48) [Thu Dec 12 07:46:34 2019][ 12.110234] mpt3sas_cm0: REPORT_LUNS: handle(0x008d), retries(0) [Thu Dec 12 07:46:34 2019][ 12.116408] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008d), lun(0) [Thu Dec 12 07:46:34 2019][ 12.123038] scsi 1:0:50:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:34 2019][ 12.131339] scsi 1:0:50:0: SSP: handle(0x008d), sas_addr(0x5000cca2525ca19a), phy(48), device_name(0x5000cca2525ca19b) [Thu Dec 12 07:46:34 2019][ 12.142024] scsi 1:0:50:0: enclosure logical id(0x5000ccab04037180), slot(57) [Thu Dec 12 
07:46:34 2019][ 12.149245] scsi 1:0:50:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:34 2019][ 12.156049] scsi 1:0:50:0: serial_number( 7SHMY83W) [Thu Dec 12 07:46:34 2019][ 12.161539] scsi 1:0:50:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:34 2019][ 12.183796] mpt3sas_cm0: detecting: handle(0x008e), sas_address(0x5000cca2525ffb8a), phy(49) [Thu Dec 12 07:46:34 2019][ 12.192237] mpt3sas_cm0: REPORT_LUNS: handle(0x008e), retries(0) [Thu Dec 12 07:46:34 2019][ 12.198377] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008e), lun(0) [Thu Dec 12 07:46:34 2019][ 12.204993] scsi 1:0:51:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:34 2019][ 12.213300] scsi 1:0:51:0: SSP: handle(0x008e), sas_addr(0x5000cca2525ffb8a), phy(49), device_name(0x5000cca2525ffb8b) [Thu Dec 12 07:46:34 2019][ 12.223986] scsi 1:0:51:0: enclosure logical id(0x5000ccab04037180), slot(58) [Thu Dec 12 07:46:34 2019][ 12.231205] scsi 1:0:51:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:34 2019][ 12.238011] scsi 1:0:51:0: serial_number( 7SHPTDAW) [Thu Dec 12 07:46:34 2019][ 12.243497] scsi 1:0:51:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:34 2019][ 12.265795] mpt3sas_cm0: detecting: handle(0x008f), sas_address(0x5000cca2525f266a), phy(50) [Thu Dec 12 07:46:34 2019][ 12.274234] mpt3sas_cm0: REPORT_LUNS: handle(0x008f), retries(0) [Thu Dec 12 07:46:34 2019][ 12.280373] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008f), lun(0) [Thu Dec 12 07:46:35 2019][ 12.286989] scsi 1:0:52:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:35 2019][ 12.295294] scsi 1:0:52:0: SSP: handle(0x008f), sas_addr(0x5000cca2525f266a), phy(50), device_name(0x5000cca2525f266b) [Thu Dec 12 07:46:35 2019][ 12.305981] scsi 1:0:52:0: enclosure logical id(0x5000ccab04037180), slot(59) [Thu Dec 12 07:46:35 2019][ 12.313202] scsi 1:0:52:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:35 2019][ 12.320008] scsi 1:0:52:0: serial_number( 7SHPA6AW) [Thu Dec 12 07:46:35 2019][ 12.325496] scsi 1:0:52:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:35 2019][ 12.350286] mpt3sas_cm0: expander_add: handle(0x005b), parent(0x0058), sas_addr(0x5000ccab040371fb), phys(68) [Thu Dec 12 07:46:35 2019][ 12.369844] mpt3sas_cm0: detecting: handle(0x0090), sas_address(0x5000cca2525eacc2), phy(42) [Thu Dec 12 07:46:35 2019][ 12.378301] mpt3sas_cm0: REPORT_LUNS: handle(0x0090), retries(0) [Thu Dec 12 07:46:35 2019][ 12.384452] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0090), lun(0) [Thu Dec 12 07:46:35 2019][ 12.391075] scsi 1:0:53:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:35 2019][ 12.399377] scsi 1:0:53:0: SSP: handle(0x0090), sas_addr(0x5000cca2525eacc2), phy(42), device_name(0x5000cca2525eacc3) [Thu Dec 12 07:46:35 2019][ 12.410061] scsi 1:0:53:0: enclosure logical id(0x5000ccab04037180), slot(1) [Thu Dec 12 07:46:35 2019][ 12.417193] scsi 1:0:53:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:35 2019][ 12.424001] scsi 1:0:53:0: serial_number( 7SHP235W) [Thu Dec 12 07:46:35 2019][ 12.429487] scsi 1:0:53:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:35 2019][ 12.451800] mpt3sas_cm0: detecting: handle(0x0091), sas_address(0x5000cca2525f8152), phy(43) [Thu Dec 12 07:46:35 2019][ 12.460238] mpt3sas_cm0: REPORT_LUNS: handle(0x0091), retries(0) [Thu 
Dec 12 07:46:35 2019][ 12.466372] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0091), lun(0) [Thu Dec 12 07:46:35 2019][ 12.472983] scsi 1:0:54:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:35 2019][ 12.481280] scsi 1:0:54:0: SSP: handle(0x0091), sas_addr(0x5000cca2525f8152), phy(43), device_name(0x5000cca2525f8153) [Thu Dec 12 07:46:35 2019][ 12.491962] scsi 1:0:54:0: enclosure logical id(0x5000ccab04037180), slot(3) [Thu Dec 12 07:46:35 2019][ 12.499096] scsi 1:0:54:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:35 2019][ 12.505902] scsi 1:0:54:0: serial_number( 7SHPJ80W) [Thu Dec 12 07:46:35 2019][ 12.511388] scsi 1:0:54:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:35 2019][ 12.534392] mpt3sas_cm0: detecting: handle(0x0092), sas_address(0x5000cca2525ef83a), phy(44) [Thu Dec 12 07:46:35 2019][ 12.542836] mpt3sas_cm0: REPORT_LUNS: handle(0x0092), retries(0) [Thu Dec 12 07:46:35 2019][ 12.549012] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0092), lun(0) [Thu Dec 12 07:46:35 2019][ 12.555644] scsi 1:0:55:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:35 2019][ 12.563942] scsi 1:0:55:0: SSP: handle(0x0092), sas_addr(0x5000cca2525ef83a), phy(44), device_name(0x5000cca2525ef83b) [Thu Dec 12 07:46:35 2019][ 12.574626] scsi 1:0:55:0: enclosure logical id(0x5000ccab04037180), slot(4) [Thu Dec 12 07:46:35 2019][ 12.581757] scsi 1:0:55:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:35 2019][ 12.588568] scsi 1:0:55:0: serial_number( 7SHP73ZW) [Thu Dec 12 07:46:35 2019][ 12.594060] scsi 1:0:55:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:35 2019][ 12.613802] mpt3sas_cm0: detecting: handle(0x0093), sas_address(0x5000cca2525e72aa), phy(45) [Thu Dec 12 07:46:35 2019][ 12.622236] mpt3sas_cm0: REPORT_LUNS: handle(0x0093), retries(0) [Thu Dec 12 07:46:35 2019][ 12.628401] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0093), lun(0) [Thu Dec 12 07:46:35 2019][ 12.635002] scsi 1:0:56:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:35 2019][ 12.643298] scsi 1:0:56:0: SSP: handle(0x0093), sas_addr(0x5000cca2525e72aa), phy(45), device_name(0x5000cca2525e72ab) [Thu Dec 12 07:46:35 2019][ 12.653987] scsi 1:0:56:0: enclosure logical id(0x5000ccab04037180), slot(5) [Thu Dec 12 07:46:35 2019][ 12.661120] scsi 1:0:56:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:35 2019][ 12.667926] scsi 1:0:56:0: serial_number( 7SHNY77W) [Thu Dec 12 07:46:35 2019][ 12.673411] scsi 1:0:56:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:35 2019][ 12.696796] mpt3sas_cm0: detecting: handle(0x0094), sas_address(0x5000cca2525d3c8a), phy(46) [Thu Dec 12 07:46:35 2019][ 12.705231] mpt3sas_cm0: REPORT_LUNS: handle(0x0094), retries(0) [Thu Dec 12 07:46:35 2019][ 12.711373] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0094), lun(0) [Thu Dec 12 07:46:35 2019][ 12.717978] scsi 1:0:57:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:35 2019][ 12.726275] scsi 1:0:57:0: SSP: handle(0x0094), sas_addr(0x5000cca2525d3c8a), phy(46), device_name(0x5000cca2525d3c8b) [Thu Dec 12 07:46:35 2019][ 12.736962] scsi 1:0:57:0: enclosure logical id(0x5000ccab04037180), slot(6) [Thu Dec 12 07:46:35 2019][ 12.744094] scsi 1:0:57:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:35 2019][ 12.750901] scsi 1:0:57:0: serial_number( 7SHN8KZW) [Thu Dec 12 07:46:35 
2019][ 12.756389] scsi 1:0:57:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:35 2019][ 12.776797] mpt3sas_cm0: detecting: handle(0x0095), sas_address(0x5000cca2525fae0e), phy(47) [Thu Dec 12 07:46:35 2019][ 12.785234] mpt3sas_cm0: REPORT_LUNS: handle(0x0095), retries(0) [Thu Dec 12 07:46:35 2019][ 12.791399] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0095), lun(0) [Thu Dec 12 07:46:35 2019][ 12.798039] scsi 1:0:58:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:35 2019][ 12.806372] scsi 1:0:58:0: SSP: handle(0x0095), sas_addr(0x5000cca2525fae0e), phy(47), device_name(0x5000cca2525fae0f) [Thu Dec 12 07:46:35 2019][ 12.817061] scsi 1:0:58:0: enclosure logical id(0x5000ccab04037180), slot(7) [Thu Dec 12 07:46:35 2019][ 12.824193] scsi 1:0:58:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:35 2019][ 12.830998] scsi 1:0:58:0: serial_number( 7SHPM7BW) [Thu Dec 12 07:46:35 2019][ 12.836486] scsi 1:0:58:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:35 2019][ 12.859797] mpt3sas_cm0: detecting: handle(0x0096), sas_address(0x5000cca2525efdae), phy(48) [Thu Dec 12 07:46:35 2019][ 12.868233] mpt3sas_cm0: REPORT_LUNS: handle(0x0096), retries(0) [Thu Dec 12 07:46:35 2019][ 12.874361] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0096), lun(0) [Thu Dec 12 07:46:35 2019][ 12.880985] scsi 1:0:59:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:35 2019][ 12.889289] scsi 1:0:59:0: SSP: handle(0x0096), sas_addr(0x5000cca2525efdae), phy(48), device_name(0x5000cca2525efdaf) [Thu Dec 12 07:46:35 2019][ 12.899976] scsi 1:0:59:0: enclosure logical id(0x5000ccab04037180), slot(8) [Thu Dec 12 07:46:35 2019][ 12.907108] scsi 1:0:59:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:35 2019][ 12.913912] scsi 1:0:59:0: serial_number( 7SHP7H7W) [Thu Dec 12 07:46:35 2019][ 12.919400] scsi 1:0:59:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:35 2019][ 12.939800] mpt3sas_cm0: detecting: handle(0x0097), sas_address(0x5000cca2525fa302), phy(49) [Thu Dec 12 07:46:35 2019][ 12.948235] mpt3sas_cm0: REPORT_LUNS: handle(0x0097), retries(0) [Thu Dec 12 07:46:35 2019][ 12.954399] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0097), lun(0) [Thu Dec 12 07:46:35 2019][ 12.961009] scsi 1:0:60:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:35 2019][ 12.969305] scsi 1:0:60:0: SSP: handle(0x0097), sas_addr(0x5000cca2525fa302), phy(49), device_name(0x5000cca2525fa303) [Thu Dec 12 07:46:35 2019][ 12.979986] scsi 1:0:60:0: enclosure logical id(0x5000ccab04037180), slot(9) [Thu Dec 12 07:46:35 2019][ 12.987119] scsi 1:0:60:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:35 2019][ 12.993925] scsi 1:0:60:0: serial_number( 7SHPLHKW) [Thu Dec 12 07:46:35 2019][ 12.999411] scsi 1:0:60:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:35 2019][ 13.022799] mpt3sas_cm0: detecting: handle(0x0098), sas_address(0x5000cca2525fb4be), phy(50) [Thu Dec 12 07:46:35 2019][ 13.031239] mpt3sas_cm0: REPORT_LUNS: handle(0x0098), retries(0) [Thu Dec 12 07:46:35 2019][ 13.037380] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0098), lun(0) [Thu Dec 12 07:46:35 2019][ 13.043972] scsi 1:0:61:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:35 2019][ 13.052275] scsi 1:0:61:0: SSP: handle(0x0098), sas_addr(0x5000cca2525fb4be), phy(50), 
device_name(0x5000cca2525fb4bf) [Thu Dec 12 07:46:35 2019][ 13.062963] scsi 1:0:61:0: enclosure logical id(0x5000ccab04037180), slot(10) [Thu Dec 12 07:46:35 2019][ 13.070182] scsi 1:0:61:0: enclosure level(0x0000), connector name( C3 ) [Thu Dec 12 07:46:35 2019][ 13.076993] scsi 1:0:61:0: serial_number( 7SHPMP5W) [Thu Dec 12 07:46:35 2019][ 13.082484] scsi 1:0:61:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:35 2019][ 13.105331] mpt3sas_cm0: expander_add: handle(0x00da), parent(0x0002), sas_addr(0x5000ccab040371bd), phys(49) [Thu Dec 12 07:46:35 2019][ 13.125847] mpt3sas_cm0: detecting: handle(0x00de), sas_address(0x5000ccab040371bc), phy(48) [Thu Dec 12 07:46:35 2019][ 13.134284] mpt3sas_cm0: REPORT_LUNS: handle(0x00de), retries(0) [Thu Dec 12 07:46:35 2019][ 13.140782] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00de), lun(0) [Thu Dec 12 07:46:35 2019][ 13.148064] scsi 1:0:62:0: Enclosure HGST H4060-J 2033 PQ: 0 ANSI: 6 [Thu Dec 12 07:46:35 2019][ 13.156597] scsi 1:0:62:0: set ignore_delay_remove for handle(0x00de) [Thu Dec 12 07:46:35 2019][ 13.163038] scsi 1:0:62:0: SES: handle(0x00de), sas_addr(0x5000ccab040371bc), phy(48), device_name(0x0000000000000000) [Thu Dec 12 07:46:35 2019][ 13.173723] scsi 1:0:62:0: enclosure logical id(0x5000ccab04037180), slot(60) [Thu Dec 12 07:46:35 2019][ 13.180944] scsi 1:0:62:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:35 2019][ 13.187747] scsi 1:0:62:0: serial_number(USWSJ03918EZ0028 ) [Thu Dec 12 07:46:35 2019][ 13.193583] scsi 1:0:62:0: qdepth(1), tagged(0), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:35 2019][ 13.218266] mpt3sas_cm0: expander_add: handle(0x00dc), parent(0x00da), sas_addr(0x5000ccab040371bf), phys(68) [Thu Dec 12 07:46:35 2019][ 13.239340] mpt3sas_cm0: detecting: handle(0x00df), sas_address(0x5000cca2525f2a25), phy(0) [Thu Dec 12 07:46:35 2019][ 13.247690] mpt3sas_cm0: REPORT_LUNS: handle(0x00df), retries(0) [Thu Dec 12 07:46:35 2019][ 13.253838] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00df), lun(0) [Thu Dec 12 07:46:35 2019][ 13.260425] scsi 1:0:63:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:35 2019][ 13.268729] scsi 1:0:63:0: SSP: handle(0x00df), sas_addr(0x5000cca2525f2a25), phy(0), device_name(0x5000cca2525f2a27) [Thu Dec 12 07:46:35 2019][ 13.279329] scsi 1:0:63:0: enclosure logical id(0x5000ccab04037180), slot(0) [Thu Dec 12 07:46:35 2019][ 13.286460] scsi 1:0:63:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:35 2019][ 13.293264] scsi 1:0:63:0: serial_number( 7SHPAG1W) [Thu Dec 12 07:46:36 2019][ 13.298751] scsi 1:0:63:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:36 2019][ 13.318810] mpt3sas_cm0: detecting: handle(0x00e0), sas_address(0x5000cca2525e977d), phy(1) [Thu Dec 12 07:46:36 2019][ 13.327166] mpt3sas_cm0: REPORT_LUNS: handle(0x00e0), retries(0) [Thu Dec 12 07:46:36 2019][ 13.333329] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e0), lun(0) [Thu Dec 12 07:46:36 2019][ 13.339973] scsi 1:0:64:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:36 2019][ 13.348287] scsi 1:0:64:0: SSP: handle(0x00e0), sas_addr(0x5000cca2525e977d), phy(1), device_name(0x5000cca2525e977f) [Thu Dec 12 07:46:36 2019][ 13.358889] scsi 1:0:64:0: enclosure logical id(0x5000ccab04037180), slot(2) [Thu Dec 12 07:46:36 2019][ 13.366021] scsi 1:0:64:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:36 2019][ 13.372826] scsi 
1:0:64:0: serial_number( 7SHP0P8W) [Thu Dec 12 07:46:36 2019][ 13.378314] scsi 1:0:64:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:36 2019][ 13.400809] mpt3sas_cm0: detecting: handle(0x00e1), sas_address(0x5000cca2525ed2bd), phy(2) [Thu Dec 12 07:46:36 2019][ 13.409162] mpt3sas_cm0: REPORT_LUNS: handle(0x00e1), retries(0) [Thu Dec 12 07:46:36 2019][ 13.415304] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e1), lun(0) [Thu Dec 12 07:46:36 2019][ 13.422206] scsi 1:0:65:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:36 2019][ 13.430521] scsi 1:0:65:0: SSP: handle(0x00e1), sas_addr(0x5000cca2525ed2bd), phy(2), device_name(0x5000cca2525ed2bf) [Thu Dec 12 07:46:36 2019][ 13.441117] scsi 1:0:65:0: enclosure logical id(0x5000ccab04037180), slot(11) [Thu Dec 12 07:46:36 2019][ 13.448337] scsi 1:0:65:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:36 2019][ 13.455144] scsi 1:0:65:0: serial_number( 7SHP4MLW) [Thu Dec 12 07:46:36 2019][ 13.460633] scsi 1:0:65:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:36 2019][ 13.480811] mpt3sas_cm0: detecting: handle(0x00e2), sas_address(0x5000cca2525ec049), phy(3) [Thu Dec 12 07:46:36 2019][ 13.489165] mpt3sas_cm0: REPORT_LUNS: handle(0x00e2), retries(0) [Thu Dec 12 07:46:36 2019][ 13.495338] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e2), lun(0) [Thu Dec 12 07:46:36 2019][ 13.501972] scsi 1:0:66:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:36 2019][ 13.510278] scsi 1:0:66:0: SSP: handle(0x00e2), sas_addr(0x5000cca2525ec049), phy(3), device_name(0x5000cca2525ec04b) [Thu Dec 12 07:46:36 2019][ 13.520879] scsi 1:0:66:0: enclosure logical id(0x5000ccab04037180), slot(12) [Thu Dec 12 07:46:36 2019][ 13.528098] scsi 1:0:66:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:36 2019][ 13.534905] scsi 1:0:66:0: serial_number( 7SHP3DHW) [Thu Dec 12 07:46:36 2019][ 13.540391] scsi 1:0:66:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:36 2019][ 13.563812] mpt3sas_cm0: detecting: handle(0x00e3), sas_address(0x5000cca2525ff611), phy(4) [Thu Dec 12 07:46:36 2019][ 13.572175] mpt3sas_cm0: REPORT_LUNS: handle(0x00e3), retries(0) [Thu Dec 12 07:46:36 2019][ 13.578328] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e3), lun(0) [Thu Dec 12 07:46:36 2019][ 13.584946] scsi 1:0:67:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:36 2019][ 13.593250] scsi 1:0:67:0: SSP: handle(0x00e3), sas_addr(0x5000cca2525ff611), phy(4), device_name(0x5000cca2525ff613) [Thu Dec 12 07:46:36 2019][ 13.603845] scsi 1:0:67:0: enclosure logical id(0x5000ccab04037180), slot(13) [Thu Dec 12 07:46:36 2019][ 13.611066] scsi 1:0:67:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:36 2019][ 13.617868] scsi 1:0:67:0: serial_number( 7SHPT11W) [Thu Dec 12 07:46:36 2019][ 13.623356] scsi 1:0:67:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:36 2019][ 13.645813] mpt3sas_cm0: detecting: handle(0x00e4), sas_address(0x5000cca2526016ed), phy(5) [Thu Dec 12 07:46:36 2019][ 13.654160] mpt3sas_cm0: REPORT_LUNS: handle(0x00e4), retries(0) [Thu Dec 12 07:46:36 2019][ 13.660301] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e4), lun(0) [Thu Dec 12 07:46:36 2019][ 13.666924] scsi 1:0:68:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:36 2019][ 13.675233] scsi 1:0:68:0: SSP: handle(0x00e4), 
sas_addr(0x5000cca2526016ed), phy(5), device_name(0x5000cca2526016ef) [Thu Dec 12 07:46:36 2019][ 13.685832] scsi 1:0:68:0: enclosure logical id(0x5000ccab04037180), slot(14) [Thu Dec 12 07:46:36 2019][ 13.693053] scsi 1:0:68:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:36 2019][ 13.699857] scsi 1:0:68:0: serial_number( 7SHPV6WW) [Thu Dec 12 07:46:36 2019][ 13.705344] scsi 1:0:68:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:36 2019][ 13.727811] mpt3sas_cm0: detecting: handle(0x00e5), sas_address(0x5000cca2525f4871), phy(6) [Thu Dec 12 07:46:36 2019][ 13.736165] mpt3sas_cm0: REPORT_LUNS: handle(0x00e5), retries(0) [Thu Dec 12 07:46:36 2019][ 13.742303] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e5), lun(0) [Thu Dec 12 07:46:36 2019][ 13.748915] scsi 1:0:69:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:36 2019][ 13.757221] scsi 1:0:69:0: SSP: handle(0x00e5), sas_addr(0x5000cca2525f4871), phy(6), device_name(0x5000cca2525f4873) [Thu Dec 12 07:46:36 2019][ 13.767821] scsi 1:0:69:0: enclosure logical id(0x5000ccab04037180), slot(15) [Thu Dec 12 07:46:36 2019][ 13.775040] scsi 1:0:69:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:36 2019][ 13.781844] scsi 1:0:69:0: serial_number( 7SHPDGLW) [Thu Dec 12 07:46:36 2019][ 13.787332] scsi 1:0:69:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:36 2019][ 13.809811] mpt3sas_cm0: detecting: handle(0x00e6), sas_address(0x5000cca2525f568d), phy(7) [Thu Dec 12 07:46:36 2019][ 13.818161] mpt3sas_cm0: REPORT_LUNS: handle(0x00e6), retries(0) [Thu Dec 12 07:46:36 2019][ 13.824324] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e6), lun(0) [Thu Dec 12 07:46:36 2019][ 13.830984] scsi 1:0:70:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:36 2019][ 13.839284] scsi 1:0:70:0: SSP: handle(0x00e6), sas_addr(0x5000cca2525f568d), phy(7), device_name(0x5000cca2525f568f) [Thu Dec 12 07:46:36 2019][ 13.849886] scsi 1:0:70:0: enclosure logical id(0x5000ccab04037180), slot(16) [Thu Dec 12 07:46:36 2019][ 13.857105] scsi 1:0:70:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:36 2019][ 13.863911] scsi 1:0:70:0: serial_number( 7SHPEDRW) [Thu Dec 12 07:46:36 2019][ 13.869398] scsi 1:0:70:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:36 2019][ 13.891815] mpt3sas_cm0: detecting: handle(0x00e7), sas_address(0x5000cca2525f6c25), phy(8) [Thu Dec 12 07:46:36 2019][ 13.900165] mpt3sas_cm0: REPORT_LUNS: handle(0x00e7), retries(0) [Thu Dec 12 07:46:36 2019][ 13.906333] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e7), lun(0) [Thu Dec 12 07:46:36 2019][ 13.912948] scsi 1:0:71:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:36 2019][ 13.921251] scsi 1:0:71:0: SSP: handle(0x00e7), sas_addr(0x5000cca2525f6c25), phy(8), device_name(0x5000cca2525f6c27) [Thu Dec 12 07:46:36 2019][ 13.931847] scsi 1:0:71:0: enclosure logical id(0x5000ccab04037180), slot(17) [Thu Dec 12 07:46:36 2019][ 13.939067] scsi 1:0:71:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:36 2019][ 13.945872] scsi 1:0:71:0: serial_number( 7SHPGV9W) [Thu Dec 12 07:46:36 2019][ 13.951359] scsi 1:0:71:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:36 2019][ 13.973820] mpt3sas_cm0: detecting: handle(0x00e8), sas_address(0x5000cca2525ed401), phy(9) [Thu Dec 12 07:46:36 2019][ 13.982171] mpt3sas_cm0: 
REPORT_LUNS: handle(0x00e8), retries(0) [Thu Dec 12 07:46:36 2019][ 13.988341] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e8), lun(0) [Thu Dec 12 07:46:36 2019][ 13.994943] scsi 1:0:72:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:36 2019][ 14.003242] scsi 1:0:72:0: SSP: handle(0x00e8), sas_addr(0x5000cca2525ed401), phy(9), device_name(0x5000cca2525ed403) [Thu Dec 12 07:46:36 2019][ 14.013844] scsi 1:0:72:0: enclosure logical id(0x5000ccab04037180), slot(18) [Thu Dec 12 07:46:36 2019][ 14.021061] scsi 1:0:72:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:36 2019][ 14.027867] scsi 1:0:72:0: serial_number( 7SHP4R6W) [Thu Dec 12 07:46:36 2019][ 14.033354] scsi 1:0:72:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:36 2019][ 14.058815] mpt3sas_cm0: detecting: handle(0x00e9), sas_address(0x5000cca2525e0405), phy(10) [Thu Dec 12 07:46:36 2019][ 14.067252] mpt3sas_cm0: REPORT_LUNS: handle(0x00e9), retries(0) [Thu Dec 12 07:46:36 2019][ 14.073416] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e9), lun(0) [Thu Dec 12 07:46:36 2019][ 14.089578] scsi 1:0:73:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:36 2019][ 14.097858] scsi 1:0:73:0: SSP: handle(0x00e9), sas_addr(0x5000cca2525e0405), phy(10), device_name(0x5000cca2525e0407) [Thu Dec 12 07:46:36 2019][ 14.108546] scsi 1:0:73:0: enclosure logical id(0x5000ccab04037180), slot(19) [Thu Dec 12 07:46:36 2019][ 14.115765] scsi 1:0:73:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:36 2019][ 14.122553] scsi 1:0:73:0: serial_number( 7SHNPVUW) [Thu Dec 12 07:46:36 2019][ 14.128039] scsi 1:0:73:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:36 2019][ 14.147815] mpt3sas_cm0: detecting: handle(0x00ea), sas_address(0x5000cca2525ea9e5), phy(11) [Thu Dec 12 07:46:36 2019][ 14.156254] mpt3sas_cm0: REPORT_LUNS: handle(0x00ea), retries(0) [Thu Dec 12 07:46:36 2019][ 14.162401] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ea), lun(0) [Thu Dec 12 07:46:36 2019][ 14.169039] scsi 1:0:74:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:36 2019][ 14.177345] scsi 1:0:74:0: SSP: handle(0x00ea), sas_addr(0x5000cca2525ea9e5), phy(11), device_name(0x5000cca2525ea9e7) [Thu Dec 12 07:46:36 2019][ 14.188028] scsi 1:0:74:0: enclosure logical id(0x5000ccab04037180), slot(20) [Thu Dec 12 07:46:36 2019][ 14.195247] scsi 1:0:74:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:36 2019][ 14.202052] scsi 1:0:74:0: serial_number( 7SHP1X8W) [Thu Dec 12 07:46:36 2019][ 14.207539] scsi 1:0:74:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:36 2019][ 14.230821] mpt3sas_cm0: detecting: handle(0x00eb), sas_address(0x5000cca2525f1d39), phy(12) [Thu Dec 12 07:46:36 2019][ 14.239262] mpt3sas_cm0: REPORT_LUNS: handle(0x00eb), retries(0) [Thu Dec 12 07:46:36 2019][ 14.245406] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00eb), lun(0) [Thu Dec 12 07:46:36 2019][ 14.252058] scsi 1:0:75:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:36 2019][ 14.260373] scsi 1:0:75:0: SSP: handle(0x00eb), sas_addr(0x5000cca2525f1d39), phy(12), device_name(0x5000cca2525f1d3b) [Thu Dec 12 07:46:36 2019][ 14.271056] scsi 1:0:75:0: enclosure logical id(0x5000ccab04037180), slot(21) [Thu Dec 12 07:46:36 2019][ 14.278275] scsi 1:0:75:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:36 2019][ 14.285079] scsi 1:0:75:0: 
serial_number( 7SHP9LBW) [Thu Dec 12 07:46:36 2019][ 14.290567] scsi 1:0:75:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:37 2019][ 14.310820] mpt3sas_cm0: detecting: handle(0x00ec), sas_address(0x5000cca2525ea499), phy(13) [Thu Dec 12 07:46:37 2019][ 14.319256] mpt3sas_cm0: REPORT_LUNS: handle(0x00ec), retries(0) [Thu Dec 12 07:46:37 2019][ 14.325398] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ec), lun(0) [Thu Dec 12 07:46:37 2019][ 14.332033] scsi 1:0:76:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:37 2019][ 14.340337] scsi 1:0:76:0: SSP: handle(0x00ec), sas_addr(0x5000cca2525ea499), phy(13), device_name(0x5000cca2525ea49b) [Thu Dec 12 07:46:37 2019][ 14.351022] scsi 1:0:76:0: enclosure logical id(0x5000ccab04037180), slot(22) [Thu Dec 12 07:46:37 2019][ 14.358243] scsi 1:0:76:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:37 2019][ 14.365047] scsi 1:0:76:0: serial_number( 7SHP1KAW) [Thu Dec 12 07:46:37 2019][ 14.370537] scsi 1:0:76:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:37 2019][ 14.393817] mpt3sas_cm0: detecting: handle(0x00ed), sas_address(0x5000cca2525fba05), phy(14) [Thu Dec 12 07:46:37 2019][ 14.402258] mpt3sas_cm0: REPORT_LUNS: handle(0x00ed), retries(0) [Thu Dec 12 07:46:37 2019][ 14.408425] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ed), lun(0) [Thu Dec 12 07:46:37 2019][ 14.415084] scsi 1:0:77:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:37 2019][ 14.423387] scsi 1:0:77:0: SSP: handle(0x00ed), sas_addr(0x5000cca2525fba05), phy(14), device_name(0x5000cca2525fba07) [Thu Dec 12 07:46:37 2019][ 14.434078] scsi 1:0:77:0: enclosure logical id(0x5000ccab04037180), slot(23) [Thu Dec 12 07:46:37 2019][ 14.441296] scsi 1:0:77:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:37 2019][ 14.448099] scsi 1:0:77:0: serial_number( 7SHPN12W) [Thu Dec 12 07:46:37 2019][ 14.453589] scsi 1:0:77:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:37 2019][ 14.473818] mpt3sas_cm0: detecting: handle(0x00ee), sas_address(0x5000cca2525e121d), phy(15) [Thu Dec 12 07:46:37 2019][ 14.482253] mpt3sas_cm0: REPORT_LUNS: handle(0x00ee), retries(0) [Thu Dec 12 07:46:37 2019][ 14.488396] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ee), lun(0) [Thu Dec 12 07:46:37 2019][ 14.495061] scsi 1:0:78:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:37 2019][ 14.503380] scsi 1:0:78:0: SSP: handle(0x00ee), sas_addr(0x5000cca2525e121d), phy(15), device_name(0x5000cca2525e121f) [Thu Dec 12 07:46:37 2019][ 14.514061] scsi 1:0:78:0: enclosure logical id(0x5000ccab04037180), slot(24) [Thu Dec 12 07:46:37 2019][ 14.521281] scsi 1:0:78:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:37 2019][ 14.528086] scsi 1:0:78:0: serial_number( 7SHNRTXW) [Thu Dec 12 07:46:37 2019][ 14.533573] scsi 1:0:78:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:37 2019][ 14.553825] mpt3sas_cm0: detecting: handle(0x00ef), sas_address(0x5000cca2525e98f5), phy(16) [Thu Dec 12 07:46:37 2019][ 14.562274] mpt3sas_cm0: REPORT_LUNS: handle(0x00ef), retries(0) [Thu Dec 12 07:46:37 2019][ 14.568407] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ef), lun(0) [Thu Dec 12 07:46:37 2019][ 14.575040] scsi 1:0:79:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:37 2019][ 14.583343] scsi 1:0:79:0: SSP: handle(0x00ef), 
sas_addr(0x5000cca2525e98f5), phy(16), device_name(0x5000cca2525e98f7) [Thu Dec 12 07:46:37 2019][ 14.594031] scsi 1:0:79:0: enclosure logical id(0x5000ccab04037180), slot(25) [Thu Dec 12 07:46:37 2019][ 14.601249] scsi 1:0:79:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:37 2019][ 14.608055] scsi 1:0:79:0: serial_number( 7SHP0T9W) [Thu Dec 12 07:46:37 2019][ 14.613542] scsi 1:0:79:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:37 2019][ 14.633818] mpt3sas_cm0: detecting: handle(0x00f0), sas_address(0x5000cca2525f8175), phy(17) [Thu Dec 12 07:46:37 2019][ 14.642256] mpt3sas_cm0: REPORT_LUNS: handle(0x00f0), retries(0) [Thu Dec 12 07:46:37 2019][ 14.648399] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f0), lun(0) [Thu Dec 12 07:46:37 2019][ 14.655058] scsi 1:0:80:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:37 2019][ 14.663394] scsi 1:0:80:0: SSP: handle(0x00f0), sas_addr(0x5000cca2525f8175), phy(17), device_name(0x5000cca2525f8177) [Thu Dec 12 07:46:37 2019][ 14.674077] scsi 1:0:80:0: enclosure logical id(0x5000ccab04037180), slot(26) [Thu Dec 12 07:46:37 2019][ 14.681296] scsi 1:0:80:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:37 2019][ 14.688100] scsi 1:0:80:0: serial_number( 7SHPJ89W) [Thu Dec 12 07:46:37 2019][ 14.693589] scsi 1:0:80:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:37 2019][ 14.713824] mpt3sas_cm0: detecting: handle(0x00f1), sas_address(0x5000cca2525fb01d), phy(18) [Thu Dec 12 07:46:37 2019][ 14.722261] mpt3sas_cm0: REPORT_LUNS: handle(0x00f1), retries(0) [Thu Dec 12 07:46:37 2019][ 14.728424] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f1), lun(0) [Thu Dec 12 07:46:37 2019][ 14.735026] scsi 1:0:81:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:37 2019][ 14.743330] scsi 1:0:81:0: SSP: handle(0x00f1), sas_addr(0x5000cca2525fb01d), phy(18), device_name(0x5000cca2525fb01f) [Thu Dec 12 07:46:37 2019][ 14.754019] scsi 1:0:81:0: enclosure logical id(0x5000ccab04037180), slot(27) [Thu Dec 12 07:46:37 2019][ 14.761238] scsi 1:0:81:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:37 2019][ 14.768044] scsi 1:0:81:0: serial_number( 7SHPMBMW) [Thu Dec 12 07:46:37 2019][ 14.773529] scsi 1:0:81:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:37 2019][ 14.793820] mpt3sas_cm0: detecting: handle(0x00f2), sas_address(0x5000cca2525ed549), phy(19) [Thu Dec 12 07:46:37 2019][ 14.802252] mpt3sas_cm0: REPORT_LUNS: handle(0x00f2), retries(0) [Thu Dec 12 07:46:37 2019][ 14.808393] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f2), lun(0) [Thu Dec 12 07:46:37 2019][ 14.815048] scsi 1:0:82:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:37 2019][ 14.823354] scsi 1:0:82:0: SSP: handle(0x00f2), sas_addr(0x5000cca2525ed549), phy(19), device_name(0x5000cca2525ed54b) [Thu Dec 12 07:46:37 2019][ 14.834040] scsi 1:0:82:0: enclosure logical id(0x5000ccab04037180), slot(28) [Thu Dec 12 07:46:37 2019][ 14.841258] scsi 1:0:82:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:37 2019][ 14.848062] scsi 1:0:82:0: serial_number( 7SHP4TVW) [Thu Dec 12 07:46:37 2019][ 14.853550] scsi 1:0:82:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:37 2019][ 14.876820] mpt3sas_cm0: detecting: handle(0x00f3), sas_address(0x5000cca2525fa035), phy(20) [Thu Dec 12 07:46:37 2019][ 14.885254] 
mpt3sas_cm0: REPORT_LUNS: handle(0x00f3), retries(0) [Thu Dec 12 07:46:37 2019][ 14.891386] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f3), lun(0) [Thu Dec 12 07:46:37 2019][ 14.898029] scsi 1:0:83:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:37 2019][ 14.906325] scsi 1:0:83:0: SSP: handle(0x00f3), sas_addr(0x5000cca2525fa035), phy(20), device_name(0x5000cca2525fa037) [Thu Dec 12 07:46:37 2019][ 14.917015] scsi 1:0:83:0: enclosure logical id(0x5000ccab04037180), slot(29) [Thu Dec 12 07:46:37 2019][ 14.924234] scsi 1:0:83:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:37 2019][ 14.931039] scsi 1:0:83:0: serial_number( 7SHPL9TW) [Thu Dec 12 07:46:37 2019][ 14.936525] scsi 1:0:83:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:37 2019][ 14.958825] mpt3sas_cm0: detecting: handle(0x00f4), sas_address(0x5000cca2525fb941), phy(21) [Thu Dec 12 07:46:37 2019][ 14.967263] mpt3sas_cm0: REPORT_LUNS: handle(0x00f4), retries(0) [Thu Dec 12 07:46:37 2019][ 14.973403] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f4), lun(0) [Thu Dec 12 07:46:37 2019][ 14.980026] scsi 1:0:84:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:37 2019][ 14.988330] scsi 1:0:84:0: SSP: handle(0x00f4), sas_addr(0x5000cca2525fb941), phy(21), device_name(0x5000cca2525fb943) [Thu Dec 12 07:46:37 2019][ 14.999017] scsi 1:0:84:0: enclosure logical id(0x5000ccab04037180), slot(30) [Thu Dec 12 07:46:37 2019][ 15.006237] scsi 1:0:84:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:37 2019][ 15.013042] scsi 1:0:84:0: serial_number( 7SHPMZHW) [Thu Dec 12 07:46:37 2019][ 15.018532] scsi 1:0:84:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:37 2019][ 15.042413] mpt3sas_cm0: detecting: handle(0x00f5), sas_address(0x5000cca2525e22e5), phy(22) [Thu Dec 12 07:46:37 2019][ 15.050852] mpt3sas_cm0: REPORT_LUNS: handle(0x00f5), retries(0) [Thu Dec 12 07:46:37 2019][ 15.056997] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f5), lun(0) [Thu Dec 12 07:46:37 2019][ 15.063653] scsi 1:0:85:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:37 2019][ 15.071945] scsi 1:0:85:0: SSP: handle(0x00f5), sas_addr(0x5000cca2525e22e5), phy(22), device_name(0x5000cca2525e22e7) [Thu Dec 12 07:46:37 2019][ 15.082635] scsi 1:0:85:0: enclosure logical id(0x5000ccab04037180), slot(31) [Thu Dec 12 07:46:37 2019][ 15.089856] scsi 1:0:85:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:37 2019][ 15.096660] scsi 1:0:85:0: serial_number( 7SHNSXKW) [Thu Dec 12 07:46:37 2019][ 15.102147] scsi 1:0:85:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:37 2019][ 15.124840] mpt3sas_cm0: detecting: handle(0x00f6), sas_address(0x5000cca2525fb5bd), phy(23) [Thu Dec 12 07:46:37 2019][ 15.133283] mpt3sas_cm0: REPORT_LUNS: handle(0x00f6), retries(0) [Thu Dec 12 07:46:37 2019][ 15.139422] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f6), lun(0) [Thu Dec 12 07:46:37 2019][ 15.146028] scsi 1:0:86:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:37 2019][ 15.154328] scsi 1:0:86:0: SSP: handle(0x00f6), sas_addr(0x5000cca2525fb5bd), phy(23), device_name(0x5000cca2525fb5bf) [Thu Dec 12 07:46:37 2019][ 15.165012] scsi 1:0:86:0: enclosure logical id(0x5000ccab04037180), slot(32) [Thu Dec 12 07:46:37 2019][ 15.172234] scsi 1:0:86:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:37 2019][ 15.179038] scsi 
1:0:86:0: serial_number( 7SHPMS7W) [Thu Dec 12 07:46:37 2019][ 15.184525] scsi 1:0:86:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:37 2019][ 15.206828] mpt3sas_cm0: detecting: handle(0x00f7), sas_address(0x5000cca2525eb77d), phy(24) [Thu Dec 12 07:46:37 2019][ 15.215269] mpt3sas_cm0: REPORT_LUNS: handle(0x00f7), retries(0) [Thu Dec 12 07:46:37 2019][ 15.221438] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f7), lun(0) [Thu Dec 12 07:46:37 2019][ 15.228063] scsi 1:0:87:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:37 2019][ 15.236367] scsi 1:0:87:0: SSP: handle(0x00f7), sas_addr(0x5000cca2525eb77d), phy(24), device_name(0x5000cca2525eb77f) [Thu Dec 12 07:46:37 2019][ 15.247052] scsi 1:0:87:0: enclosure logical id(0x5000ccab04037180), slot(33) [Thu Dec 12 07:46:37 2019][ 15.254271] scsi 1:0:87:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:37 2019][ 15.261079] scsi 1:0:87:0: serial_number( 7SHP2UAW) [Thu Dec 12 07:46:37 2019][ 15.266565] scsi 1:0:87:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:37 2019][ 15.288828] mpt3sas_cm0: detecting: handle(0x00f8), sas_address(0x5000cca2525e1139), phy(25) [Thu Dec 12 07:46:38 2019][ 15.297264] mpt3sas_cm0: REPORT_LUNS: handle(0x00f8), retries(0) [Thu Dec 12 07:46:38 2019][ 15.303407] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f8), lun(0) [Thu Dec 12 07:46:38 2019][ 15.310037] scsi 1:0:88:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:38 2019][ 15.318335] scsi 1:0:88:0: SSP: handle(0x00f8), sas_addr(0x5000cca2525e1139), phy(25), device_name(0x5000cca2525e113b) [Thu Dec 12 07:46:38 2019][ 15.329021] scsi 1:0:88:0: enclosure logical id(0x5000ccab04037180), slot(34) [Thu Dec 12 07:46:38 2019][ 15.336243] scsi 1:0:88:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:38 2019][ 15.343049] scsi 1:0:88:0: serial_number( 7SHNRS2W) [Thu Dec 12 07:46:38 2019][ 15.348535] scsi 1:0:88:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:38 2019][ 15.370827] mpt3sas_cm0: detecting: handle(0x00f9), sas_address(0x5000cca2526014f9), phy(26) [Thu Dec 12 07:46:38 2019][ 15.379270] mpt3sas_cm0: REPORT_LUNS: handle(0x00f9), retries(0) [Thu Dec 12 07:46:38 2019][ 15.385413] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f9), lun(0) [Thu Dec 12 07:46:38 2019][ 15.392054] scsi 1:0:89:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:38 2019][ 15.400366] scsi 1:0:89:0: SSP: handle(0x00f9), sas_addr(0x5000cca2526014f9), phy(26), device_name(0x5000cca2526014fb) [Thu Dec 12 07:46:38 2019][ 15.411054] scsi 1:0:89:0: enclosure logical id(0x5000ccab04037180), slot(35) [Thu Dec 12 07:46:38 2019][ 15.418271] scsi 1:0:89:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:38 2019][ 15.425078] scsi 1:0:89:0: serial_number( 7SHPV2VW) [Thu Dec 12 07:46:38 2019][ 15.430565] scsi 1:0:89:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:38 2019][ 15.452828] mpt3sas_cm0: detecting: handle(0x00fa), sas_address(0x5000cca252598785), phy(27) [Thu Dec 12 07:46:38 2019][ 15.461268] mpt3sas_cm0: REPORT_LUNS: handle(0x00fa), retries(0) [Thu Dec 12 07:46:38 2019][ 15.467417] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00fa), lun(0) [Thu Dec 12 07:46:38 2019][ 15.474075] scsi 1:0:90:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:38 2019][ 15.482389] scsi 1:0:90:0: SSP: 
handle(0x00fa), sas_addr(0x5000cca252598785), phy(27), device_name(0x5000cca252598787) [Thu Dec 12 07:46:38 2019][ 15.493074] scsi 1:0:90:0: enclosure logical id(0x5000ccab04037180), slot(36) [Thu Dec 12 07:46:38 2019][ 15.500294] scsi 1:0:90:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:38 2019][ 15.507099] scsi 1:0:90:0: serial_number( 7SHL7BRW) [Thu Dec 12 07:46:38 2019][ 15.512587] scsi 1:0:90:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:38 2019][ 15.534831] mpt3sas_cm0: detecting: handle(0x00fb), sas_address(0x5000cca2525f5365), phy(28) [Thu Dec 12 07:46:38 2019][ 15.543270] mpt3sas_cm0: REPORT_LUNS: handle(0x00fb), retries(0) [Thu Dec 12 07:46:38 2019][ 15.549419] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00fb), lun(0) [Thu Dec 12 07:46:38 2019][ 15.556073] scsi 1:0:91:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:38 2019][ 15.564387] scsi 1:0:91:0: SSP: handle(0x00fb), sas_addr(0x5000cca2525f5365), phy(28), device_name(0x5000cca2525f5367) [Thu Dec 12 07:46:38 2019][ 15.575072] scsi 1:0:91:0: enclosure logical id(0x5000ccab04037180), slot(37) [Thu Dec 12 07:46:38 2019][ 15.582290] scsi 1:0:91:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:38 2019][ 15.589100] scsi 1:0:91:0: serial_number( 7SHPE66W) [Thu Dec 12 07:46:38 2019][ 15.594594] scsi 1:0:91:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:38 2019][ 15.617831] mpt3sas_cm0: detecting: handle(0x00fc), sas_address(0x5000cca2525e263d), phy(29) [Thu Dec 12 07:46:38 2019][ 15.626274] mpt3sas_cm0: REPORT_LUNS: handle(0x00fc), retries(0) [Thu Dec 12 07:46:38 2019][ 15.632423] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00fc), lun(0) [Thu Dec 12 07:46:38 2019][ 15.639079] scsi 1:0:92:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:38 2019][ 15.647388] scsi 1:0:92:0: SSP: handle(0x00fc), sas_addr(0x5000cca2525e263d), phy(29), device_name(0x5000cca2525e263f) [Thu Dec 12 07:46:38 2019][ 15.658074] scsi 1:0:92:0: enclosure logical id(0x5000ccab04037180), slot(38) [Thu Dec 12 07:46:38 2019][ 15.665292] scsi 1:0:92:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:38 2019][ 15.672099] scsi 1:0:92:0: serial_number( 7SHNT4GW) [Thu Dec 12 07:46:38 2019][ 15.677587] scsi 1:0:92:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:38 2019][ 15.699830] mpt3sas_cm0: detecting: handle(0x00fd), sas_address(0x5000cca2525f6081), phy(30) [Thu Dec 12 07:46:38 2019][ 15.708266] mpt3sas_cm0: REPORT_LUNS: handle(0x00fd), retries(0) [Thu Dec 12 07:46:38 2019][ 15.714425] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00fd), lun(0) [Thu Dec 12 07:46:38 2019][ 15.721044] scsi 1:0:93:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:38 2019][ 15.729347] scsi 1:0:93:0: SSP: handle(0x00fd), sas_addr(0x5000cca2525f6081), phy(30), device_name(0x5000cca2525f6083) [Thu Dec 12 07:46:38 2019][ 15.740035] scsi 1:0:93:0: enclosure logical id(0x5000ccab04037180), slot(39) [Thu Dec 12 07:46:38 2019][ 15.747252] scsi 1:0:93:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:38 2019][ 15.754060] scsi 1:0:93:0: serial_number( 7SHPG28W) [Thu Dec 12 07:46:38 2019][ 15.759545] scsi 1:0:93:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:38 2019][ 15.782828] mpt3sas_cm0: detecting: handle(0x00fe), sas_address(0x5000cca2525ec83d), phy(31) [Thu Dec 12 07:46:38 2019][ 
15.791268] mpt3sas_cm0: REPORT_LUNS: handle(0x00fe), retries(0) [Thu Dec 12 07:46:38 2019][ 15.797411] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00fe), lun(0) [Thu Dec 12 07:46:38 2019][ 15.804065] scsi 1:0:94:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:38 2019][ 15.812375] scsi 1:0:94:0: SSP: handle(0x00fe), sas_addr(0x5000cca2525ec83d), phy(31), device_name(0x5000cca2525ec83f) [Thu Dec 12 07:46:38 2019][ 15.823062] scsi 1:0:94:0: enclosure logical id(0x5000ccab04037180), slot(40) [Thu Dec 12 07:46:38 2019][ 15.830282] scsi 1:0:94:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:38 2019][ 15.837087] scsi 1:0:94:0: serial_number( 7SHP3XXW) [Thu Dec 12 07:46:38 2019][ 15.842575] scsi 1:0:94:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:38 2019][ 15.864835] mpt3sas_cm0: detecting: handle(0x00ff), sas_address(0x5000cca2525ec019), phy(32) [Thu Dec 12 07:46:38 2019][ 15.873276] mpt3sas_cm0: REPORT_LUNS: handle(0x00ff), retries(0) [Thu Dec 12 07:46:38 2019][ 15.879427] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ff), lun(0) [Thu Dec 12 07:46:38 2019][ 15.886088] scsi 1:0:95:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:38 2019][ 15.894405] scsi 1:0:95:0: SSP: handle(0x00ff), sas_addr(0x5000cca2525ec019), phy(32), device_name(0x5000cca2525ec01b) [Thu Dec 12 07:46:38 2019][ 15.905092] scsi 1:0:95:0: enclosure logical id(0x5000ccab04037180), slot(41) [Thu Dec 12 07:46:38 2019][ 15.912312] scsi 1:0:95:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:38 2019][ 15.919117] scsi 1:0:95:0: serial_number( 7SHP3D3W) [Thu Dec 12 07:46:38 2019][ 15.924605] scsi 1:0:95:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:38 2019][ 15.944834] mpt3sas_cm0: detecting: handle(0x0100), sas_address(0x5000cca2525ec559), phy(33) [Thu Dec 12 07:46:38 2019][ 15.953271] mpt3sas_cm0: REPORT_LUNS: handle(0x0100), retries(0) [Thu Dec 12 07:46:38 2019][ 15.959408] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0100), lun(0) [Thu Dec 12 07:46:38 2019][ 15.966022] scsi 1:0:96:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:38 2019][ 15.974325] scsi 1:0:96:0: SSP: handle(0x0100), sas_addr(0x5000cca2525ec559), phy(33), device_name(0x5000cca2525ec55b) [Thu Dec 12 07:46:38 2019][ 15.985009] scsi 1:0:96:0: enclosure logical id(0x5000ccab04037180), slot(42) [Thu Dec 12 07:46:38 2019][ 15.992227] scsi 1:0:96:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:38 2019][ 15.999032] scsi 1:0:96:0: serial_number( 7SHP3RYW) [Thu Dec 12 07:46:38 2019][ 16.004522] scsi 1:0:96:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:38 2019][ 16.026834] mpt3sas_cm0: detecting: handle(0x0101), sas_address(0x5000cca2525fd4a1), phy(34) [Thu Dec 12 07:46:38 2019][ 16.035275] mpt3sas_cm0: REPORT_LUNS: handle(0x0101), retries(0) [Thu Dec 12 07:46:38 2019][ 16.041433] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0101), lun(0) [Thu Dec 12 07:46:38 2019][ 16.048059] scsi 1:0:97:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:38 2019][ 16.056368] scsi 1:0:97:0: SSP: handle(0x0101), sas_addr(0x5000cca2525fd4a1), phy(34), device_name(0x5000cca2525fd4a3) [Thu Dec 12 07:46:38 2019][ 16.067056] scsi 1:0:97:0: enclosure logical id(0x5000ccab04037180), slot(43) [Thu Dec 12 07:46:38 2019][ 16.074276] scsi 1:0:97:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:38 2019][ 
16.081084] scsi 1:0:97:0: serial_number( 7SHPPU0W) [Thu Dec 12 07:46:38 2019][ 16.086569] scsi 1:0:97:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:38 2019][ 16.108833] mpt3sas_cm0: detecting: handle(0x0102), sas_address(0x5000cca2525eb5f5), phy(35) [Thu Dec 12 07:46:38 2019][ 16.117270] mpt3sas_cm0: REPORT_LUNS: handle(0x0102), retries(0) [Thu Dec 12 07:46:38 2019][ 16.123407] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0102), lun(0) [Thu Dec 12 07:46:38 2019][ 16.130064] scsi 1:0:98:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:38 2019][ 16.138368] scsi 1:0:98:0: SSP: handle(0x0102), sas_addr(0x5000cca2525eb5f5), phy(35), device_name(0x5000cca2525eb5f7) [Thu Dec 12 07:46:38 2019][ 16.149054] scsi 1:0:98:0: enclosure logical id(0x5000ccab04037180), slot(44) [Thu Dec 12 07:46:38 2019][ 16.156273] scsi 1:0:98:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:38 2019][ 16.163077] scsi 1:0:98:0: serial_number( 7SHP2R5W) [Thu Dec 12 07:46:38 2019][ 16.168565] scsi 1:0:98:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:38 2019][ 16.190838] mpt3sas_cm0: detecting: handle(0x0103), sas_address(0x5000cca2525ebeb1), phy(36) [Thu Dec 12 07:46:38 2019][ 16.199276] mpt3sas_cm0: REPORT_LUNS: handle(0x0103), retries(0) [Thu Dec 12 07:46:38 2019][ 16.205411] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0103), lun(0) [Thu Dec 12 07:46:38 2019][ 16.212051] scsi 1:0:99:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:38 2019][ 16.220354] scsi 1:0:99:0: SSP: handle(0x0103), sas_addr(0x5000cca2525ebeb1), phy(36), device_name(0x5000cca2525ebeb3) [Thu Dec 12 07:46:38 2019][ 16.231040] scsi 1:0:99:0: enclosure logical id(0x5000ccab04037180), slot(45) [Thu Dec 12 07:46:38 2019][ 16.238260] scsi 1:0:99:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:38 2019][ 16.245065] scsi 1:0:99:0: serial_number( 7SHP396W) [Thu Dec 12 07:46:38 2019][ 16.250552] scsi 1:0:99:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:38 2019][ 16.272836] mpt3sas_cm0: detecting: handle(0x0104), sas_address(0x5000cca2525f2919), phy(37) [Thu Dec 12 07:46:38 2019][ 16.281280] mpt3sas_cm0: REPORT_LUNS: handle(0x0104), retries(0) [Thu Dec 12 07:46:39 2019][ 16.287417] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0104), lun(0) [Thu Dec 12 07:46:39 2019][ 16.298608] scsi 1:0:100:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:39 2019][ 16.307167] scsi 1:0:100:0: SSP: handle(0x0104), sas_addr(0x5000cca2525f2919), phy(37), device_name(0x5000cca2525f291b) [Thu Dec 12 07:46:39 2019][ 16.317942] scsi 1:0:100:0: enclosure logical id(0x5000ccab04037180), slot(46) [Thu Dec 12 07:46:39 2019][ 16.325249] scsi 1:0:100:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:39 2019][ 16.332140] scsi 1:0:100:0: serial_number( 7SHPABWW) [Thu Dec 12 07:46:39 2019][ 16.337713] scsi 1:0:100:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:39 2019][ 16.357835] mpt3sas_cm0: detecting: handle(0x0105), sas_address(0x5000cca252602c0d), phy(38) [Thu Dec 12 07:46:39 2019][ 16.366276] mpt3sas_cm0: REPORT_LUNS: handle(0x0105), retries(0) [Thu Dec 12 07:46:39 2019][ 16.372417] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0105), lun(0) [Thu Dec 12 07:46:39 2019][ 16.379047] scsi 1:0:101:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:39 2019][ 16.387439] scsi 
1:0:101:0: SSP: handle(0x0105), sas_addr(0x5000cca252602c0d), phy(38), device_name(0x5000cca252602c0f) [Thu Dec 12 07:46:39 2019][ 16.398213] scsi 1:0:101:0: enclosure logical id(0x5000ccab04037180), slot(47) [Thu Dec 12 07:46:39 2019][ 16.405518] scsi 1:0:101:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:39 2019][ 16.412412] scsi 1:0:101:0: serial_number( 7SHPWMHW) [Thu Dec 12 07:46:39 2019][ 16.417984] scsi 1:0:101:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:39 2019][ 16.437838] mpt3sas_cm0: detecting: handle(0x0106), sas_address(0x5000cca2525e7cfd), phy(39) [Thu Dec 12 07:46:39 2019][ 16.446278] mpt3sas_cm0: REPORT_LUNS: handle(0x0106), retries(0) [Thu Dec 12 07:46:39 2019][ 16.452424] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0106), lun(0) [Thu Dec 12 07:46:39 2019][ 16.459098] scsi 1:0:102:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:39 2019][ 16.467491] scsi 1:0:102:0: SSP: handle(0x0106), sas_addr(0x5000cca2525e7cfd), phy(39), device_name(0x5000cca2525e7cff) [Thu Dec 12 07:46:39 2019][ 16.478259] scsi 1:0:102:0: enclosure logical id(0x5000ccab04037180), slot(48) [Thu Dec 12 07:46:39 2019][ 16.485565] scsi 1:0:102:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:39 2019][ 16.492458] scsi 1:0:102:0: serial_number( 7SHNYXKW) [Thu Dec 12 07:46:39 2019][ 16.498032] scsi 1:0:102:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:39 2019][ 16.517837] mpt3sas_cm0: detecting: handle(0x0107), sas_address(0x5000cca2525f6a31), phy(40) [Thu Dec 12 07:46:39 2019][ 16.526279] mpt3sas_cm0: REPORT_LUNS: handle(0x0107), retries(0) [Thu Dec 12 07:46:39 2019][ 16.532429] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0107), lun(0) [Thu Dec 12 07:46:39 2019][ 16.539085] scsi 1:0:103:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:39 2019][ 16.547482] scsi 1:0:103:0: SSP: handle(0x0107), sas_addr(0x5000cca2525f6a31), phy(40), device_name(0x5000cca2525f6a33) [Thu Dec 12 07:46:39 2019][ 16.558254] scsi 1:0:103:0: enclosure logical id(0x5000ccab04037180), slot(49) [Thu Dec 12 07:46:39 2019][ 16.565560] scsi 1:0:103:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:39 2019][ 16.572460] scsi 1:0:103:0: serial_number( 7SHPGR8W) [Thu Dec 12 07:46:39 2019][ 16.578037] scsi 1:0:103:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:39 2019][ 16.600837] mpt3sas_cm0: detecting: handle(0x0108), sas_address(0x5000cca2525f7f25), phy(41) [Thu Dec 12 07:46:39 2019][ 16.609274] mpt3sas_cm0: REPORT_LUNS: handle(0x0108), retries(0) [Thu Dec 12 07:46:39 2019][ 16.615418] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0108), lun(0) [Thu Dec 12 07:46:39 2019][ 16.622073] scsi 1:0:104:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:39 2019][ 16.630469] scsi 1:0:104:0: SSP: handle(0x0108), sas_addr(0x5000cca2525f7f25), phy(41), device_name(0x5000cca2525f7f27) [Thu Dec 12 07:46:39 2019][ 16.641239] scsi 1:0:104:0: enclosure logical id(0x5000ccab04037180), slot(50) [Thu Dec 12 07:46:39 2019][ 16.648545] scsi 1:0:104:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:39 2019][ 16.655436] scsi 1:0:104:0: serial_number( 7SHPJ3JW) [Thu Dec 12 07:46:39 2019][ 16.661009] scsi 1:0:104:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:39 2019][ 16.680841] mpt3sas_cm0: detecting: handle(0x0109), sas_address(0x5000cca2525eb4b1), 
phy(42) [Thu Dec 12 07:46:39 2019][ 16.689284] mpt3sas_cm0: REPORT_LUNS: handle(0x0109), retries(0) [Thu Dec 12 07:46:39 2019][ 16.695446] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0109), lun(0) [Thu Dec 12 07:46:39 2019][ 16.702085] scsi 1:0:105:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:39 2019][ 16.710479] scsi 1:0:105:0: SSP: handle(0x0109), sas_addr(0x5000cca2525eb4b1), phy(42), device_name(0x5000cca2525eb4b3) [Thu Dec 12 07:46:39 2019][ 16.721249] scsi 1:0:105:0: enclosure logical id(0x5000ccab04037180), slot(51) [Thu Dec 12 07:46:39 2019][ 16.728556] scsi 1:0:105:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:39 2019][ 16.735447] scsi 1:0:105:0: serial_number( 7SHP2MKW) [Thu Dec 12 07:46:39 2019][ 16.741019] scsi 1:0:105:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:39 2019][ 16.760840] mpt3sas_cm0: detecting: handle(0x010a), sas_address(0x5000cca2525e1f9d), phy(43) [Thu Dec 12 07:46:39 2019][ 16.769277] mpt3sas_cm0: REPORT_LUNS: handle(0x010a), retries(0) [Thu Dec 12 07:46:39 2019][ 16.775436] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010a), lun(0) [Thu Dec 12 07:46:39 2019][ 16.782104] scsi 1:0:106:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:39 2019][ 16.790506] scsi 1:0:106:0: SSP: handle(0x010a), sas_addr(0x5000cca2525e1f9d), phy(43), device_name(0x5000cca2525e1f9f) [Thu Dec 12 07:46:39 2019][ 16.801279] scsi 1:0:106:0: enclosure logical id(0x5000ccab04037180), slot(52) [Thu Dec 12 07:46:39 2019][ 16.808583] scsi 1:0:106:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:39 2019][ 16.815476] scsi 1:0:106:0: serial_number( 7SHNSPTW) [Thu Dec 12 07:46:39 2019][ 16.821050] scsi 1:0:106:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:39 2019][ 16.844431] mpt3sas_cm0: detecting: handle(0x010b), sas_address(0x5000cca2525e52fd), phy(44) [Thu Dec 12 07:46:39 2019][ 16.852869] mpt3sas_cm0: REPORT_LUNS: handle(0x010b), retries(0) [Thu Dec 12 07:46:39 2019][ 16.859011] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010b), lun(0) [Thu Dec 12 07:46:39 2019][ 16.865633] scsi 1:0:107:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:39 2019][ 16.874028] scsi 1:0:107:0: SSP: handle(0x010b), sas_addr(0x5000cca2525e52fd), phy(44), device_name(0x5000cca2525e52ff) [Thu Dec 12 07:46:39 2019][ 16.884799] scsi 1:0:107:0: enclosure logical id(0x5000ccab04037180), slot(53) [Thu Dec 12 07:46:39 2019][ 16.892106] scsi 1:0:107:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:39 2019][ 16.898998] scsi 1:0:107:0: serial_number( 7SHNW3VW) [Thu Dec 12 07:46:39 2019][ 16.904570] scsi 1:0:107:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:39 2019][ 16.924839] mpt3sas_cm0: detecting: handle(0x010c), sas_address(0x5000cca2525f4e71), phy(45) [Thu Dec 12 07:46:39 2019][ 16.933278] mpt3sas_cm0: REPORT_LUNS: handle(0x010c), retries(0) [Thu Dec 12 07:46:39 2019][ 16.939419] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010c), lun(0) [Thu Dec 12 07:46:39 2019][ 16.946037] scsi 1:0:108:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:39 2019][ 16.954429] scsi 1:0:108:0: SSP: handle(0x010c), sas_addr(0x5000cca2525f4e71), phy(45), device_name(0x5000cca2525f4e73) [Thu Dec 12 07:46:39 2019][ 16.965201] scsi 1:0:108:0: enclosure logical id(0x5000ccab04037180), slot(54) [Thu Dec 12 07:46:39 2019][ 16.972508] scsi 1:0:108:0: enclosure 
level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:39 2019][ 16.979399] scsi 1:0:108:0: serial_number( 7SHPDVZW) [Thu Dec 12 07:46:39 2019][ 16.984974] scsi 1:0:108:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:39 2019][ 17.004841] mpt3sas_cm0: detecting: handle(0x010d), sas_address(0x5000cca2525fd499), phy(46) [Thu Dec 12 07:46:39 2019][ 17.013281] mpt3sas_cm0: REPORT_LUNS: handle(0x010d), retries(0) [Thu Dec 12 07:46:39 2019][ 17.019421] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010d), lun(0) [Thu Dec 12 07:46:39 2019][ 17.026055] scsi 1:0:109:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:39 2019][ 17.034444] scsi 1:0:109:0: SSP: handle(0x010d), sas_addr(0x5000cca2525fd499), phy(46), device_name(0x5000cca2525fd49b) [Thu Dec 12 07:46:39 2019][ 17.045212] scsi 1:0:109:0: enclosure logical id(0x5000ccab04037180), slot(55) [Thu Dec 12 07:46:39 2019][ 17.052519] scsi 1:0:109:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:39 2019][ 17.059410] scsi 1:0:109:0: serial_number( 7SHPPTYW) [Thu Dec 12 07:46:39 2019][ 17.064988] scsi 1:0:109:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:39 2019][ 17.087841] mpt3sas_cm0: detecting: handle(0x010e), sas_address(0x5000cca2525e7879), phy(47) [Thu Dec 12 07:46:39 2019][ 17.096283] mpt3sas_cm0: REPORT_LUNS: handle(0x010e), retries(0) [Thu Dec 12 07:46:39 2019][ 17.102455] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010e), lun(0) [Thu Dec 12 07:46:39 2019][ 17.109119] scsi 1:0:110:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:39 2019][ 17.117517] scsi 1:0:110:0: SSP: handle(0x010e), sas_addr(0x5000cca2525e7879), phy(47), device_name(0x5000cca2525e787b) [Thu Dec 12 07:46:39 2019][ 17.128291] scsi 1:0:110:0: enclosure logical id(0x5000ccab04037180), slot(56) [Thu Dec 12 07:46:39 2019][ 17.135599] scsi 1:0:110:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:39 2019][ 17.142489] scsi 1:0:110:0: serial_number( 7SHNYM7W) [Thu Dec 12 07:46:39 2019][ 17.148062] scsi 1:0:110:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:39 2019][ 17.170848] mpt3sas_cm0: detecting: handle(0x010f), sas_address(0x5000cca2525ca199), phy(48) [Thu Dec 12 07:46:39 2019][ 17.179286] mpt3sas_cm0: REPORT_LUNS: handle(0x010f), retries(0) [Thu Dec 12 07:46:39 2019][ 17.185457] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010f), lun(0) [Thu Dec 12 07:46:39 2019][ 17.192094] scsi 1:0:111:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:39 2019][ 17.200488] scsi 1:0:111:0: SSP: handle(0x010f), sas_addr(0x5000cca2525ca199), phy(48), device_name(0x5000cca2525ca19b) [Thu Dec 12 07:46:39 2019][ 17.211258] scsi 1:0:111:0: enclosure logical id(0x5000ccab04037180), slot(57) [Thu Dec 12 07:46:39 2019][ 17.218565] scsi 1:0:111:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:39 2019][ 17.225456] scsi 1:0:111:0: serial_number( 7SHMY83W) [Thu Dec 12 07:46:39 2019][ 17.231031] scsi 1:0:111:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:39 2019][ 17.250842] mpt3sas_cm0: detecting: handle(0x0110), sas_address(0x5000cca2525ffb89), phy(49) [Thu Dec 12 07:46:39 2019][ 17.259279] mpt3sas_cm0: REPORT_LUNS: handle(0x0110), retries(0) [Thu Dec 12 07:46:39 2019][ 17.265420] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0110), lun(0) [Thu Dec 12 07:46:39 2019][ 17.272035] scsi 1:0:112:0: Direct-Access HGST 
HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:39 2019][ 17.280425] scsi 1:0:112:0: SSP: handle(0x0110), sas_addr(0x5000cca2525ffb89), phy(49), device_name(0x5000cca2525ffb8b) [Thu Dec 12 07:46:40 2019][ 17.291201] scsi 1:0:112:0: enclosure logical id(0x5000ccab04037180), slot(58) [Thu Dec 12 07:46:40 2019][ 17.298506] scsi 1:0:112:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:40 2019][ 17.305401] scsi 1:0:112:0: serial_number( 7SHPTDAW) [Thu Dec 12 07:46:40 2019][ 17.310972] scsi 1:0:112:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:40 2019][ 17.330841] mpt3sas_cm0: detecting: handle(0x0111), sas_address(0x5000cca2525f2669), phy(50) [Thu Dec 12 07:46:40 2019][ 17.339281] mpt3sas_cm0: REPORT_LUNS: handle(0x0111), retries(0) [Thu Dec 12 07:46:40 2019][ 17.345445] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0111), lun(0) [Thu Dec 12 07:46:40 2019][ 17.352099] scsi 1:0:113:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:40 2019][ 17.360489] scsi 1:0:113:0: SSP: handle(0x0111), sas_addr(0x5000cca2525f2669), phy(50), device_name(0x5000cca2525f266b) [Thu Dec 12 07:46:40 2019][ 17.371265] scsi 1:0:113:0: enclosure logical id(0x5000ccab04037180), slot(59) [Thu Dec 12 07:46:40 2019][ 17.378569] scsi 1:0:113:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:40 2019][ 17.385460] scsi 1:0:113:0: serial_number( 7SHPA6AW) [Thu Dec 12 07:46:40 2019][ 17.391036] scsi 1:0:113:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:40 2019][ 17.413334] mpt3sas_cm0: expander_add: handle(0x00dd), parent(0x00da), sas_addr(0x5000ccab040371ff), phys(68) [Thu Dec 12 07:46:40 2019][ 17.433949] mpt3sas_cm0: detecting: handle(0x0112), sas_address(0x5000cca2525eacc1), phy(42) [Thu Dec 12 07:46:40 2019][ 17.442407] mpt3sas_cm0: REPORT_LUNS: handle(0x0112), retries(0) [Thu Dec 12 07:46:40 2019][ 17.448533] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0112), lun(0) [Thu Dec 12 07:46:40 2019][ 17.455199] scsi 1:0:114:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:40 2019][ 17.463594] scsi 1:0:114:0: SSP: handle(0x0112), sas_addr(0x5000cca2525eacc1), phy(42), device_name(0x5000cca2525eacc3) [Thu Dec 12 07:46:40 2019][ 17.474362] scsi 1:0:114:0: enclosure logical id(0x5000ccab04037180), slot(1) [Thu Dec 12 07:46:40 2019][ 17.481583] scsi 1:0:114:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:40 2019][ 17.488476] scsi 1:0:114:0: serial_number( 7SHP235W) [Thu Dec 12 07:46:40 2019][ 17.494048] scsi 1:0:114:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:40 2019][ 17.516849] mpt3sas_cm0: detecting: handle(0x0113), sas_address(0x5000cca2525f8151), phy(43) [Thu Dec 12 07:46:40 2019][ 17.525288] mpt3sas_cm0: REPORT_LUNS: handle(0x0113), retries(0) [Thu Dec 12 07:46:40 2019][ 17.531429] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0113), lun(0) [Thu Dec 12 07:46:40 2019][ 17.538057] scsi 1:0:115:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:40 2019][ 17.546443] scsi 1:0:115:0: SSP: handle(0x0113), sas_addr(0x5000cca2525f8151), phy(43), device_name(0x5000cca2525f8153) [Thu Dec 12 07:46:40 2019][ 17.557217] scsi 1:0:115:0: enclosure logical id(0x5000ccab04037180), slot(3) [Thu Dec 12 07:46:40 2019][ 17.564436] scsi 1:0:115:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:40 2019][ 17.571336] scsi 1:0:115:0: serial_number( 7SHPJ80W) [Thu Dec 12 07:46:40 
2019][ 17.576914] scsi 1:0:115:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:40 2019][ 17.597438] mpt3sas_cm0: detecting: handle(0x0114), sas_address(0x5000cca2525ef839), phy(44) [Thu Dec 12 07:46:40 2019][ 17.605880] mpt3sas_cm0: REPORT_LUNS: handle(0x0114), retries(0) [Thu Dec 12 07:46:40 2019][ 17.612030] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0114), lun(0) [Thu Dec 12 07:46:40 2019][ 17.618683] scsi 1:0:116:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:40 2019][ 17.627082] scsi 1:0:116:0: SSP: handle(0x0114), sas_addr(0x5000cca2525ef839), phy(44), device_name(0x5000cca2525ef83b) [Thu Dec 12 07:46:40 2019][ 17.637853] scsi 1:0:116:0: enclosure logical id(0x5000ccab04037180), slot(4) [Thu Dec 12 07:46:40 2019][ 17.645072] scsi 1:0:116:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:40 2019][ 17.651963] scsi 1:0:116:0: serial_number( 7SHP73ZW) [Thu Dec 12 07:46:40 2019][ 17.657538] scsi 1:0:116:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:40 2019][ 17.677851] mpt3sas_cm0: detecting: handle(0x0115), sas_address(0x5000cca2525e72a9), phy(45) [Thu Dec 12 07:46:40 2019][ 17.686290] mpt3sas_cm0: REPORT_LUNS: handle(0x0115), retries(0) [Thu Dec 12 07:46:40 2019][ 17.692434] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0115), lun(0) [Thu Dec 12 07:46:40 2019][ 17.699073] scsi 1:0:117:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:40 2019][ 17.707473] scsi 1:0:117:0: SSP: handle(0x0115), sas_addr(0x5000cca2525e72a9), phy(45), device_name(0x5000cca2525e72ab) [Thu Dec 12 07:46:40 2019][ 17.718247] scsi 1:0:117:0: enclosure logical id(0x5000ccab04037180), slot(5) [Thu Dec 12 07:46:40 2019][ 17.725466] scsi 1:0:117:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:40 2019][ 17.732357] scsi 1:0:117:0: serial_number( 7SHNY77W) [Thu Dec 12 07:46:40 2019][ 17.737930] scsi 1:0:117:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:40 2019][ 17.757846] mpt3sas_cm0: detecting: handle(0x0116), sas_address(0x5000cca2525d3c89), phy(46) [Thu Dec 12 07:46:40 2019][ 17.766283] mpt3sas_cm0: REPORT_LUNS: handle(0x0116), retries(0) [Thu Dec 12 07:46:40 2019][ 17.772424] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0116), lun(0) [Thu Dec 12 07:46:40 2019][ 17.779044] scsi 1:0:118:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:40 2019][ 17.787428] scsi 1:0:118:0: SSP: handle(0x0116), sas_addr(0x5000cca2525d3c89), phy(46), device_name(0x5000cca2525d3c8b) [Thu Dec 12 07:46:40 2019][ 17.798197] scsi 1:0:118:0: enclosure logical id(0x5000ccab04037180), slot(6) [Thu Dec 12 07:46:40 2019][ 17.805417] scsi 1:0:118:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:40 2019][ 17.812308] scsi 1:0:118:0: serial_number( 7SHN8KZW) [Thu Dec 12 07:46:40 2019][ 17.817882] scsi 1:0:118:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:40 2019][ 17.837847] mpt3sas_cm0: detecting: handle(0x0117), sas_address(0x5000cca2525fae0d), phy(47) [Thu Dec 12 07:46:40 2019][ 17.846286] mpt3sas_cm0: REPORT_LUNS: handle(0x0117), retries(0) [Thu Dec 12 07:46:40 2019][ 17.852438] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0117), lun(0) [Thu Dec 12 07:46:40 2019][ 17.859121] scsi 1:0:119:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:40 2019][ 17.867521] scsi 1:0:119:0: SSP: handle(0x0117), sas_addr(0x5000cca2525fae0d), 
phy(47), device_name(0x5000cca2525fae0f) [Thu Dec 12 07:46:40 2019][ 17.878295] scsi 1:0:119:0: enclosure logical id(0x5000ccab04037180), slot(7) [Thu Dec 12 07:46:40 2019][ 17.885514] scsi 1:0:119:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:40 2019][ 17.892407] scsi 1:0:119:0: serial_number( 7SHPM7BW) [Thu Dec 12 07:46:40 2019][ 17.897980] scsi 1:0:119:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:40 2019][ 17.917848] mpt3sas_cm0: detecting: handle(0x0118), sas_address(0x5000cca2525efdad), phy(48) [Thu Dec 12 07:46:40 2019][ 17.926289] mpt3sas_cm0: REPORT_LUNS: handle(0x0118), retries(0) [Thu Dec 12 07:46:40 2019][ 17.932437] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0118), lun(0) [Thu Dec 12 07:46:40 2019][ 17.939078] scsi 1:0:120:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:40 2019][ 17.947473] scsi 1:0:120:0: SSP: handle(0x0118), sas_addr(0x5000cca2525efdad), phy(48), device_name(0x5000cca2525efdaf) [Thu Dec 12 07:46:40 2019][ 17.958245] scsi 1:0:120:0: enclosure logical id(0x5000ccab04037180), slot(8) [Thu Dec 12 07:46:40 2019][ 17.965466] scsi 1:0:120:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:40 2019][ 17.972359] scsi 1:0:120:0: serial_number( 7SHP7H7W) [Thu Dec 12 07:46:40 2019][ 17.977931] scsi 1:0:120:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:40 2019][ 17.997853] mpt3sas_cm0: detecting: handle(0x0119), sas_address(0x5000cca2525fa301), phy(49) [Thu Dec 12 07:46:40 2019][ 18.006294] mpt3sas_cm0: REPORT_LUNS: handle(0x0119), retries(0) [Thu Dec 12 07:46:40 2019][ 18.012435] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0119), lun(0) [Thu Dec 12 07:46:40 2019][ 18.019086] scsi 1:0:121:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:40 2019][ 18.027483] scsi 1:0:121:0: SSP: handle(0x0119), sas_addr(0x5000cca2525fa301), phy(49), device_name(0x5000cca2525fa303) [Thu Dec 12 07:46:40 2019][ 18.038257] scsi 1:0:121:0: enclosure logical id(0x5000ccab04037180), slot(9) [Thu Dec 12 07:46:40 2019][ 18.045477] scsi 1:0:121:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:40 2019][ 18.052368] scsi 1:0:121:0: serial_number( 7SHPLHKW) [Thu Dec 12 07:46:40 2019][ 18.057943] scsi 1:0:121:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:40 2019][ 18.077853] mpt3sas_cm0: detecting: handle(0x011a), sas_address(0x5000cca2525fb4bd), phy(50) [Thu Dec 12 07:46:40 2019][ 18.086294] mpt3sas_cm0: REPORT_LUNS: handle(0x011a), retries(0) [Thu Dec 12 07:46:40 2019][ 18.092433] mpt3sas_cm0: TEST_UNIT_READY: handle(0x011a), lun(0) [Thu Dec 12 07:46:40 2019][ 18.099063] scsi 1:0:122:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:40 2019][ 18.107450] scsi 1:0:122:0: SSP: handle(0x011a), sas_addr(0x5000cca2525fb4bd), phy(50), device_name(0x5000cca2525fb4bf) [Thu Dec 12 07:46:40 2019][ 18.118226] scsi 1:0:122:0: enclosure logical id(0x5000ccab04037180), slot(10) [Thu Dec 12 07:46:40 2019][ 18.125532] scsi 1:0:122:0: enclosure level(0x0000), connector name( C2 ) [Thu Dec 12 07:46:40 2019][ 18.132425] scsi 1:0:122:0: serial_number( 7SHPMP5W) [Thu Dec 12 07:46:40 2019][ 18.137996] scsi 1:0:122:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:40 2019][ 18.160370] mpt3sas_cm0: expander_add: handle(0x0017), parent(0x0009), sas_addr(0x5000ccab0405db7d), phys(49) [Thu Dec 12 07:46:40 2019][ 18.181103] 
mpt3sas_cm0: detecting: handle(0x001b), sas_address(0x5000ccab0405db7c), phy(48) [Thu Dec 12 07:46:40 2019][ 18.189542] mpt3sas_cm0: REPORT_LUNS: handle(0x001b), retries(0) [Thu Dec 12 07:46:40 2019][ 18.195936] mpt3sas_cm0: TEST_UNIT_READY: handle(0x001b), lun(0) [Thu Dec 12 07:46:40 2019][ 18.202822] scsi 1:0:123:0: Enclosure HGST H4060-J 2033 PQ: 0 ANSI: 6 [Thu Dec 12 07:46:40 2019][ 18.211390] scsi 1:0:123:0: set ignore_delay_remove for handle(0x001b) [Thu Dec 12 07:46:40 2019][ 18.217919] scsi 1:0:123:0: SES: handle(0x001b), sas_addr(0x5000ccab0405db7c), phy(48), device_name(0x0000000000000000) [Thu Dec 12 07:46:40 2019][ 18.228690] scsi 1:0:123:0: enclosure logical id(0x5000ccab0405db00), slot(60) [Thu Dec 12 07:46:40 2019][ 18.235996] scsi 1:0:123:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:40 2019][ 18.242889] scsi 1:0:123:0: serial_number(USWSJ03918EZ0069 ) [Thu Dec 12 07:46:40 2019][ 18.248811] scsi 1:0:123:0: qdepth(1), tagged(0), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:40 2019][ 18.270291] mpt3sas_cm0: expander_add: handle(0x0019), parent(0x0017), sas_addr(0x5000ccab0405db79), phys(68) [Thu Dec 12 07:46:41 2019][ 18.291599] mpt3sas_cm0: detecting: handle(0x001c), sas_address(0x5000cca252550a76), phy(0) [Thu Dec 12 07:46:41 2019][ 18.299950] mpt3sas_cm0: REPORT_LUNS: handle(0x001c), retries(0) [Thu Dec 12 07:46:41 2019][ 18.306106] mpt3sas_cm0: TEST_UNIT_READY: handle(0x001c), lun(0) [Thu Dec 12 07:46:41 2019][ 18.312735] scsi 1:0:124:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:41 2019][ 18.321123] scsi 1:0:124:0: SSP: handle(0x001c), sas_addr(0x5000cca252550a76), phy(0), device_name(0x5000cca252550a77) [Thu Dec 12 07:46:41 2019][ 18.331810] scsi 1:0:124:0: enclosure logical id(0x5000ccab0405db00), slot(0) [Thu Dec 12 07:46:41 2019][ 18.339028] scsi 1:0:124:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:41 2019][ 18.345919] scsi 1:0:124:0: serial_number( 7SHHSVGG) [Thu Dec 12 07:46:41 2019][ 18.351494] scsi 1:0:124:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:41 2019][ 18.373842] mpt3sas_cm0: detecting: handle(0x001d), sas_address(0x5000cca25253eb32), phy(1) [Thu Dec 12 07:46:41 2019][ 18.382196] mpt3sas_cm0: REPORT_LUNS: handle(0x001d), retries(0) [Thu Dec 12 07:46:41 2019][ 18.388334] mpt3sas_cm0: TEST_UNIT_READY: handle(0x001d), lun(0) [Thu Dec 12 07:46:41 2019][ 18.394982] scsi 1:0:125:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:41 2019][ 18.403370] scsi 1:0:125:0: SSP: handle(0x001d), sas_addr(0x5000cca25253eb32), phy(1), device_name(0x5000cca25253eb33) [Thu Dec 12 07:46:41 2019][ 18.414055] scsi 1:0:125:0: enclosure logical id(0x5000ccab0405db00), slot(2) [Thu Dec 12 07:46:41 2019][ 18.421274] scsi 1:0:125:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:41 2019][ 18.428166] scsi 1:0:125:0: serial_number( 7SHH4RDG) [Thu Dec 12 07:46:41 2019][ 18.433739] scsi 1:0:125:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:41 2019][ 18.453847] mpt3sas_cm0: detecting: handle(0x001e), sas_address(0x5000cca26b950bb6), phy(2) [Thu Dec 12 07:46:41 2019][ 18.462199] mpt3sas_cm0: REPORT_LUNS: handle(0x001e), retries(0) [Thu Dec 12 07:46:41 2019][ 18.468340] mpt3sas_cm0: TEST_UNIT_READY: handle(0x001e), lun(0) [Thu Dec 12 07:46:41 2019][ 18.474960] scsi 1:0:126:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:41 2019][ 
18.483346] scsi 1:0:126:0: SSP: handle(0x001e), sas_addr(0x5000cca26b950bb6), phy(2), device_name(0x5000cca26b950bb7) [Thu Dec 12 07:46:41 2019][ 18.494032] scsi 1:0:126:0: enclosure logical id(0x5000ccab0405db00), slot(11) [Thu Dec 12 07:46:41 2019][ 18.501339] scsi 1:0:126:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:41 2019][ 18.508232] scsi 1:0:126:0: serial_number( 1SJMZ22Z) [Thu Dec 12 07:46:41 2019][ 18.513806] scsi 1:0:126:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:41 2019][ 18.533842] mpt3sas_cm0: detecting: handle(0x001f), sas_address(0x5000cca25253f3be), phy(3) [Thu Dec 12 07:46:41 2019][ 18.542192] mpt3sas_cm0: REPORT_LUNS: handle(0x001f), retries(0) [Thu Dec 12 07:46:41 2019][ 18.548341] mpt3sas_cm0: TEST_UNIT_READY: handle(0x001f), lun(0) [Thu Dec 12 07:46:41 2019][ 18.554991] scsi 1:0:127:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:41 2019][ 18.563402] scsi 1:0:127:0: SSP: handle(0x001f), sas_addr(0x5000cca25253f3be), phy(3), device_name(0x5000cca25253f3bf) [Thu Dec 12 07:46:41 2019][ 18.574088] scsi 1:0:127:0: enclosure logical id(0x5000ccab0405db00), slot(12) [Thu Dec 12 07:46:41 2019][ 18.581394] scsi 1:0:127:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:41 2019][ 18.588289] scsi 1:0:127:0: serial_number( 7SHH591G) [Thu Dec 12 07:46:41 2019][ 18.593869] scsi 1:0:127:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:41 2019][ 18.613839] mpt3sas_cm0: detecting: handle(0x0020), sas_address(0x5000cca26a2ac3da), phy(4) [Thu Dec 12 07:46:41 2019][ 18.622196] mpt3sas_cm0: REPORT_LUNS: handle(0x0020), retries(0) [Thu Dec 12 07:46:41 2019][ 18.628332] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0020), lun(0) [Thu Dec 12 07:46:41 2019][ 18.634966] scsi 1:0:128:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:41 2019][ 18.643353] scsi 1:0:128:0: SSP: handle(0x0020), sas_addr(0x5000cca26a2ac3da), phy(4), device_name(0x5000cca26a2ac3db) [Thu Dec 12 07:46:41 2019][ 18.654039] scsi 1:0:128:0: enclosure logical id(0x5000ccab0405db00), slot(13) [Thu Dec 12 07:46:41 2019][ 18.661345] scsi 1:0:128:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:41 2019][ 18.668237] scsi 1:0:128:0: serial_number( 2TGSJ30D) [Thu Dec 12 07:46:41 2019][ 18.673809] scsi 1:0:128:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:41 2019][ 18.693847] mpt3sas_cm0: detecting: handle(0x0021), sas_address(0x5000cca25254102a), phy(5) [Thu Dec 12 07:46:41 2019][ 18.702198] mpt3sas_cm0: REPORT_LUNS: handle(0x0021), retries(0) [Thu Dec 12 07:46:41 2019][ 18.708370] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0021), lun(0) [Thu Dec 12 07:46:41 2019][ 18.725018] scsi 1:0:129:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:41 2019][ 18.733397] scsi 1:0:129:0: SSP: handle(0x0021), sas_addr(0x5000cca25254102a), phy(5), device_name(0x5000cca25254102b) [Thu Dec 12 07:46:41 2019][ 18.744086] scsi 1:0:129:0: enclosure logical id(0x5000ccab0405db00), slot(14) [Thu Dec 12 07:46:41 2019][ 18.751392] scsi 1:0:129:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:41 2019][ 18.758268] scsi 1:0:129:0: serial_number( 7SHH75RG) [Thu Dec 12 07:46:41 2019][ 18.763841] scsi 1:0:129:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:41 2019][ 18.783843] mpt3sas_cm0: detecting: handle(0x0022), 
sas_address(0x5000cca25254534a), phy(6) [Thu Dec 12 07:46:41 2019][ 18.792193] mpt3sas_cm0: REPORT_LUNS: handle(0x0022), retries(0) [Thu Dec 12 07:46:41 2019][ 18.798331] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0022), lun(0) [Thu Dec 12 07:46:41 2019][ 18.804963] scsi 1:0:130:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:41 2019][ 18.813351] scsi 1:0:130:0: SSP: handle(0x0022), sas_addr(0x5000cca25254534a), phy(6), device_name(0x5000cca25254534b) [Thu Dec 12 07:46:41 2019][ 18.824035] scsi 1:0:130:0: enclosure logical id(0x5000ccab0405db00), slot(15) [Thu Dec 12 07:46:41 2019][ 18.831342] scsi 1:0:130:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:41 2019][ 18.838233] scsi 1:0:130:0: serial_number( 7SHHBN9G) [Thu Dec 12 07:46:41 2019][ 18.843808] scsi 1:0:130:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:41 2019][ 18.863848] mpt3sas_cm0: detecting: handle(0x0023), sas_address(0x5000cca2525430c6), phy(7) [Thu Dec 12 07:46:41 2019][ 18.872205] mpt3sas_cm0: REPORT_LUNS: handle(0x0023), retries(0) [Thu Dec 12 07:46:41 2019][ 18.878382] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0023), lun(0) [Thu Dec 12 07:46:41 2019][ 18.885024] scsi 1:0:131:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:41 2019][ 18.893415] scsi 1:0:131:0: SSP: handle(0x0023), sas_addr(0x5000cca2525430c6), phy(7), device_name(0x5000cca2525430c7) [Thu Dec 12 07:46:41 2019][ 18.904099] scsi 1:0:131:0: enclosure logical id(0x5000ccab0405db00), slot(16) [Thu Dec 12 07:46:41 2019][ 18.911406] scsi 1:0:131:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:41 2019][ 18.918298] scsi 1:0:131:0: serial_number( 7SHH9B1G) [Thu Dec 12 07:46:41 2019][ 18.923872] scsi 1:0:131:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:41 2019][ 18.943847] mpt3sas_cm0: detecting: handle(0x0024), sas_address(0x5000cca25254385e), phy(8) [Thu Dec 12 07:46:41 2019][ 18.952199] mpt3sas_cm0: REPORT_LUNS: handle(0x0024), retries(0) [Thu Dec 12 07:46:41 2019][ 18.958372] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0024), lun(0) [Thu Dec 12 07:46:41 2019][ 18.964991] scsi 1:0:132:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:41 2019][ 18.973380] scsi 1:0:132:0: SSP: handle(0x0024), sas_addr(0x5000cca25254385e), phy(8), device_name(0x5000cca25254385f) [Thu Dec 12 07:46:41 2019][ 18.984067] scsi 1:0:132:0: enclosure logical id(0x5000ccab0405db00), slot(17) [Thu Dec 12 07:46:41 2019][ 18.991375] scsi 1:0:132:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:41 2019][ 18.998265] scsi 1:0:132:0: serial_number( 7SHH9VRG) [Thu Dec 12 07:46:41 2019][ 19.003840] scsi 1:0:132:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:41 2019][ 19.023842] mpt3sas_cm0: detecting: handle(0x0025), sas_address(0x5000cca25253f30e), phy(9) [Thu Dec 12 07:46:41 2019][ 19.032193] mpt3sas_cm0: REPORT_LUNS: handle(0x0025), retries(0) [Thu Dec 12 07:46:41 2019][ 19.038333] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0025), lun(0) [Thu Dec 12 07:46:41 2019][ 19.044946] scsi 1:0:133:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:41 2019][ 19.053331] scsi 1:0:133:0: SSP: handle(0x0025), sas_addr(0x5000cca25253f30e), phy(9), device_name(0x5000cca25253f30f) [Thu Dec 12 07:46:41 2019][ 19.064020] scsi 1:0:133:0: enclosure logical id(0x5000ccab0405db00), slot(18) [Thu Dec 12 07:46:41 2019][ 19.071326] scsi 
1:0:133:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:41 2019][ 19.078217] scsi 1:0:133:0: serial_number( 7SHH57MG) [Thu Dec 12 07:46:41 2019][ 19.083790] scsi 1:0:133:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:41 2019][ 19.113846] mpt3sas_cm0: detecting: handle(0x0026), sas_address(0x5000cca252545f66), phy(10) [Thu Dec 12 07:46:41 2019][ 19.122280] mpt3sas_cm0: REPORT_LUNS: handle(0x0026), retries(0) [Thu Dec 12 07:46:41 2019][ 19.128418] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0026), lun(0) [Thu Dec 12 07:46:41 2019][ 19.135069] scsi 1:0:134:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:41 2019][ 19.143473] scsi 1:0:134:0: SSP: handle(0x0026), sas_addr(0x5000cca252545f66), phy(10), device_name(0x5000cca252545f67) [Thu Dec 12 07:46:41 2019][ 19.154249] scsi 1:0:134:0: enclosure logical id(0x5000ccab0405db00), slot(19) [Thu Dec 12 07:46:41 2019][ 19.161555] scsi 1:0:134:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:41 2019][ 19.168446] scsi 1:0:134:0: serial_number( 7SHHDG9G) [Thu Dec 12 07:46:41 2019][ 19.174019] scsi 1:0:134:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:41 2019][ 19.196847] mpt3sas_cm0: detecting: handle(0x0027), sas_address(0x5000cca266daa4e6), phy(11) [Thu Dec 12 07:46:41 2019][ 19.205286] mpt3sas_cm0: REPORT_LUNS: handle(0x0027), retries(0) [Thu Dec 12 07:46:41 2019][ 19.211421] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0027), lun(0) [Thu Dec 12 07:46:41 2019][ 19.218033] scsi 1:0:135:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:41 2019][ 19.226424] scsi 1:0:135:0: SSP: handle(0x0027), sas_addr(0x5000cca266daa4e6), phy(11), device_name(0x5000cca266daa4e7) [Thu Dec 12 07:46:41 2019][ 19.237198] scsi 1:0:135:0: enclosure logical id(0x5000ccab0405db00), slot(20) [Thu Dec 12 07:46:41 2019][ 19.244504] scsi 1:0:135:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:41 2019][ 19.251396] scsi 1:0:135:0: serial_number( 7JKW7MYK) [Thu Dec 12 07:46:41 2019][ 19.256971] scsi 1:0:135:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:41 2019][ 19.276844] mpt3sas_cm0: detecting: handle(0x0028), sas_address(0x5000cca26a25167e), phy(12) [Thu Dec 12 07:46:41 2019][ 19.285278] mpt3sas_cm0: REPORT_LUNS: handle(0x0028), retries(0) [Thu Dec 12 07:46:41 2019][ 19.291447] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0028), lun(0) [Thu Dec 12 07:46:41 2019][ 19.298079] scsi 1:0:136:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:42 2019][ 19.306470] scsi 1:0:136:0: SSP: handle(0x0028), sas_addr(0x5000cca26a25167e), phy(12), device_name(0x5000cca26a25167f) [Thu Dec 12 07:46:42 2019][ 19.317244] scsi 1:0:136:0: enclosure logical id(0x5000ccab0405db00), slot(21) [Thu Dec 12 07:46:42 2019][ 19.324549] scsi 1:0:136:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:42 2019][ 19.331442] scsi 1:0:136:0: serial_number( 2TGND9JD) [Thu Dec 12 07:46:42 2019][ 19.337016] scsi 1:0:136:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:42 2019][ 19.356849] mpt3sas_cm0: detecting: handle(0x0029), sas_address(0x5000cca25253edaa), phy(13) [Thu Dec 12 07:46:42 2019][ 19.365287] mpt3sas_cm0: REPORT_LUNS: handle(0x0029), retries(0) [Thu Dec 12 07:46:42 2019][ 19.371453] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0029), lun(0) [Thu Dec 12 07:46:42 2019][ 19.378098] scsi 1:0:137:0: 
Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:42 2019][ 19.386498] scsi 1:0:137:0: SSP: handle(0x0029), sas_addr(0x5000cca25253edaa), phy(13), device_name(0x5000cca25253edab) [Thu Dec 12 07:46:42 2019][ 19.397272] scsi 1:0:137:0: enclosure logical id(0x5000ccab0405db00), slot(22) [Thu Dec 12 07:46:42 2019][ 19.404578] scsi 1:0:137:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:42 2019][ 19.411472] scsi 1:0:137:0: serial_number( 7SHH4WHG) [Thu Dec 12 07:46:42 2019][ 19.417044] scsi 1:0:137:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:42 2019][ 19.439849] mpt3sas_cm0: detecting: handle(0x002a), sas_address(0x5000cca266d491a2), phy(14) [Thu Dec 12 07:46:42 2019][ 19.448292] mpt3sas_cm0: REPORT_LUNS: handle(0x002a), retries(0) [Thu Dec 12 07:46:42 2019][ 19.454444] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002a), lun(0) [Thu Dec 12 07:46:42 2019][ 19.461100] scsi 1:0:138:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:42 2019][ 19.469506] scsi 1:0:138:0: SSP: handle(0x002a), sas_addr(0x5000cca266d491a2), phy(14), device_name(0x5000cca266d491a3) [Thu Dec 12 07:46:42 2019][ 19.480274] scsi 1:0:138:0: enclosure logical id(0x5000ccab0405db00), slot(23) [Thu Dec 12 07:46:42 2019][ 19.487580] scsi 1:0:138:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:42 2019][ 19.494472] scsi 1:0:138:0: serial_number( 7JKSX22K) [Thu Dec 12 07:46:42 2019][ 19.500046] scsi 1:0:138:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:42 2019][ 19.519849] mpt3sas_cm0: detecting: handle(0x002b), sas_address(0x5000cca26b9a709a), phy(15) [Thu Dec 12 07:46:42 2019][ 19.528286] mpt3sas_cm0: REPORT_LUNS: handle(0x002b), retries(0) [Thu Dec 12 07:46:42 2019][ 19.534458] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002b), lun(0) [Thu Dec 12 07:46:42 2019][ 19.541101] scsi 1:0:139:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:42 2019][ 19.549495] scsi 1:0:139:0: SSP: handle(0x002b), sas_addr(0x5000cca26b9a709a), phy(15), device_name(0x5000cca26b9a709b) [Thu Dec 12 07:46:42 2019][ 19.560269] scsi 1:0:139:0: enclosure logical id(0x5000ccab0405db00), slot(24) [Thu Dec 12 07:46:42 2019][ 19.567574] scsi 1:0:139:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:42 2019][ 19.574471] scsi 1:0:139:0: serial_number( 1SJRY0YZ) [Thu Dec 12 07:46:42 2019][ 19.580051] scsi 1:0:139:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:42 2019][ 19.602851] mpt3sas_cm0: detecting: handle(0x002c), sas_address(0x5000cca25253f832), phy(16) [Thu Dec 12 07:46:42 2019][ 19.611288] mpt3sas_cm0: REPORT_LUNS: handle(0x002c), retries(0) [Thu Dec 12 07:46:42 2019][ 19.617434] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002c), lun(0) [Thu Dec 12 07:46:42 2019][ 19.624066] scsi 1:0:140:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:42 2019][ 19.632452] scsi 1:0:140:0: SSP: handle(0x002c), sas_addr(0x5000cca25253f832), phy(16), device_name(0x5000cca25253f833) [Thu Dec 12 07:46:42 2019][ 19.643227] scsi 1:0:140:0: enclosure logical id(0x5000ccab0405db00), slot(25) [Thu Dec 12 07:46:42 2019][ 19.650533] scsi 1:0:140:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:42 2019][ 19.657425] scsi 1:0:140:0: serial_number( 7SHH5L7G) [Thu Dec 12 07:46:42 2019][ 19.663000] scsi 1:0:140:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 
07:46:42 2019][ 19.682850] mpt3sas_cm0: detecting: handle(0x002d), sas_address(0x5000cca26a2ab23e), phy(17) [Thu Dec 12 07:46:42 2019][ 19.691290] mpt3sas_cm0: REPORT_LUNS: handle(0x002d), retries(0) [Thu Dec 12 07:46:42 2019][ 19.697461] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002d), lun(0) [Thu Dec 12 07:46:42 2019][ 19.704095] scsi 1:0:141:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:42 2019][ 19.712483] scsi 1:0:141:0: SSP: handle(0x002d), sas_addr(0x5000cca26a2ab23e), phy(17), device_name(0x5000cca26a2ab23f) [Thu Dec 12 07:46:42 2019][ 19.723255] scsi 1:0:141:0: enclosure logical id(0x5000ccab0405db00), slot(26) [Thu Dec 12 07:46:42 2019][ 19.730562] scsi 1:0:141:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:42 2019][ 19.737453] scsi 1:0:141:0: serial_number( 2TGSGXND) [Thu Dec 12 07:46:42 2019][ 19.743027] scsi 1:0:141:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:42 2019][ 19.762851] mpt3sas_cm0: detecting: handle(0x002e), sas_address(0x5000cca26b9b9696), phy(18) [Thu Dec 12 07:46:42 2019][ 19.771295] mpt3sas_cm0: REPORT_LUNS: handle(0x002e), retries(0) [Thu Dec 12 07:46:42 2019][ 19.777465] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002e), lun(0) [Thu Dec 12 07:46:42 2019][ 19.789515] scsi 1:0:142:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:42 2019][ 19.797891] scsi 1:0:142:0: SSP: handle(0x002e), sas_addr(0x5000cca26b9b9696), phy(18), device_name(0x5000cca26b9b9697) [Thu Dec 12 07:46:42 2019][ 19.808667] scsi 1:0:142:0: enclosure logical id(0x5000ccab0405db00), slot(27) [Thu Dec 12 07:46:42 2019][ 19.815971] scsi 1:0:142:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:42 2019][ 19.822849] scsi 1:0:142:0: serial_number( 1SJSKLWZ) [Thu Dec 12 07:46:42 2019][ 19.828420] scsi 1:0:142:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:42 2019][ 19.850852] mpt3sas_cm0: detecting: handle(0x002f), sas_address(0x5000cca252559472), phy(19) [Thu Dec 12 07:46:42 2019][ 19.859296] mpt3sas_cm0: REPORT_LUNS: handle(0x002f), retries(0) [Thu Dec 12 07:46:42 2019][ 19.865470] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002f), lun(0) [Thu Dec 12 07:46:42 2019][ 19.872087] scsi 1:0:143:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:42 2019][ 19.880476] scsi 1:0:143:0: SSP: handle(0x002f), sas_addr(0x5000cca252559472), phy(19), device_name(0x5000cca252559473) [Thu Dec 12 07:46:42 2019][ 19.891252] scsi 1:0:143:0: enclosure logical id(0x5000ccab0405db00), slot(28) [Thu Dec 12 07:46:42 2019][ 19.898558] scsi 1:0:143:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:42 2019][ 19.905448] scsi 1:0:143:0: serial_number( 7SHJ21AG) [Thu Dec 12 07:46:42 2019][ 19.911024] scsi 1:0:143:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:42 2019][ 19.930849] mpt3sas_cm0: detecting: handle(0x0030), sas_address(0x5000cca25253f94e), phy(20) [Thu Dec 12 07:46:42 2019][ 19.939288] mpt3sas_cm0: REPORT_LUNS: handle(0x0030), retries(0) [Thu Dec 12 07:46:42 2019][ 19.945438] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0030), lun(0) [Thu Dec 12 07:46:42 2019][ 19.952060] scsi 1:0:144:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:42 2019][ 19.960443] scsi 1:0:144:0: SSP: handle(0x0030), sas_addr(0x5000cca25253f94e), phy(20), device_name(0x5000cca25253f94f) [Thu Dec 12 07:46:42 2019][ 19.971211] scsi 1:0:144:0: enclosure logical 
id(0x5000ccab0405db00), slot(29) [Thu Dec 12 07:46:42 2019][ 19.978516] scsi 1:0:144:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:42 2019][ 19.985409] scsi 1:0:144:0: serial_number( 7SHH5NJG) [Thu Dec 12 07:46:42 2019][ 19.990984] scsi 1:0:144:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:42 2019][ 20.010851] mpt3sas_cm0: detecting: handle(0x0031), sas_address(0x5000cca25253e69a), phy(21) [Thu Dec 12 07:46:42 2019][ 20.019291] mpt3sas_cm0: REPORT_LUNS: handle(0x0031), retries(0) [Thu Dec 12 07:46:42 2019][ 20.025429] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0031), lun(0) [Thu Dec 12 07:46:42 2019][ 20.032056] scsi 1:0:145:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:42 2019][ 20.040448] scsi 1:0:145:0: SSP: handle(0x0031), sas_addr(0x5000cca25253e69a), phy(21), device_name(0x5000cca25253e69b) [Thu Dec 12 07:46:42 2019][ 20.051221] scsi 1:0:145:0: enclosure logical id(0x5000ccab0405db00), slot(30) [Thu Dec 12 07:46:42 2019][ 20.058528] scsi 1:0:145:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:42 2019][ 20.065421] scsi 1:0:145:0: serial_number( 7SHH4DXG) [Thu Dec 12 07:46:42 2019][ 20.070996] scsi 1:0:145:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:42 2019][ 20.096444] mpt3sas_cm0: detecting: handle(0x0032), sas_address(0x5000cca252543cc2), phy(22) [Thu Dec 12 07:46:42 2019][ 20.104897] mpt3sas_cm0: REPORT_LUNS: handle(0x0032), retries(0) [Thu Dec 12 07:46:42 2019][ 20.111016] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0032), lun(0) [Thu Dec 12 07:46:42 2019][ 20.122903] scsi 1:0:146:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:42 2019][ 20.131279] scsi 1:0:146:0: SSP: handle(0x0032), sas_addr(0x5000cca252543cc2), phy(22), device_name(0x5000cca252543cc3) [Thu Dec 12 07:46:42 2019][ 20.142051] scsi 1:0:146:0: enclosure logical id(0x5000ccab0405db00), slot(31) [Thu Dec 12 07:46:42 2019][ 20.149356] scsi 1:0:146:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:42 2019][ 20.156233] scsi 1:0:146:0: serial_number( 7SHHA4TG) [Thu Dec 12 07:46:42 2019][ 20.161804] scsi 1:0:146:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:42 2019][ 20.181854] mpt3sas_cm0: detecting: handle(0x0033), sas_address(0x5000cca26a24fcde), phy(23) [Thu Dec 12 07:46:42 2019][ 20.190293] mpt3sas_cm0: REPORT_LUNS: handle(0x0033), retries(0) [Thu Dec 12 07:46:42 2019][ 20.196427] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0033), lun(0) [Thu Dec 12 07:46:42 2019][ 20.203130] scsi 1:0:147:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:42 2019][ 20.211526] scsi 1:0:147:0: SSP: handle(0x0033), sas_addr(0x5000cca26a24fcde), phy(23), device_name(0x5000cca26a24fcdf) [Thu Dec 12 07:46:42 2019][ 20.222296] scsi 1:0:147:0: enclosure logical id(0x5000ccab0405db00), slot(32) [Thu Dec 12 07:46:42 2019][ 20.229602] scsi 1:0:147:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:42 2019][ 20.236494] scsi 1:0:147:0: serial_number( 2TGNALMD) [Thu Dec 12 07:46:42 2019][ 20.242069] scsi 1:0:147:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:42 2019][ 20.264860] mpt3sas_cm0: detecting: handle(0x0034), sas_address(0x5000cca252543bce), phy(24) [Thu Dec 12 07:46:42 2019][ 20.273298] mpt3sas_cm0: REPORT_LUNS: handle(0x0034), retries(0) [Thu Dec 12 07:46:42 2019][ 20.279438] mpt3sas_cm0: TEST_UNIT_READY: 
handle(0x0034), lun(0) [Thu Dec 12 07:46:42 2019][ 20.286073] scsi 1:0:148:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:43 2019][ 20.294464] scsi 1:0:148:0: SSP: handle(0x0034), sas_addr(0x5000cca252543bce), phy(24), device_name(0x5000cca252543bcf) [Thu Dec 12 07:46:43 2019][ 20.305236] scsi 1:0:148:0: enclosure logical id(0x5000ccab0405db00), slot(33) [Thu Dec 12 07:46:43 2019][ 20.312541] scsi 1:0:148:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:43 2019][ 20.319433] scsi 1:0:148:0: serial_number( 7SHHA2UG) [Thu Dec 12 07:46:43 2019][ 20.325009] scsi 1:0:148:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:43 2019][ 20.344856] mpt3sas_cm0: detecting: handle(0x0035), sas_address(0x5000cca252551266), phy(25) [Thu Dec 12 07:46:43 2019][ 20.353300] mpt3sas_cm0: REPORT_LUNS: handle(0x0035), retries(0) [Thu Dec 12 07:46:43 2019][ 20.359433] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0035), lun(0) [Thu Dec 12 07:46:43 2019][ 20.366056] scsi 1:0:149:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:43 2019][ 20.374449] scsi 1:0:149:0: SSP: handle(0x0035), sas_addr(0x5000cca252551266), phy(25), device_name(0x5000cca252551267) [Thu Dec 12 07:46:43 2019][ 20.385222] scsi 1:0:149:0: enclosure logical id(0x5000ccab0405db00), slot(34) [Thu Dec 12 07:46:43 2019][ 20.392528] scsi 1:0:149:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:43 2019][ 20.399420] scsi 1:0:149:0: serial_number( 7SHHTBVG) [Thu Dec 12 07:46:43 2019][ 20.404993] scsi 1:0:149:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:43 2019][ 20.436858] mpt3sas_cm0: detecting: handle(0x0036), sas_address(0x5000cca252555fca), phy(26) [Thu Dec 12 07:46:43 2019][ 20.445296] mpt3sas_cm0: REPORT_LUNS: handle(0x0036), retries(0) [Thu Dec 12 07:46:43 2019][ 20.451437] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0036), lun(0) [Thu Dec 12 07:46:43 2019][ 20.458074] scsi 1:0:150:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:43 2019][ 20.466464] scsi 1:0:150:0: SSP: handle(0x0036), sas_addr(0x5000cca252555fca), phy(26), device_name(0x5000cca252555fcb) [Thu Dec 12 07:46:43 2019][ 20.477237] scsi 1:0:150:0: enclosure logical id(0x5000ccab0405db00), slot(35) [Thu Dec 12 07:46:43 2019][ 20.484543] scsi 1:0:150:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:43 2019][ 20.491435] scsi 1:0:150:0: serial_number( 7SHHYJMG) [Thu Dec 12 07:46:43 2019][ 20.497008] scsi 1:0:150:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:43 2019][ 20.519857] mpt3sas_cm0: detecting: handle(0x0037), sas_address(0x5000cca252559f7e), phy(27) [Thu Dec 12 07:46:43 2019][ 20.528299] mpt3sas_cm0: REPORT_LUNS: handle(0x0037), retries(0) [Thu Dec 12 07:46:43 2019][ 20.534468] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0037), lun(0) [Thu Dec 12 07:46:43 2019][ 20.541162] scsi 1:0:151:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:43 2019][ 20.549561] scsi 1:0:151:0: SSP: handle(0x0037), sas_addr(0x5000cca252559f7e), phy(27), device_name(0x5000cca252559f7f) [Thu Dec 12 07:46:43 2019][ 20.560334] scsi 1:0:151:0: enclosure logical id(0x5000ccab0405db00), slot(36) [Thu Dec 12 07:46:43 2019][ 20.567639] scsi 1:0:151:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:43 2019][ 20.574541] scsi 1:0:151:0: serial_number( 7SHJ2T4G) [Thu Dec 12 07:46:43 2019][ 20.580115] scsi 1:0:151:0: 
qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:43 2019][ 20.602863] mpt3sas_cm0: detecting: handle(0x0038), sas_address(0x5000cca26c244bce), phy(28) [Thu Dec 12 07:46:43 2019][ 20.611299] mpt3sas_cm0: REPORT_LUNS: handle(0x0038), retries(0) [Thu Dec 12 07:46:43 2019][ 20.617443] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0038), lun(0) [Thu Dec 12 07:46:43 2019][ 20.624066] scsi 1:0:152:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:43 2019][ 20.632458] scsi 1:0:152:0: SSP: handle(0x0038), sas_addr(0x5000cca26c244bce), phy(28), device_name(0x5000cca26c244bcf) [Thu Dec 12 07:46:43 2019][ 20.643229] scsi 1:0:152:0: enclosure logical id(0x5000ccab0405db00), slot(37) [Thu Dec 12 07:46:43 2019][ 20.650535] scsi 1:0:152:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:43 2019][ 20.657431] scsi 1:0:152:0: serial_number( 1DGMYU2Z) [Thu Dec 12 07:46:43 2019][ 20.663003] scsi 1:0:152:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:43 2019][ 20.682862] mpt3sas_cm0: detecting: handle(0x0039), sas_address(0x5000cca26a2aa10e), phy(29) [Thu Dec 12 07:46:43 2019][ 20.691303] mpt3sas_cm0: REPORT_LUNS: handle(0x0039), retries(0) [Thu Dec 12 07:46:43 2019][ 20.697477] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0039), lun(0) [Thu Dec 12 07:46:43 2019][ 20.704272] scsi 1:0:153:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:43 2019][ 20.712661] scsi 1:0:153:0: SSP: handle(0x0039), sas_addr(0x5000cca26a2aa10e), phy(29), device_name(0x5000cca26a2aa10f) [Thu Dec 12 07:46:43 2019][ 20.723433] scsi 1:0:153:0: enclosure logical id(0x5000ccab0405db00), slot(38) [Thu Dec 12 07:46:43 2019][ 20.730738] scsi 1:0:153:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:43 2019][ 20.737631] scsi 1:0:153:0: serial_number( 2TGSET5D) [Thu Dec 12 07:46:43 2019][ 20.743206] scsi 1:0:153:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:43 2019][ 20.765862] mpt3sas_cm0: detecting: handle(0x003a), sas_address(0x5000cca25254e236), phy(30) [Thu Dec 12 07:46:43 2019][ 20.774303] mpt3sas_cm0: REPORT_LUNS: handle(0x003a), retries(0) [Thu Dec 12 07:46:43 2019][ 20.780476] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003a), lun(0) [Thu Dec 12 07:46:43 2019][ 20.787088] scsi 1:0:154:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:43 2019][ 20.795477] scsi 1:0:154:0: SSP: handle(0x003a), sas_addr(0x5000cca25254e236), phy(30), device_name(0x5000cca25254e237) [Thu Dec 12 07:46:43 2019][ 20.806253] scsi 1:0:154:0: enclosure logical id(0x5000ccab0405db00), slot(39) [Thu Dec 12 07:46:43 2019][ 20.813559] scsi 1:0:154:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:43 2019][ 20.820449] scsi 1:0:154:0: serial_number( 7SHHP5BG) [Thu Dec 12 07:46:43 2019][ 20.826024] scsi 1:0:154:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:43 2019][ 20.857193] mpt3sas_cm0: detecting: handle(0x003b), sas_address(0x5000cca25254df96), phy(31) [Thu Dec 12 07:46:43 2019][ 20.865632] mpt3sas_cm0: REPORT_LUNS: handle(0x003b), retries(0) [Thu Dec 12 07:46:43 2019][ 20.871791] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003b), lun(0) [Thu Dec 12 07:46:43 2019][ 20.878401] scsi 1:0:155:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:43 2019][ 20.886794] scsi 1:0:155:0: SSP: handle(0x003b), sas_addr(0x5000cca25254df96), phy(31), 
device_name(0x5000cca25254df97) [Thu Dec 12 07:46:43 2019][ 20.897565] scsi 1:0:155:0: enclosure logical id(0x5000ccab0405db00), slot(40) [Thu Dec 12 07:46:43 2019][ 20.904872] scsi 1:0:155:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:43 2019][ 20.911761] scsi 1:0:155:0: serial_number( 7SHHNZYG) [Thu Dec 12 07:46:43 2019][ 20.917335] scsi 1:0:155:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:43 2019][ 20.939861] mpt3sas_cm0: detecting: handle(0x003c), sas_address(0x5000cca25254e9d2), phy(32) [Thu Dec 12 07:46:43 2019][ 20.948298] mpt3sas_cm0: REPORT_LUNS: handle(0x003c), retries(0) [Thu Dec 12 07:46:43 2019][ 20.954460] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003c), lun(0) [Thu Dec 12 07:46:43 2019][ 20.961086] scsi 1:0:156:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:43 2019][ 20.969468] scsi 1:0:156:0: SSP: handle(0x003c), sas_addr(0x5000cca25254e9d2), phy(32), device_name(0x5000cca25254e9d3) [Thu Dec 12 07:46:43 2019][ 20.980237] scsi 1:0:156:0: enclosure logical id(0x5000ccab0405db00), slot(41) [Thu Dec 12 07:46:43 2019][ 20.987542] scsi 1:0:156:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:43 2019][ 20.994433] scsi 1:0:156:0: serial_number( 7SHHPP2G) [Thu Dec 12 07:46:43 2019][ 21.000009] scsi 1:0:156:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:43 2019][ 21.022872] mpt3sas_cm0: detecting: handle(0x003d), sas_address(0x5000cca26a24008a), phy(33) [Thu Dec 12 07:46:43 2019][ 21.031308] mpt3sas_cm0: REPORT_LUNS: handle(0x003d), retries(0) [Thu Dec 12 07:46:43 2019][ 21.037454] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003d), lun(0) [Thu Dec 12 07:46:43 2019][ 21.044108] scsi 1:0:157:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:43 2019][ 21.052501] scsi 1:0:157:0: SSP: handle(0x003d), sas_addr(0x5000cca26a24008a), phy(33), device_name(0x5000cca26a24008b) [Thu Dec 12 07:46:43 2019][ 21.063274] scsi 1:0:157:0: enclosure logical id(0x5000ccab0405db00), slot(42) [Thu Dec 12 07:46:43 2019][ 21.070579] scsi 1:0:157:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:43 2019][ 21.077478] scsi 1:0:157:0: serial_number( 2TGMTTPD) [Thu Dec 12 07:46:43 2019][ 21.083057] scsi 1:0:157:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:43 2019][ 21.105862] mpt3sas_cm0: detecting: handle(0x003e), sas_address(0x5000cca26a24b9ea), phy(34) [Thu Dec 12 07:46:43 2019][ 21.114300] mpt3sas_cm0: REPORT_LUNS: handle(0x003e), retries(0) [Thu Dec 12 07:46:43 2019][ 21.120437] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003e), lun(0) [Thu Dec 12 07:46:43 2019][ 21.127060] scsi 1:0:158:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:43 2019][ 21.135448] scsi 1:0:158:0: SSP: handle(0x003e), sas_addr(0x5000cca26a24b9ea), phy(34), device_name(0x5000cca26a24b9eb) [Thu Dec 12 07:46:43 2019][ 21.146222] scsi 1:0:158:0: enclosure logical id(0x5000ccab0405db00), slot(43) [Thu Dec 12 07:46:43 2019][ 21.153527] scsi 1:0:158:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:43 2019][ 21.160420] scsi 1:0:158:0: serial_number( 2TGN64DD) [Thu Dec 12 07:46:43 2019][ 21.165994] scsi 1:0:158:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:43 2019][ 21.188863] mpt3sas_cm0: detecting: handle(0x003f), sas_address(0x5000cca26a25aed6), phy(35) [Thu Dec 12 07:46:43 2019][ 21.197303] mpt3sas_cm0: 
REPORT_LUNS: handle(0x003f), retries(0) [Thu Dec 12 07:46:43 2019][ 21.203454] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003f), lun(0) [Thu Dec 12 07:46:43 2019][ 21.210122] scsi 1:0:159:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:43 2019][ 21.218520] scsi 1:0:159:0: SSP: handle(0x003f), sas_addr(0x5000cca26a25aed6), phy(35), device_name(0x5000cca26a25aed7) [Thu Dec 12 07:46:43 2019][ 21.229294] scsi 1:0:159:0: enclosure logical id(0x5000ccab0405db00), slot(44) [Thu Dec 12 07:46:43 2019][ 21.236600] scsi 1:0:159:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:43 2019][ 21.243492] scsi 1:0:159:0: serial_number( 2TGNRG1D) [Thu Dec 12 07:46:43 2019][ 21.249066] scsi 1:0:159:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:43 2019][ 21.271863] mpt3sas_cm0: detecting: handle(0x0040), sas_address(0x5000cca266d32b6a), phy(36) [Thu Dec 12 07:46:43 2019][ 21.280301] mpt3sas_cm0: REPORT_LUNS: handle(0x0040), retries(0) [Thu Dec 12 07:46:43 2019][ 21.286463] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0040), lun(0) [Thu Dec 12 07:46:43 2019][ 21.293081] scsi 1:0:160:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:44 2019][ 21.301466] scsi 1:0:160:0: SSP: handle(0x0040), sas_addr(0x5000cca266d32b6a), phy(36), device_name(0x5000cca266d32b6b) [Thu Dec 12 07:46:44 2019][ 21.312243] scsi 1:0:160:0: enclosure logical id(0x5000ccab0405db00), slot(45) [Thu Dec 12 07:46:44 2019][ 21.319548] scsi 1:0:160:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:44 2019][ 21.326441] scsi 1:0:160:0: serial_number( 7JKS46JK) [Thu Dec 12 07:46:44 2019][ 21.332014] scsi 1:0:160:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:44 2019][ 21.354866] mpt3sas_cm0: detecting: handle(0x0041), sas_address(0x5000cca26b9bf886), phy(37) [Thu Dec 12 07:46:44 2019][ 21.363306] mpt3sas_cm0: REPORT_LUNS: handle(0x0041), retries(0) [Thu Dec 12 07:46:44 2019][ 21.369440] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0041), lun(0) [Thu Dec 12 07:46:44 2019][ 21.376069] scsi 1:0:161:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:44 2019][ 21.384451] scsi 1:0:161:0: SSP: handle(0x0041), sas_addr(0x5000cca26b9bf886), phy(37), device_name(0x5000cca26b9bf887) [Thu Dec 12 07:46:44 2019][ 21.395227] scsi 1:0:161:0: enclosure logical id(0x5000ccab0405db00), slot(46) [Thu Dec 12 07:46:44 2019][ 21.402532] scsi 1:0:161:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:44 2019][ 21.409426] scsi 1:0:161:0: serial_number( 1SJST42Z) [Thu Dec 12 07:46:44 2019][ 21.415000] scsi 1:0:161:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:44 2019][ 21.434866] mpt3sas_cm0: detecting: handle(0x0042), sas_address(0x5000cca26b9b24ca), phy(38) [Thu Dec 12 07:46:44 2019][ 21.443308] mpt3sas_cm0: REPORT_LUNS: handle(0x0042), retries(0) [Thu Dec 12 07:46:44 2019][ 21.449449] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0042), lun(0) [Thu Dec 12 07:46:44 2019][ 21.456064] scsi 1:0:162:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:44 2019][ 21.464454] scsi 1:0:162:0: SSP: handle(0x0042), sas_addr(0x5000cca26b9b24ca), phy(38), device_name(0x5000cca26b9b24cb) [Thu Dec 12 07:46:44 2019][ 21.475229] scsi 1:0:162:0: enclosure logical id(0x5000ccab0405db00), slot(47) [Thu Dec 12 07:46:44 2019][ 21.482534] scsi 1:0:162:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:44 2019][ 
21.489426] scsi 1:0:162:0: serial_number( 1SJSA0YZ) [Thu Dec 12 07:46:44 2019][ 21.495003] scsi 1:0:162:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:44 2019][ 21.514866] mpt3sas_cm0: detecting: handle(0x0043), sas_address(0x5000cca26a21d742), phy(39) [Thu Dec 12 07:46:44 2019][ 21.523303] mpt3sas_cm0: REPORT_LUNS: handle(0x0043), retries(0) [Thu Dec 12 07:46:44 2019][ 21.529438] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0043), lun(0) [Thu Dec 12 07:46:44 2019][ 21.536058] scsi 1:0:163:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:44 2019][ 21.544447] scsi 1:0:163:0: SSP: handle(0x0043), sas_addr(0x5000cca26a21d742), phy(39), device_name(0x5000cca26a21d743) [Thu Dec 12 07:46:44 2019][ 21.555225] scsi 1:0:163:0: enclosure logical id(0x5000ccab0405db00), slot(48) [Thu Dec 12 07:46:44 2019][ 21.562530] scsi 1:0:163:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:44 2019][ 21.569428] scsi 1:0:163:0: serial_number( 2TGLLYED) [Thu Dec 12 07:46:44 2019][ 21.575008] scsi 1:0:163:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:44 2019][ 21.594867] mpt3sas_cm0: detecting: handle(0x0044), sas_address(0x5000cca26a27af5e), phy(40) [Thu Dec 12 07:46:44 2019][ 21.603306] mpt3sas_cm0: REPORT_LUNS: handle(0x0044), retries(0) [Thu Dec 12 07:46:44 2019][ 21.609443] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0044), lun(0) [Thu Dec 12 07:46:44 2019][ 21.616083] scsi 1:0:164:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:44 2019][ 21.624473] scsi 1:0:164:0: SSP: handle(0x0044), sas_addr(0x5000cca26a27af5e), phy(40), device_name(0x5000cca26a27af5f) [Thu Dec 12 07:46:44 2019][ 21.635245] scsi 1:0:164:0: enclosure logical id(0x5000ccab0405db00), slot(49) [Thu Dec 12 07:46:44 2019][ 21.642549] scsi 1:0:164:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:44 2019][ 21.649442] scsi 1:0:164:0: serial_number( 2TGPUL5D) [Thu Dec 12 07:46:44 2019][ 21.655016] scsi 1:0:164:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:44 2019][ 21.674870] mpt3sas_cm0: detecting: handle(0x0045), sas_address(0x5000cca2525552e6), phy(41) [Thu Dec 12 07:46:44 2019][ 21.683308] mpt3sas_cm0: REPORT_LUNS: handle(0x0045), retries(0) [Thu Dec 12 07:46:44 2019][ 21.689444] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0045), lun(0) [Thu Dec 12 07:46:44 2019][ 21.697354] scsi 1:0:165:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:44 2019][ 21.705764] scsi 1:0:165:0: SSP: handle(0x0045), sas_addr(0x5000cca2525552e6), phy(41), device_name(0x5000cca2525552e7) [Thu Dec 12 07:46:44 2019][ 21.716539] scsi 1:0:165:0: enclosure logical id(0x5000ccab0405db00), slot(50) [Thu Dec 12 07:46:44 2019][ 21.723844] scsi 1:0:165:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:44 2019][ 21.730736] scsi 1:0:165:0: serial_number( 7SHHXP0G) [Thu Dec 12 07:46:44 2019][ 21.736309] scsi 1:0:165:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:44 2019][ 21.758873] mpt3sas_cm0: detecting: handle(0x0046), sas_address(0x5000cca26a26dff2), phy(42) [Thu Dec 12 07:46:44 2019][ 21.767314] mpt3sas_cm0: REPORT_LUNS: handle(0x0046), retries(0) [Thu Dec 12 07:46:44 2019][ 21.773483] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0046), lun(0) [Thu Dec 12 07:46:44 2019][ 21.780108] scsi 1:0:166:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:44 2019][ 
21.788497] scsi 1:0:166:0: SSP: handle(0x0046), sas_addr(0x5000cca26a26dff2), phy(42), device_name(0x5000cca26a26dff3) [Thu Dec 12 07:46:44 2019][ 21.799272] scsi 1:0:166:0: enclosure logical id(0x5000ccab0405db00), slot(51) [Thu Dec 12 07:46:44 2019][ 21.806576] scsi 1:0:166:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:44 2019][ 21.813469] scsi 1:0:166:0: serial_number( 2TGPBSYD) [Thu Dec 12 07:46:44 2019][ 21.819044] scsi 1:0:166:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:44 2019][ 21.841869] mpt3sas_cm0: detecting: handle(0x0047), sas_address(0x5000cca26b9c5d52), phy(43) [Thu Dec 12 07:46:44 2019][ 21.850308] mpt3sas_cm0: REPORT_LUNS: handle(0x0047), retries(0) [Thu Dec 12 07:46:44 2019][ 21.856479] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0047), lun(0) [Thu Dec 12 07:46:44 2019][ 21.863106] scsi 1:0:167:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:44 2019][ 21.871490] scsi 1:0:167:0: SSP: handle(0x0047), sas_addr(0x5000cca26b9c5d52), phy(43), device_name(0x5000cca26b9c5d53) [Thu Dec 12 07:46:44 2019][ 21.882263] scsi 1:0:167:0: enclosure logical id(0x5000ccab0405db00), slot(52) [Thu Dec 12 07:46:44 2019][ 21.889568] scsi 1:0:167:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:44 2019][ 21.896462] scsi 1:0:167:0: serial_number( 1SJSZV5Z) [Thu Dec 12 07:46:44 2019][ 21.902036] scsi 1:0:167:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:44 2019][ 21.922466] mpt3sas_cm0: detecting: handle(0x0048), sas_address(0x5000cca26b9602c6), phy(44) [Thu Dec 12 07:46:44 2019][ 21.930908] mpt3sas_cm0: REPORT_LUNS: handle(0x0048), retries(0) [Thu Dec 12 07:46:44 2019][ 21.937081] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0048), lun(0) [Thu Dec 12 07:46:44 2019][ 21.943706] scsi 1:0:168:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:44 2019][ 21.952098] scsi 1:0:168:0: SSP: handle(0x0048), sas_addr(0x5000cca26b9602c6), phy(44), device_name(0x5000cca26b9602c7) [Thu Dec 12 07:46:44 2019][ 21.962874] scsi 1:0:168:0: enclosure logical id(0x5000ccab0405db00), slot(53) [Thu Dec 12 07:46:44 2019][ 21.970179] scsi 1:0:168:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:44 2019][ 21.977070] scsi 1:0:168:0: serial_number( 1SJNHJ4Z) [Thu Dec 12 07:46:44 2019][ 21.982645] scsi 1:0:168:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:44 2019][ 22.002873] mpt3sas_cm0: detecting: handle(0x0049), sas_address(0x5000cca252544a02), phy(45) [Thu Dec 12 07:46:44 2019][ 22.011308] mpt3sas_cm0: REPORT_LUNS: handle(0x0049), retries(0) [Thu Dec 12 07:46:44 2019][ 22.017451] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0049), lun(0) [Thu Dec 12 07:46:44 2019][ 22.024080] scsi 1:0:169:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:44 2019][ 22.032465] scsi 1:0:169:0: SSP: handle(0x0049), sas_addr(0x5000cca252544a02), phy(45), device_name(0x5000cca252544a03) [Thu Dec 12 07:46:44 2019][ 22.043240] scsi 1:0:169:0: enclosure logical id(0x5000ccab0405db00), slot(54) [Thu Dec 12 07:46:44 2019][ 22.050547] scsi 1:0:169:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:44 2019][ 22.057437] scsi 1:0:169:0: serial_number( 7SHHB14G) [Thu Dec 12 07:46:44 2019][ 22.063012] scsi 1:0:169:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:44 2019][ 22.082872] mpt3sas_cm0: detecting: handle(0x004a), 
sas_address(0x5000cca252559f9e), phy(46) [Thu Dec 12 07:46:44 2019][ 22.091310] mpt3sas_cm0: REPORT_LUNS: handle(0x004a), retries(0) [Thu Dec 12 07:46:44 2019][ 22.097441] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004a), lun(0) [Thu Dec 12 07:46:44 2019][ 22.104083] scsi 1:0:170:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:44 2019][ 22.112468] scsi 1:0:170:0: SSP: handle(0x004a), sas_addr(0x5000cca252559f9e), phy(46), device_name(0x5000cca252559f9f) [Thu Dec 12 07:46:44 2019][ 22.123244] scsi 1:0:170:0: enclosure logical id(0x5000ccab0405db00), slot(55) [Thu Dec 12 07:46:44 2019][ 22.130549] scsi 1:0:170:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:44 2019][ 22.137439] scsi 1:0:170:0: serial_number( 7SHJ2TDG) [Thu Dec 12 07:46:44 2019][ 22.143015] scsi 1:0:170:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:44 2019][ 22.163871] mpt3sas_cm0: detecting: handle(0x004b), sas_address(0x5000cca25255571e), phy(47) [Thu Dec 12 07:46:44 2019][ 22.172311] mpt3sas_cm0: REPORT_LUNS: handle(0x004b), retries(0) [Thu Dec 12 07:46:44 2019][ 22.178487] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004b), lun(0) [Thu Dec 12 07:46:44 2019][ 22.185106] scsi 1:0:171:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:44 2019][ 22.193497] scsi 1:0:171:0: SSP: handle(0x004b), sas_addr(0x5000cca25255571e), phy(47), device_name(0x5000cca25255571f) [Thu Dec 12 07:46:44 2019][ 22.204269] scsi 1:0:171:0: enclosure logical id(0x5000ccab0405db00), slot(56) [Thu Dec 12 07:46:44 2019][ 22.211575] scsi 1:0:171:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:44 2019][ 22.218465] scsi 1:0:171:0: serial_number( 7SHHXYRG) [Thu Dec 12 07:46:44 2019][ 22.224039] scsi 1:0:171:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:44 2019][ 22.243872] mpt3sas_cm0: detecting: handle(0x004c), sas_address(0x5000cca26b9bf57e), phy(48) [Thu Dec 12 07:46:44 2019][ 22.252314] mpt3sas_cm0: REPORT_LUNS: handle(0x004c), retries(0) [Thu Dec 12 07:46:44 2019][ 22.258492] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004c), lun(0) [Thu Dec 12 07:46:44 2019][ 22.265138] scsi 1:0:172:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:44 2019][ 22.273526] scsi 1:0:172:0: SSP: handle(0x004c), sas_addr(0x5000cca26b9bf57e), phy(48), device_name(0x5000cca26b9bf57f) [Thu Dec 12 07:46:44 2019][ 22.284296] scsi 1:0:172:0: enclosure logical id(0x5000ccab0405db00), slot(57) [Thu Dec 12 07:46:45 2019][ 22.291602] scsi 1:0:172:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:45 2019][ 22.298496] scsi 1:0:172:0: serial_number( 1SJSSXUZ) [Thu Dec 12 07:46:45 2019][ 22.304068] scsi 1:0:172:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:45 2019][ 22.323876] mpt3sas_cm0: detecting: handle(0x004d), sas_address(0x5000cca252555372), phy(49) [Thu Dec 12 07:46:45 2019][ 22.332316] mpt3sas_cm0: REPORT_LUNS: handle(0x004d), retries(0) [Thu Dec 12 07:46:45 2019][ 22.338459] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004d), lun(0) [Thu Dec 12 07:46:45 2019][ 22.345077] scsi 1:0:173:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:45 2019][ 22.353464] scsi 1:0:173:0: SSP: handle(0x004d), sas_addr(0x5000cca252555372), phy(49), device_name(0x5000cca252555373) [Thu Dec 12 07:46:45 2019][ 22.364239] scsi 1:0:173:0: enclosure logical id(0x5000ccab0405db00), slot(58) [Thu Dec 12 07:46:45 2019][ 22.371545] 
scsi 1:0:173:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:45 2019][ 22.378436] scsi 1:0:173:0: serial_number( 7SHHXR4G) [Thu Dec 12 07:46:45 2019][ 22.384012] scsi 1:0:173:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:45 2019][ 22.403871] mpt3sas_cm0: detecting: handle(0x004e), sas_address(0x5000cca25253eefe), phy(50) [Thu Dec 12 07:46:45 2019][ 22.412308] mpt3sas_cm0: REPORT_LUNS: handle(0x004e), retries(0) [Thu Dec 12 07:46:45 2019][ 22.418441] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004e), lun(0) [Thu Dec 12 07:46:45 2019][ 22.425070] scsi 1:0:174:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:45 2019][ 22.433459] scsi 1:0:174:0: SSP: handle(0x004e), sas_addr(0x5000cca25253eefe), phy(50), device_name(0x5000cca25253eeff) [Thu Dec 12 07:46:45 2019][ 22.444234] scsi 1:0:174:0: enclosure logical id(0x5000ccab0405db00), slot(59) [Thu Dec 12 07:46:45 2019][ 22.451539] scsi 1:0:174:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:45 2019][ 22.458431] scsi 1:0:174:0: serial_number( 7SHH4Z7G) [Thu Dec 12 07:46:45 2019][ 22.464006] scsi 1:0:174:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:45 2019][ 22.486374] mpt3sas_cm0: expander_add: handle(0x001a), parent(0x0017), sas_addr(0x5000ccab0405db7b), phys(68) [Thu Dec 12 07:46:45 2019][ 22.507565] mpt3sas_cm0: detecting: handle(0x004f), sas_address(0x5000cca26b9cbb06), phy(42) [Thu Dec 12 07:46:45 2019][ 22.516009] mpt3sas_cm0: REPORT_LUNS: handle(0x004f), retries(0) [Thu Dec 12 07:46:45 2019][ 22.522171] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004f), lun(0) [Thu Dec 12 07:46:45 2019][ 22.528854] scsi 1:0:175:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:45 2019][ 22.537245] scsi 1:0:175:0: SSP: handle(0x004f), sas_addr(0x5000cca26b9cbb06), phy(42), device_name(0x5000cca26b9cbb07) [Thu Dec 12 07:46:45 2019][ 22.548017] scsi 1:0:175:0: enclosure logical id(0x5000ccab0405db00), slot(1) [Thu Dec 12 07:46:45 2019][ 22.555235] scsi 1:0:175:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:45 2019][ 22.562132] scsi 1:0:175:0: serial_number( 1SJT62MZ) [Thu Dec 12 07:46:45 2019][ 22.567713] scsi 1:0:175:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:45 2019][ 22.587876] mpt3sas_cm0: detecting: handle(0x0050), sas_address(0x5000cca252544476), phy(43) [Thu Dec 12 07:46:45 2019][ 22.596314] mpt3sas_cm0: REPORT_LUNS: handle(0x0050), retries(0) [Thu Dec 12 07:46:45 2019][ 22.602452] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0050), lun(0) [Thu Dec 12 07:46:45 2019][ 22.609098] scsi 1:0:176:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:45 2019][ 22.617491] scsi 1:0:176:0: SSP: handle(0x0050), sas_addr(0x5000cca252544476), phy(43), device_name(0x5000cca252544477) [Thu Dec 12 07:46:45 2019][ 22.628263] scsi 1:0:176:0: enclosure logical id(0x5000ccab0405db00), slot(3) [Thu Dec 12 07:46:45 2019][ 22.635482] scsi 1:0:176:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:45 2019][ 22.642374] scsi 1:0:176:0: serial_number( 7SHHANPG) [Thu Dec 12 07:46:45 2019][ 22.647946] scsi 1:0:176:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:45 2019][ 22.668470] mpt3sas_cm0: detecting: handle(0x0051), sas_address(0x5000cca26a26173e), phy(44) [Thu Dec 12 07:46:45 2019][ 22.676904] mpt3sas_cm0: REPORT_LUNS: handle(0x0051), retries(0) [Thu Dec 12 
07:46:45 2019][ 22.683042] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0051), lun(0) [Thu Dec 12 07:46:45 2019][ 22.689673] scsi 1:0:177:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:45 2019][ 22.698067] scsi 1:0:177:0: SSP: handle(0x0051), sas_addr(0x5000cca26a26173e), phy(44), device_name(0x5000cca26a26173f) [Thu Dec 12 07:46:45 2019][ 22.708837] scsi 1:0:177:0: enclosure logical id(0x5000ccab0405db00), slot(4) [Thu Dec 12 07:46:45 2019][ 22.716056] scsi 1:0:177:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:45 2019][ 22.722949] scsi 1:0:177:0: serial_number( 2TGNYDLD) [Thu Dec 12 07:46:45 2019][ 22.728524] scsi 1:0:177:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:45 2019][ 22.750878] mpt3sas_cm0: detecting: handle(0x0052), sas_address(0x5000cca252544cb6), phy(45) [Thu Dec 12 07:46:45 2019][ 22.759318] mpt3sas_cm0: REPORT_LUNS: handle(0x0052), retries(0) [Thu Dec 12 07:46:45 2019][ 22.765469] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0052), lun(0) [Thu Dec 12 07:46:45 2019][ 22.772144] scsi 1:0:178:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:45 2019][ 22.780555] scsi 1:0:178:0: SSP: handle(0x0052), sas_addr(0x5000cca252544cb6), phy(45), device_name(0x5000cca252544cb7) [Thu Dec 12 07:46:45 2019][ 22.791328] scsi 1:0:178:0: enclosure logical id(0x5000ccab0405db00), slot(5) [Thu Dec 12 07:46:45 2019][ 22.798548] scsi 1:0:178:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:45 2019][ 22.805441] scsi 1:0:178:0: serial_number( 7SHHB6RG) [Thu Dec 12 07:46:45 2019][ 22.811012] scsi 1:0:178:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:45 2019][ 22.833878] mpt3sas_cm0: detecting: handle(0x0053), sas_address(0x5000cca26c238692), phy(46) [Thu Dec 12 07:46:45 2019][ 22.842320] mpt3sas_cm0: REPORT_LUNS: handle(0x0053), retries(0) [Thu Dec 12 07:46:45 2019][ 22.848458] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0053), lun(0) [Thu Dec 12 07:46:45 2019][ 22.855086] scsi 1:0:179:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:45 2019][ 22.863479] scsi 1:0:179:0: SSP: handle(0x0053), sas_addr(0x5000cca26c238692), phy(46), device_name(0x5000cca26c238693) [Thu Dec 12 07:46:45 2019][ 22.874252] scsi 1:0:179:0: enclosure logical id(0x5000ccab0405db00), slot(6) [Thu Dec 12 07:46:45 2019][ 22.881470] scsi 1:0:179:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:45 2019][ 22.888365] scsi 1:0:179:0: serial_number( 1DGMJNWZ) [Thu Dec 12 07:46:45 2019][ 22.893937] scsi 1:0:179:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:45 2019][ 22.916886] mpt3sas_cm0: detecting: handle(0x0054), sas_address(0x5000cca26a2ac96a), phy(47) [Thu Dec 12 07:46:45 2019][ 22.925323] mpt3sas_cm0: REPORT_LUNS: handle(0x0054), retries(0) [Thu Dec 12 07:46:45 2019][ 22.931488] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0054), lun(0) [Thu Dec 12 07:46:45 2019][ 22.938107] scsi 1:0:180:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:45 2019][ 22.946495] scsi 1:0:180:0: SSP: handle(0x0054), sas_addr(0x5000cca26a2ac96a), phy(47), device_name(0x5000cca26a2ac96b) [Thu Dec 12 07:46:45 2019][ 22.957271] scsi 1:0:180:0: enclosure logical id(0x5000ccab0405db00), slot(7) [Thu Dec 12 07:46:45 2019][ 22.964490] scsi 1:0:180:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:45 2019][ 22.971380] scsi 1:0:180:0: serial_number( 2TGSJGHD) [Thu Dec 
12 07:46:45 2019][ 22.976957] scsi 1:0:180:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:45 2019][ 22.996879] mpt3sas_cm0: detecting: handle(0x0055), sas_address(0x5000cca25253e61a), phy(48) [Thu Dec 12 07:46:45 2019][ 23.005313] mpt3sas_cm0: REPORT_LUNS: handle(0x0055), retries(0) [Thu Dec 12 07:46:45 2019][ 23.011477] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0055), lun(0) [Thu Dec 12 07:46:45 2019][ 23.018132] scsi 1:0:181:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:45 2019][ 23.026515] scsi 1:0:181:0: SSP: handle(0x0055), sas_addr(0x5000cca25253e61a), phy(48), device_name(0x5000cca25253e61b) [Thu Dec 12 07:46:45 2019][ 23.037290] scsi 1:0:181:0: enclosure logical id(0x5000ccab0405db00), slot(8) [Thu Dec 12 07:46:45 2019][ 23.044510] scsi 1:0:181:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:45 2019][ 23.051400] scsi 1:0:181:0: serial_number( 7SHH4BWG) [Thu Dec 12 07:46:45 2019][ 23.056976] scsi 1:0:181:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:45 2019][ 23.082874] mpt3sas_cm0: detecting: handle(0x0056), sas_address(0x5000cca252542cfe), phy(49) [Thu Dec 12 07:46:45 2019][ 23.091332] mpt3sas_cm0: REPORT_LUNS: handle(0x0056), retries(0) [Thu Dec 12 07:46:45 2019][ 23.097491] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0056), lun(0) [Thu Dec 12 07:46:45 2019][ 23.104122] scsi 1:0:182:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:45 2019][ 23.112509] scsi 1:0:182:0: SSP: handle(0x0056), sas_addr(0x5000cca252542cfe), phy(49), device_name(0x5000cca252542cff) [Thu Dec 12 07:46:45 2019][ 23.123282] scsi 1:0:182:0: enclosure logical id(0x5000ccab0405db00), slot(9) [Thu Dec 12 07:46:45 2019][ 23.130501] scsi 1:0:182:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:45 2019][ 23.137393] scsi 1:0:182:0: serial_number( 7SHH937G) [Thu Dec 12 07:46:45 2019][ 23.142965] scsi 1:0:182:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:45 2019][ 23.162890] mpt3sas_cm0: detecting: handle(0x0057), sas_address(0x5000cca26a3181fe), phy(50) [Thu Dec 12 07:46:45 2019][ 23.171327] mpt3sas_cm0: REPORT_LUNS: handle(0x0057), retries(0) [Thu Dec 12 07:46:45 2019][ 23.177465] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0057), lun(0) [Thu Dec 12 07:46:45 2019][ 23.184106] scsi 1:0:183:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:45 2019][ 23.192495] scsi 1:0:183:0: SSP: handle(0x0057), sas_addr(0x5000cca26a3181fe), phy(50), device_name(0x5000cca26a3181ff) [Thu Dec 12 07:46:45 2019][ 23.203266] scsi 1:0:183:0: enclosure logical id(0x5000ccab0405db00), slot(10) [Thu Dec 12 07:46:45 2019][ 23.210573] scsi 1:0:183:0: enclosure level(0x0000), connector name( C1 ) [Thu Dec 12 07:46:45 2019][ 23.217465] scsi 1:0:183:0: serial_number( 2TGW71ND) [Thu Dec 12 07:46:45 2019][ 23.223040] scsi 1:0:183:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:45 2019][ 23.245394] mpt3sas_cm0: expander_add: handle(0x0099), parent(0x000a), sas_addr(0x5000ccab0405db3d), phys(49) [Thu Dec 12 07:46:45 2019][ 23.266906] mpt3sas_cm0: detecting: handle(0x009d), sas_address(0x5000ccab0405db3c), phy(48) [Thu Dec 12 07:46:45 2019][ 23.275344] mpt3sas_cm0: REPORT_LUNS: handle(0x009d), retries(0) [Thu Dec 12 07:46:45 2019][ 23.281727] mpt3sas_cm0: TEST_UNIT_READY: handle(0x009d), lun(0) [Thu Dec 12 07:46:45 2019][ 23.288619] scsi 1:0:184:0: Enclosure HGST 
H4060-J 2033 PQ: 0 ANSI: 6 [Thu Dec 12 07:46:46 2019][ 23.297196] scsi 1:0:184:0: set ignore_delay_remove for handle(0x009d) [Thu Dec 12 07:46:46 2019][ 23.303724] scsi 1:0:184:0: SES: handle(0x009d), sas_addr(0x5000ccab0405db3c), phy(48), device_name(0x0000000000000000) [Thu Dec 12 07:46:46 2019][ 23.314495] scsi 1:0:184:0: enclosure logical id(0x5000ccab0405db00), slot(60) [Thu Dec 12 07:46:46 2019][ 23.321802] scsi 1:0:184:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:46 2019][ 23.328694] scsi 1:0:184:0: serial_number(USWSJ03918EZ0069 ) [Thu Dec 12 07:46:46 2019][ 23.334614] scsi 1:0:184:0: qdepth(1), tagged(0), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:46 2019][ 23.359256] mpt3sas_cm0: expander_add: handle(0x009b), parent(0x0099), sas_addr(0x5000ccab0405db3f), phys(68) [Thu Dec 12 07:46:46 2019][ 23.381749] mpt3sas_cm0: detecting: handle(0x009e), sas_address(0x5000cca252550a75), phy(0) [Thu Dec 12 07:46:46 2019][ 23.390099] mpt3sas_cm0: REPORT_LUNS: handle(0x009e), retries(0) [Thu Dec 12 07:46:46 2019][ 23.396232] mpt3sas_cm0: TEST_UNIT_READY: handle(0x009e), lun(0) [Thu Dec 12 07:46:46 2019][ 23.402896] scsi 1:0:185:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:46 2019][ 23.411297] scsi 1:0:185:0: SSP: handle(0x009e), sas_addr(0x5000cca252550a75), phy(0), device_name(0x5000cca252550a77) [Thu Dec 12 07:46:46 2019][ 23.421981] scsi 1:0:185:0: enclosure logical id(0x5000ccab0405db00), slot(0) [Thu Dec 12 07:46:46 2019][ 23.429201] scsi 1:0:185:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:46 2019][ 23.436092] scsi 1:0:185:0: serial_number( 7SHHSVGG) [Thu Dec 12 07:46:46 2019][ 23.441665] scsi 1:0:185:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:46 2019][ 23.461890] mpt3sas_cm0: detecting: handle(0x009f), sas_address(0x5000cca25253eb31), phy(1) [Thu Dec 12 07:46:46 2019][ 23.470242] mpt3sas_cm0: REPORT_LUNS: handle(0x009f), retries(0) [Thu Dec 12 07:46:46 2019][ 23.476375] mpt3sas_cm0: TEST_UNIT_READY: handle(0x009f), lun(0) [Thu Dec 12 07:46:46 2019][ 23.483025] scsi 1:0:186:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:46 2019][ 23.491410] scsi 1:0:186:0: SSP: handle(0x009f), sas_addr(0x5000cca25253eb31), phy(1), device_name(0x5000cca25253eb33) [Thu Dec 12 07:46:46 2019][ 23.502097] scsi 1:0:186:0: enclosure logical id(0x5000ccab0405db00), slot(2) [Thu Dec 12 07:46:46 2019][ 23.509316] scsi 1:0:186:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:46 2019][ 23.516206] scsi 1:0:186:0: serial_number( 7SHH4RDG) [Thu Dec 12 07:46:46 2019][ 23.521782] scsi 1:0:186:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:46 2019][ 23.541888] mpt3sas_cm0: detecting: handle(0x00a0), sas_address(0x5000cca26b950bb5), phy(2) [Thu Dec 12 07:46:46 2019][ 23.550258] mpt3sas_cm0: REPORT_LUNS: handle(0x00a0), retries(0) [Thu Dec 12 07:46:46 2019][ 23.556399] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a0), lun(0) [Thu Dec 12 07:46:46 2019][ 23.563055] scsi 1:0:187:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:46 2019][ 23.571462] scsi 1:0:187:0: SSP: handle(0x00a0), sas_addr(0x5000cca26b950bb5), phy(2), device_name(0x5000cca26b950bb7) [Thu Dec 12 07:46:46 2019][ 23.582152] scsi 1:0:187:0: enclosure logical id(0x5000ccab0405db00), slot(11) [Thu Dec 12 07:46:46 2019][ 23.589458] scsi 1:0:187:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 
07:46:46 2019][ 23.596349] scsi 1:0:187:0: serial_number( 1SJMZ22Z) [Thu Dec 12 07:46:46 2019][ 23.601921] scsi 1:0:187:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:46 2019][ 23.621889] mpt3sas_cm0: detecting: handle(0x00a1), sas_address(0x5000cca25253f3bd), phy(3) [Thu Dec 12 07:46:46 2019][ 23.630237] mpt3sas_cm0: REPORT_LUNS: handle(0x00a1), retries(0) [Thu Dec 12 07:46:46 2019][ 23.636376] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a1), lun(0) [Thu Dec 12 07:46:46 2019][ 23.643058] scsi 1:0:188:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:46 2019][ 23.651454] scsi 1:0:188:0: SSP: handle(0x00a1), sas_addr(0x5000cca25253f3bd), phy(3), device_name(0x5000cca25253f3bf) [Thu Dec 12 07:46:46 2019][ 23.662137] scsi 1:0:188:0: enclosure logical id(0x5000ccab0405db00), slot(12) [Thu Dec 12 07:46:46 2019][ 23.669443] scsi 1:0:188:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:46 2019][ 23.676333] scsi 1:0:188:0: serial_number( 7SHH591G) [Thu Dec 12 07:46:46 2019][ 23.681907] scsi 1:0:188:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:46 2019][ 23.701891] mpt3sas_cm0: detecting: handle(0x00a2), sas_address(0x5000cca26a2ac3d9), phy(4) [Thu Dec 12 07:46:46 2019][ 23.710241] mpt3sas_cm0: REPORT_LUNS: handle(0x00a2), retries(0) [Thu Dec 12 07:46:46 2019][ 23.716384] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a2), lun(0) [Thu Dec 12 07:46:46 2019][ 23.723051] scsi 1:0:189:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:46 2019][ 23.731454] scsi 1:0:189:0: SSP: handle(0x00a2), sas_addr(0x5000cca26a2ac3d9), phy(4), device_name(0x5000cca26a2ac3db) [Thu Dec 12 07:46:46 2019][ 23.742140] scsi 1:0:189:0: enclosure logical id(0x5000ccab0405db00), slot(13) [Thu Dec 12 07:46:46 2019][ 23.749446] scsi 1:0:189:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:46 2019][ 23.756335] scsi 1:0:189:0: serial_number( 2TGSJ30D) [Thu Dec 12 07:46:46 2019][ 23.761910] scsi 1:0:189:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:46 2019][ 23.781892] mpt3sas_cm0: detecting: handle(0x00a3), sas_address(0x5000cca252541029), phy(5) [Thu Dec 12 07:46:46 2019][ 23.790244] mpt3sas_cm0: REPORT_LUNS: handle(0x00a3), retries(0) [Thu Dec 12 07:46:46 2019][ 23.796376] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a3), lun(0) [Thu Dec 12 07:46:46 2019][ 23.803023] scsi 1:0:190:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:46 2019][ 23.811433] scsi 1:0:190:0: SSP: handle(0x00a3), sas_addr(0x5000cca252541029), phy(5), device_name(0x5000cca25254102b) [Thu Dec 12 07:46:46 2019][ 23.822115] scsi 1:0:190:0: enclosure logical id(0x5000ccab0405db00), slot(14) [Thu Dec 12 07:46:46 2019][ 23.829421] scsi 1:0:190:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:46 2019][ 23.836313] scsi 1:0:190:0: serial_number( 7SHH75RG) [Thu Dec 12 07:46:46 2019][ 23.841888] scsi 1:0:190:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:46 2019][ 23.861893] mpt3sas_cm0: detecting: handle(0x00a4), sas_address(0x5000cca252545349), phy(6) [Thu Dec 12 07:46:46 2019][ 23.870248] mpt3sas_cm0: REPORT_LUNS: handle(0x00a4), retries(0) [Thu Dec 12 07:46:46 2019][ 23.876420] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a4), lun(0) [Thu Dec 12 07:46:46 2019][ 23.883078] scsi 1:0:191:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:46 2019][ 
23.891472] scsi 1:0:191:0: SSP: handle(0x00a4), sas_addr(0x5000cca252545349), phy(6), device_name(0x5000cca25254534b) [Thu Dec 12 07:46:46 2019][ 23.902153] scsi 1:0:191:0: enclosure logical id(0x5000ccab0405db00), slot(15) [Thu Dec 12 07:46:46 2019][ 23.909460] scsi 1:0:191:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:46 2019][ 23.916350] scsi 1:0:191:0: serial_number( 7SHHBN9G) [Thu Dec 12 07:46:46 2019][ 23.921925] scsi 1:0:191:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:46 2019][ 23.941890] mpt3sas_cm0: detecting: handle(0x00a5), sas_address(0x5000cca2525430c5), phy(7) [Thu Dec 12 07:46:46 2019][ 23.950240] mpt3sas_cm0: REPORT_LUNS: handle(0x00a5), retries(0) [Thu Dec 12 07:46:46 2019][ 23.956403] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a5), lun(0) [Thu Dec 12 07:46:46 2019][ 23.963043] scsi 1:0:192:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:46 2019][ 23.971438] scsi 1:0:192:0: SSP: handle(0x00a5), sas_addr(0x5000cca2525430c5), phy(7), device_name(0x5000cca2525430c7) [Thu Dec 12 07:46:46 2019][ 23.982120] scsi 1:0:192:0: enclosure logical id(0x5000ccab0405db00), slot(16) [Thu Dec 12 07:46:46 2019][ 23.989427] scsi 1:0:192:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:46 2019][ 23.996319] scsi 1:0:192:0: serial_number( 7SHH9B1G) [Thu Dec 12 07:46:46 2019][ 24.001894] scsi 1:0:192:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:46 2019][ 24.021895] mpt3sas_cm0: detecting: handle(0x00a6), sas_address(0x5000cca25254385d), phy(8) [Thu Dec 12 07:46:46 2019][ 24.030243] mpt3sas_cm0: REPORT_LUNS: handle(0x00a6), retries(0) [Thu Dec 12 07:46:46 2019][ 24.036381] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a6), lun(0) [Thu Dec 12 07:46:46 2019][ 24.043050] scsi 1:0:193:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:46 2019][ 24.051445] scsi 1:0:193:0: SSP: handle(0x00a6), sas_addr(0x5000cca25254385d), phy(8), device_name(0x5000cca25254385f) [Thu Dec 12 07:46:46 2019][ 24.062133] scsi 1:0:193:0: enclosure logical id(0x5000ccab0405db00), slot(17) [Thu Dec 12 07:46:46 2019][ 24.069440] scsi 1:0:193:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:46 2019][ 24.076330] scsi 1:0:193:0: serial_number( 7SHH9VRG) [Thu Dec 12 07:46:46 2019][ 24.081905] scsi 1:0:193:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:46 2019][ 24.101891] mpt3sas_cm0: detecting: handle(0x00a7), sas_address(0x5000cca25253f30d), phy(9) [Thu Dec 12 07:46:46 2019][ 24.110236] mpt3sas_cm0: REPORT_LUNS: handle(0x00a7), retries(0) [Thu Dec 12 07:46:46 2019][ 24.116372] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a7), lun(0) [Thu Dec 12 07:46:46 2019][ 24.123005] scsi 1:0:194:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:46 2019][ 24.131430] scsi 1:0:194:0: SSP: handle(0x00a7), sas_addr(0x5000cca25253f30d), phy(9), device_name(0x5000cca25253f30f) [Thu Dec 12 07:46:46 2019][ 24.142118] scsi 1:0:194:0: enclosure logical id(0x5000ccab0405db00), slot(18) [Thu Dec 12 07:46:46 2019][ 24.149425] scsi 1:0:194:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:46 2019][ 24.156315] scsi 1:0:194:0: serial_number( 7SHH57MG) [Thu Dec 12 07:46:46 2019][ 24.161890] scsi 1:0:194:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:46 2019][ 24.190893] mpt3sas_cm0: detecting: handle(0x00a8), 
sas_address(0x5000cca252545f65), phy(10) [Thu Dec 12 07:46:46 2019][ 24.199333] mpt3sas_cm0: REPORT_LUNS: handle(0x00a8), retries(0) [Thu Dec 12 07:46:46 2019][ 24.205480] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a8), lun(0) [Thu Dec 12 07:46:46 2019][ 24.212129] scsi 1:0:195:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:46 2019][ 24.220517] scsi 1:0:195:0: SSP: handle(0x00a8), sas_addr(0x5000cca252545f65), phy(10), device_name(0x5000cca252545f67) [Thu Dec 12 07:46:46 2019][ 24.231290] scsi 1:0:195:0: enclosure logical id(0x5000ccab0405db00), slot(19) [Thu Dec 12 07:46:46 2019][ 24.238597] scsi 1:0:195:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:46 2019][ 24.245489] scsi 1:0:195:0: serial_number( 7SHHDG9G) [Thu Dec 12 07:46:46 2019][ 24.251064] scsi 1:0:195:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:46 2019][ 24.270894] mpt3sas_cm0: detecting: handle(0x00a9), sas_address(0x5000cca266daa4e5), phy(11) [Thu Dec 12 07:46:46 2019][ 24.279335] mpt3sas_cm0: REPORT_LUNS: handle(0x00a9), retries(0) [Thu Dec 12 07:46:46 2019][ 24.285468] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a9), lun(0) [Thu Dec 12 07:46:46 2019][ 24.292127] scsi 1:0:196:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:46 2019][ 24.300529] scsi 1:0:196:0: SSP: handle(0x00a9), sas_addr(0x5000cca266daa4e5), phy(11), device_name(0x5000cca266daa4e7) [Thu Dec 12 07:46:47 2019][ 24.311303] scsi 1:0:196:0: enclosure logical id(0x5000ccab0405db00), slot(20) [Thu Dec 12 07:46:47 2019][ 24.318609] scsi 1:0:196:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:47 2019][ 24.325501] scsi 1:0:196:0: serial_number( 7JKW7MYK) [Thu Dec 12 07:46:47 2019][ 24.331075] scsi 1:0:196:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:47 2019][ 24.353893] mpt3sas_cm0: detecting: handle(0x00aa), sas_address(0x5000cca26a25167d), phy(12) [Thu Dec 12 07:46:47 2019][ 24.362331] mpt3sas_cm0: REPORT_LUNS: handle(0x00aa), retries(0) [Thu Dec 12 07:46:47 2019][ 24.368476] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00aa), lun(0) [Thu Dec 12 07:46:47 2019][ 24.375138] scsi 1:0:197:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:47 2019][ 24.383540] scsi 1:0:197:0: SSP: handle(0x00aa), sas_addr(0x5000cca26a25167d), phy(12), device_name(0x5000cca26a25167f) [Thu Dec 12 07:46:47 2019][ 24.394312] scsi 1:0:197:0: enclosure logical id(0x5000ccab0405db00), slot(21) [Thu Dec 12 07:46:47 2019][ 24.401618] scsi 1:0:197:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:47 2019][ 24.408510] scsi 1:0:197:0: serial_number( 2TGND9JD) [Thu Dec 12 07:46:47 2019][ 24.414084] scsi 1:0:197:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:47 2019][ 24.436897] mpt3sas_cm0: detecting: handle(0x00ab), sas_address(0x5000cca25253eda9), phy(13) [Thu Dec 12 07:46:47 2019][ 24.445340] mpt3sas_cm0: REPORT_LUNS: handle(0x00ab), retries(0) [Thu Dec 12 07:46:47 2019][ 24.451474] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ab), lun(0) [Thu Dec 12 07:46:47 2019][ 24.458119] scsi 1:0:198:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:47 2019][ 24.466507] scsi 1:0:198:0: SSP: handle(0x00ab), sas_addr(0x5000cca25253eda9), phy(13), device_name(0x5000cca25253edab) [Thu Dec 12 07:46:47 2019][ 24.477278] scsi 1:0:198:0: enclosure logical id(0x5000ccab0405db00), slot(22) [Thu Dec 12 07:46:47 2019][ 24.484584] 
scsi 1:0:198:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:47 2019][ 24.491478] scsi 1:0:198:0: serial_number( 7SHH4WHG) [Thu Dec 12 07:46:47 2019][ 24.497052] scsi 1:0:198:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:47 2019][ 24.516897] mpt3sas_cm0: detecting: handle(0x00ac), sas_address(0x5000cca266d491a1), phy(14) [Thu Dec 12 07:46:47 2019][ 24.525335] mpt3sas_cm0: REPORT_LUNS: handle(0x00ac), retries(0) [Thu Dec 12 07:46:47 2019][ 24.531479] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ac), lun(0) [Thu Dec 12 07:46:47 2019][ 24.538126] scsi 1:0:199:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:47 2019][ 24.546521] scsi 1:0:199:0: SSP: handle(0x00ac), sas_addr(0x5000cca266d491a1), phy(14), device_name(0x5000cca266d491a3) [Thu Dec 12 07:46:47 2019][ 24.557290] scsi 1:0:199:0: enclosure logical id(0x5000ccab0405db00), slot(23) [Thu Dec 12 07:46:47 2019][ 24.564598] scsi 1:0:199:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:47 2019][ 24.571492] scsi 1:0:199:0: serial_number( 7JKSX22K) [Thu Dec 12 07:46:47 2019][ 24.577073] scsi 1:0:199:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:47 2019][ 24.599896] mpt3sas_cm0: detecting: handle(0x00ad), sas_address(0x5000cca26b9a7099), phy(15) [Thu Dec 12 07:46:47 2019][ 24.608333] mpt3sas_cm0: REPORT_LUNS: handle(0x00ad), retries(0) [Thu Dec 12 07:46:47 2019][ 24.614469] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ad), lun(0) [Thu Dec 12 07:46:47 2019][ 24.621127] scsi 1:0:200:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:47 2019][ 24.629529] scsi 1:0:200:0: SSP: handle(0x00ad), sas_addr(0x5000cca26b9a7099), phy(15), device_name(0x5000cca26b9a709b) [Thu Dec 12 07:46:47 2019][ 24.640301] scsi 1:0:200:0: enclosure logical id(0x5000ccab0405db00), slot(24) [Thu Dec 12 07:46:47 2019][ 24.647607] scsi 1:0:200:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:47 2019][ 24.654499] scsi 1:0:200:0: serial_number( 1SJRY0YZ) [Thu Dec 12 07:46:47 2019][ 24.660072] scsi 1:0:200:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:47 2019][ 24.682902] mpt3sas_cm0: detecting: handle(0x00ae), sas_address(0x5000cca25253f831), phy(16) [Thu Dec 12 07:46:47 2019][ 24.691335] mpt3sas_cm0: REPORT_LUNS: handle(0x00ae), retries(0) [Thu Dec 12 07:46:47 2019][ 24.697482] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ae), lun(0) [Thu Dec 12 07:46:47 2019][ 24.704141] scsi 1:0:201:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:47 2019][ 24.712527] scsi 1:0:201:0: SSP: handle(0x00ae), sas_addr(0x5000cca25253f831), phy(16), device_name(0x5000cca25253f833) [Thu Dec 12 07:46:47 2019][ 24.723303] scsi 1:0:201:0: enclosure logical id(0x5000ccab0405db00), slot(25) [Thu Dec 12 07:46:47 2019][ 24.730607] scsi 1:0:201:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:47 2019][ 24.737500] scsi 1:0:201:0: serial_number( 7SHH5L7G) [Thu Dec 12 07:46:47 2019][ 24.743073] scsi 1:0:201:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:47 2019][ 24.765901] mpt3sas_cm0: detecting: handle(0x00af), sas_address(0x5000cca26a2ab23d), phy(17) [Thu Dec 12 07:46:47 2019][ 24.774336] mpt3sas_cm0: REPORT_LUNS: handle(0x00af), retries(0) [Thu Dec 12 07:46:47 2019][ 24.780471] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00af), lun(0) [Thu Dec 12 07:46:47 2019][ 24.787117] scsi 1:0:202:0: 
Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:47 2019][ 24.795507] scsi 1:0:202:0: SSP: handle(0x00af), sas_addr(0x5000cca26a2ab23d), phy(17), device_name(0x5000cca26a2ab23f) [Thu Dec 12 07:46:47 2019][ 24.806277] scsi 1:0:202:0: enclosure logical id(0x5000ccab0405db00), slot(26) [Thu Dec 12 07:46:47 2019][ 24.813583] scsi 1:0:202:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:47 2019][ 24.820475] scsi 1:0:202:0: serial_number( 2TGSGXND) [Thu Dec 12 07:46:47 2019][ 24.826050] scsi 1:0:202:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:47 2019][ 24.848911] mpt3sas_cm0: detecting: handle(0x00b0), sas_address(0x5000cca26b9b9695), phy(18) [Thu Dec 12 07:46:47 2019][ 24.857348] mpt3sas_cm0: REPORT_LUNS: handle(0x00b0), retries(0) [Thu Dec 12 07:46:47 2019][ 24.863489] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b0), lun(0) [Thu Dec 12 07:46:47 2019][ 24.870127] scsi 1:0:203:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:47 2019][ 24.878514] scsi 1:0:203:0: SSP: handle(0x00b0), sas_addr(0x5000cca26b9b9695), phy(18), device_name(0x5000cca26b9b9697) [Thu Dec 12 07:46:47 2019][ 24.889289] scsi 1:0:203:0: enclosure logical id(0x5000ccab0405db00), slot(27) [Thu Dec 12 07:46:47 2019][ 24.896592] scsi 1:0:203:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:47 2019][ 24.903486] scsi 1:0:203:0: serial_number( 1SJSKLWZ) [Thu Dec 12 07:46:47 2019][ 24.909059] scsi 1:0:203:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:47 2019][ 24.928903] mpt3sas_cm0: detecting: handle(0x00b1), sas_address(0x5000cca252559471), phy(19) [Thu Dec 12 07:46:47 2019][ 24.937343] mpt3sas_cm0: REPORT_LUNS: handle(0x00b1), retries(0) [Thu Dec 12 07:46:47 2019][ 24.943509] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b1), lun(0) [Thu Dec 12 07:46:47 2019][ 24.950159] scsi 1:0:204:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:47 2019][ 24.958550] scsi 1:0:204:0: SSP: handle(0x00b1), sas_addr(0x5000cca252559471), phy(19), device_name(0x5000cca252559473) [Thu Dec 12 07:46:47 2019][ 24.969324] scsi 1:0:204:0: enclosure logical id(0x5000ccab0405db00), slot(28) [Thu Dec 12 07:46:47 2019][ 24.976631] scsi 1:0:204:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:47 2019][ 24.983522] scsi 1:0:204:0: serial_number( 7SHJ21AG) [Thu Dec 12 07:46:47 2019][ 24.989096] scsi 1:0:204:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:47 2019][ 25.008898] mpt3sas_cm0: detecting: handle(0x00b2), sas_address(0x5000cca25253f94d), phy(20) [Thu Dec 12 07:46:47 2019][ 25.017334] mpt3sas_cm0: REPORT_LUNS: handle(0x00b2), retries(0) [Thu Dec 12 07:46:47 2019][ 25.023463] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b2), lun(0) [Thu Dec 12 07:46:47 2019][ 25.030102] scsi 1:0:205:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:47 2019][ 25.038494] scsi 1:0:205:0: SSP: handle(0x00b2), sas_addr(0x5000cca25253f94d), phy(20), device_name(0x5000cca25253f94f) [Thu Dec 12 07:46:47 2019][ 25.049268] scsi 1:0:205:0: enclosure logical id(0x5000ccab0405db00), slot(29) [Thu Dec 12 07:46:47 2019][ 25.056574] scsi 1:0:205:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:47 2019][ 25.063469] scsi 1:0:205:0: serial_number( 7SHH5NJG) [Thu Dec 12 07:46:47 2019][ 25.069047] scsi 1:0:205:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 
07:46:47 2019][ 25.088902] mpt3sas_cm0: detecting: handle(0x00b3), sas_address(0x5000cca25253e699), phy(21) [Thu Dec 12 07:46:47 2019][ 25.097340] mpt3sas_cm0: REPORT_LUNS: handle(0x00b3), retries(0) [Thu Dec 12 07:46:47 2019][ 25.103487] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b3), lun(0) [Thu Dec 12 07:46:47 2019][ 25.110136] scsi 1:0:206:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:47 2019][ 25.118530] scsi 1:0:206:0: SSP: handle(0x00b3), sas_addr(0x5000cca25253e699), phy(21), device_name(0x5000cca25253e69b) [Thu Dec 12 07:46:47 2019][ 25.129305] scsi 1:0:206:0: enclosure logical id(0x5000ccab0405db00), slot(30) [Thu Dec 12 07:46:47 2019][ 25.136610] scsi 1:0:206:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:47 2019][ 25.143502] scsi 1:0:206:0: serial_number( 7SHH4DXG) [Thu Dec 12 07:46:47 2019][ 25.149075] scsi 1:0:206:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:47 2019][ 25.175554] mpt3sas_cm0: detecting: handle(0x00b4), sas_address(0x5000cca252543cc1), phy(22) [Thu Dec 12 07:46:47 2019][ 25.183992] mpt3sas_cm0: REPORT_LUNS: handle(0x00b4), retries(0) [Thu Dec 12 07:46:47 2019][ 25.190139] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b4), lun(0) [Thu Dec 12 07:46:47 2019][ 25.196777] scsi 1:0:207:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:47 2019][ 25.205171] scsi 1:0:207:0: SSP: handle(0x00b4), sas_addr(0x5000cca252543cc1), phy(22), device_name(0x5000cca252543cc3) [Thu Dec 12 07:46:47 2019][ 25.215945] scsi 1:0:207:0: enclosure logical id(0x5000ccab0405db00), slot(31) [Thu Dec 12 07:46:47 2019][ 25.223252] scsi 1:0:207:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:47 2019][ 25.230144] scsi 1:0:207:0: serial_number( 7SHHA4TG) [Thu Dec 12 07:46:47 2019][ 25.235718] scsi 1:0:207:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:47 2019][ 25.255901] mpt3sas_cm0: detecting: handle(0x00b5), sas_address(0x5000cca26a24fcdd), phy(23) [Thu Dec 12 07:46:47 2019][ 25.264339] mpt3sas_cm0: REPORT_LUNS: handle(0x00b5), retries(0) [Thu Dec 12 07:46:47 2019][ 25.270476] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b5), lun(0) [Thu Dec 12 07:46:47 2019][ 25.277125] scsi 1:0:208:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:47 2019][ 25.285516] scsi 1:0:208:0: SSP: handle(0x00b5), sas_addr(0x5000cca26a24fcdd), phy(23), device_name(0x5000cca26a24fcdf) [Thu Dec 12 07:46:48 2019][ 25.296286] scsi 1:0:208:0: enclosure logical id(0x5000ccab0405db00), slot(32) [Thu Dec 12 07:46:48 2019][ 25.303594] scsi 1:0:208:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:48 2019][ 25.310485] scsi 1:0:208:0: serial_number( 2TGNALMD) [Thu Dec 12 07:46:48 2019][ 25.316058] scsi 1:0:208:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:48 2019][ 25.335906] mpt3sas_cm0: detecting: handle(0x00b6), sas_address(0x5000cca252543bcd), phy(24) [Thu Dec 12 07:46:48 2019][ 25.344339] mpt3sas_cm0: REPORT_LUNS: handle(0x00b6), retries(0) [Thu Dec 12 07:46:48 2019][ 25.350472] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b6), lun(0) [Thu Dec 12 07:46:48 2019][ 25.357125] scsi 1:0:209:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:48 2019][ 25.365515] scsi 1:0:209:0: SSP: handle(0x00b6), sas_addr(0x5000cca252543bcd), phy(24), device_name(0x5000cca252543bcf) [Thu Dec 12 07:46:48 2019][ 25.376289] scsi 1:0:209:0: enclosure logical 
id(0x5000ccab0405db00), slot(33) [Thu Dec 12 07:46:48 2019][ 25.383595] scsi 1:0:209:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:48 2019][ 25.390488] scsi 1:0:209:0: serial_number( 7SHHA2UG) [Thu Dec 12 07:46:48 2019][ 25.396063] scsi 1:0:209:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:48 2019][ 25.415906] mpt3sas_cm0: detecting: handle(0x00b7), sas_address(0x5000cca252551265), phy(25) [Thu Dec 12 07:46:48 2019][ 25.424345] mpt3sas_cm0: REPORT_LUNS: handle(0x00b7), retries(0) [Thu Dec 12 07:46:48 2019][ 25.430494] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b7), lun(0) [Thu Dec 12 07:46:48 2019][ 25.437143] scsi 1:0:210:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:48 2019][ 25.445542] scsi 1:0:210:0: SSP: handle(0x00b7), sas_addr(0x5000cca252551265), phy(25), device_name(0x5000cca252551267) [Thu Dec 12 07:46:48 2019][ 25.456318] scsi 1:0:210:0: enclosure logical id(0x5000ccab0405db00), slot(34) [Thu Dec 12 07:46:48 2019][ 25.463624] scsi 1:0:210:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:48 2019][ 25.470516] scsi 1:0:210:0: serial_number( 7SHHTBVG) [Thu Dec 12 07:46:48 2019][ 25.476091] scsi 1:0:210:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:48 2019][ 25.498904] mpt3sas_cm0: detecting: handle(0x00b8), sas_address(0x5000cca252555fc9), phy(26) [Thu Dec 12 07:46:48 2019][ 25.507346] mpt3sas_cm0: REPORT_LUNS: handle(0x00b8), retries(0) [Thu Dec 12 07:46:48 2019][ 25.513494] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b8), lun(0) [Thu Dec 12 07:46:48 2019][ 25.520219] scsi 1:0:211:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:48 2019][ 25.528615] scsi 1:0:211:0: SSP: handle(0x00b8), sas_addr(0x5000cca252555fc9), phy(26), device_name(0x5000cca252555fcb) [Thu Dec 12 07:46:48 2019][ 25.539388] scsi 1:0:211:0: enclosure logical id(0x5000ccab0405db00), slot(35) [Thu Dec 12 07:46:48 2019][ 25.546695] scsi 1:0:211:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:48 2019][ 25.553589] scsi 1:0:211:0: serial_number( 7SHHYJMG) [Thu Dec 12 07:46:48 2019][ 25.559163] scsi 1:0:211:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:48 2019][ 25.590907] mpt3sas_cm0: detecting: handle(0x00b9), sas_address(0x5000cca252559f7d), phy(27) [Thu Dec 12 07:46:48 2019][ 25.599341] mpt3sas_cm0: REPORT_LUNS: handle(0x00b9), retries(0) [Thu Dec 12 07:46:48 2019][ 25.605482] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b9), lun(0) [Thu Dec 12 07:46:48 2019][ 25.612142] scsi 1:0:212:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:48 2019][ 25.620540] scsi 1:0:212:0: SSP: handle(0x00b9), sas_addr(0x5000cca252559f7d), phy(27), device_name(0x5000cca252559f7f) [Thu Dec 12 07:46:48 2019][ 25.631309] scsi 1:0:212:0: enclosure logical id(0x5000ccab0405db00), slot(36) [Thu Dec 12 07:46:48 2019][ 25.638615] scsi 1:0:212:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:48 2019][ 25.645506] scsi 1:0:212:0: serial_number( 7SHJ2T4G) [Thu Dec 12 07:46:48 2019][ 25.651080] scsi 1:0:212:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:48 2019][ 25.671909] mpt3sas_cm0: detecting: handle(0x00ba), sas_address(0x5000cca26c244bcd), phy(28) [Thu Dec 12 07:46:48 2019][ 25.680350] mpt3sas_cm0: REPORT_LUNS: handle(0x00ba), retries(0) [Thu Dec 12 07:46:48 2019][ 25.686482] mpt3sas_cm0: TEST_UNIT_READY: 
handle(0x00ba), lun(0) [Thu Dec 12 07:46:48 2019][ 25.693143] scsi 1:0:213:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:48 2019][ 25.701536] scsi 1:0:213:0: SSP: handle(0x00ba), sas_addr(0x5000cca26c244bcd), phy(28), device_name(0x5000cca26c244bcf) [Thu Dec 12 07:46:48 2019][ 25.712307] scsi 1:0:213:0: enclosure logical id(0x5000ccab0405db00), slot(37) [Thu Dec 12 07:46:48 2019][ 25.719615] scsi 1:0:213:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:48 2019][ 25.726507] scsi 1:0:213:0: serial_number( 1DGMYU2Z) [Thu Dec 12 07:46:48 2019][ 25.732080] scsi 1:0:213:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:48 2019][ 25.751909] mpt3sas_cm0: detecting: handle(0x00bb), sas_address(0x5000cca26a2aa10d), phy(29) [Thu Dec 12 07:46:48 2019][ 25.760343] mpt3sas_cm0: REPORT_LUNS: handle(0x00bb), retries(0) [Thu Dec 12 07:46:48 2019][ 25.766476] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00bb), lun(0) [Thu Dec 12 07:46:48 2019][ 25.773131] scsi 1:0:214:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:48 2019][ 25.781520] scsi 1:0:214:0: SSP: handle(0x00bb), sas_addr(0x5000cca26a2aa10d), phy(29), device_name(0x5000cca26a2aa10f) [Thu Dec 12 07:46:48 2019][ 25.792293] scsi 1:0:214:0: enclosure logical id(0x5000ccab0405db00), slot(38) [Thu Dec 12 07:46:48 2019][ 25.799599] scsi 1:0:214:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:48 2019][ 25.806491] scsi 1:0:214:0: serial_number( 2TGSET5D) [Thu Dec 12 07:46:48 2019][ 25.812066] scsi 1:0:214:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:48 2019][ 25.834908] mpt3sas_cm0: detecting: handle(0x00bc), sas_address(0x5000cca25254e235), phy(30) [Thu Dec 12 07:46:48 2019][ 25.843347] mpt3sas_cm0: REPORT_LUNS: handle(0x00bc), retries(0) [Thu Dec 12 07:46:48 2019][ 25.849487] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00bc), lun(0) [Thu Dec 12 07:46:48 2019][ 25.856124] scsi 1:0:215:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:48 2019][ 25.864512] scsi 1:0:215:0: SSP: handle(0x00bc), sas_addr(0x5000cca25254e235), phy(30), device_name(0x5000cca25254e237) [Thu Dec 12 07:46:48 2019][ 25.875287] scsi 1:0:215:0: enclosure logical id(0x5000ccab0405db00), slot(39) [Thu Dec 12 07:46:48 2019][ 25.882593] scsi 1:0:215:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:48 2019][ 25.889486] scsi 1:0:215:0: serial_number( 7SHHP5BG) [Thu Dec 12 07:46:48 2019][ 25.895058] scsi 1:0:215:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:48 2019][ 25.914909] mpt3sas_cm0: detecting: handle(0x00bd), sas_address(0x5000cca25254df95), phy(31) [Thu Dec 12 07:46:48 2019][ 25.923348] mpt3sas_cm0: REPORT_LUNS: handle(0x00bd), retries(0) [Thu Dec 12 07:46:48 2019][ 25.929513] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00bd), lun(0) [Thu Dec 12 07:46:48 2019][ 25.936157] scsi 1:0:216:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:48 2019][ 25.944550] scsi 1:0:216:0: SSP: handle(0x00bd), sas_addr(0x5000cca25254df95), phy(31), device_name(0x5000cca25254df97) [Thu Dec 12 07:46:48 2019][ 25.955325] scsi 1:0:216:0: enclosure logical id(0x5000ccab0405db00), slot(40) [Thu Dec 12 07:46:48 2019][ 25.962629] scsi 1:0:216:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:48 2019][ 25.969521] scsi 1:0:216:0: serial_number( 7SHHNZYG) [Thu Dec 12 07:46:48 2019][ 25.975096] scsi 1:0:216:0: 
qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:48 2019][ 25.997908] mpt3sas_cm0: detecting: handle(0x00be), sas_address(0x5000cca25254e9d1), phy(32) [Thu Dec 12 07:46:48 2019][ 26.006343] mpt3sas_cm0: REPORT_LUNS: handle(0x00be), retries(0) [Thu Dec 12 07:46:48 2019][ 26.012514] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00be), lun(0) [Thu Dec 12 07:46:48 2019][ 26.019171] scsi 1:0:217:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:48 2019][ 26.027561] scsi 1:0:217:0: SSP: handle(0x00be), sas_addr(0x5000cca25254e9d1), phy(32), device_name(0x5000cca25254e9d3) [Thu Dec 12 07:46:48 2019][ 26.038333] scsi 1:0:217:0: enclosure logical id(0x5000ccab0405db00), slot(41) [Thu Dec 12 07:46:48 2019][ 26.045638] scsi 1:0:217:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:48 2019][ 26.052534] scsi 1:0:217:0: serial_number( 7SHHPP2G) [Thu Dec 12 07:46:48 2019][ 26.058104] scsi 1:0:217:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:48 2019][ 26.077908] mpt3sas_cm0: detecting: handle(0x00bf), sas_address(0x5000cca26a240089), phy(33) [Thu Dec 12 07:46:48 2019][ 26.086344] mpt3sas_cm0: REPORT_LUNS: handle(0x00bf), retries(0) [Thu Dec 12 07:46:48 2019][ 26.092491] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00bf), lun(0) [Thu Dec 12 07:46:48 2019][ 26.099150] scsi 1:0:218:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:48 2019][ 26.107544] scsi 1:0:218:0: SSP: handle(0x00bf), sas_addr(0x5000cca26a240089), phy(33), device_name(0x5000cca26a24008b) [Thu Dec 12 07:46:48 2019][ 26.118318] scsi 1:0:218:0: enclosure logical id(0x5000ccab0405db00), slot(42) [Thu Dec 12 07:46:48 2019][ 26.125624] scsi 1:0:218:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:48 2019][ 26.132518] scsi 1:0:218:0: serial_number( 2TGMTTPD) [Thu Dec 12 07:46:48 2019][ 26.138089] scsi 1:0:218:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:48 2019][ 26.157909] mpt3sas_cm0: detecting: handle(0x00c0), sas_address(0x5000cca26a24b9e9), phy(34) [Thu Dec 12 07:46:48 2019][ 26.166346] mpt3sas_cm0: REPORT_LUNS: handle(0x00c0), retries(0) [Thu Dec 12 07:46:48 2019][ 26.172507] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c0), lun(0) [Thu Dec 12 07:46:48 2019][ 26.179150] scsi 1:0:219:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:48 2019][ 26.187538] scsi 1:0:219:0: SSP: handle(0x00c0), sas_addr(0x5000cca26a24b9e9), phy(34), device_name(0x5000cca26a24b9eb) [Thu Dec 12 07:46:48 2019][ 26.198314] scsi 1:0:219:0: enclosure logical id(0x5000ccab0405db00), slot(43) [Thu Dec 12 07:46:48 2019][ 26.205620] scsi 1:0:219:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:48 2019][ 26.212511] scsi 1:0:219:0: serial_number( 2TGN64DD) [Thu Dec 12 07:46:48 2019][ 26.218085] scsi 1:0:219:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:48 2019][ 26.237912] mpt3sas_cm0: detecting: handle(0x00c1), sas_address(0x5000cca26a25aed5), phy(35) [Thu Dec 12 07:46:48 2019][ 26.246349] mpt3sas_cm0: REPORT_LUNS: handle(0x00c1), retries(0) [Thu Dec 12 07:46:48 2019][ 26.252494] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c1), lun(0) [Thu Dec 12 07:46:48 2019][ 26.259185] scsi 1:0:220:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:48 2019][ 26.267587] scsi 1:0:220:0: SSP: handle(0x00c1), sas_addr(0x5000cca26a25aed5), phy(35), 
device_name(0x5000cca26a25aed7) [Thu Dec 12 07:46:48 2019][ 26.278361] scsi 1:0:220:0: enclosure logical id(0x5000ccab0405db00), slot(44) [Thu Dec 12 07:46:48 2019][ 26.285665] scsi 1:0:220:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:48 2019][ 26.292559] scsi 1:0:220:0: serial_number( 2TGNRG1D) [Thu Dec 12 07:46:49 2019][ 26.298132] scsi 1:0:220:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:49 2019][ 26.320913] mpt3sas_cm0: detecting: handle(0x00c2), sas_address(0x5000cca266d32b69), phy(36) [Thu Dec 12 07:46:49 2019][ 26.329351] mpt3sas_cm0: REPORT_LUNS: handle(0x00c2), retries(0) [Thu Dec 12 07:46:49 2019][ 26.335487] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c2), lun(0) [Thu Dec 12 07:46:49 2019][ 26.342139] scsi 1:0:221:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:49 2019][ 26.350535] scsi 1:0:221:0: SSP: handle(0x00c2), sas_addr(0x5000cca266d32b69), phy(36), device_name(0x5000cca266d32b6b) [Thu Dec 12 07:46:49 2019][ 26.361309] scsi 1:0:221:0: enclosure logical id(0x5000ccab0405db00), slot(45) [Thu Dec 12 07:46:49 2019][ 26.368614] scsi 1:0:221:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:49 2019][ 26.375506] scsi 1:0:221:0: serial_number( 7JKS46JK) [Thu Dec 12 07:46:49 2019][ 26.381082] scsi 1:0:221:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:49 2019][ 26.400913] mpt3sas_cm0: detecting: handle(0x00c3), sas_address(0x5000cca26b9bf885), phy(37) [Thu Dec 12 07:46:49 2019][ 26.409353] mpt3sas_cm0: REPORT_LUNS: handle(0x00c3), retries(0) [Thu Dec 12 07:46:49 2019][ 26.415493] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c3), lun(0) [Thu Dec 12 07:46:49 2019][ 26.422125] scsi 1:0:222:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:49 2019][ 26.430510] scsi 1:0:222:0: SSP: handle(0x00c3), sas_addr(0x5000cca26b9bf885), phy(37), device_name(0x5000cca26b9bf887) [Thu Dec 12 07:46:49 2019][ 26.441287] scsi 1:0:222:0: enclosure logical id(0x5000ccab0405db00), slot(46) [Thu Dec 12 07:46:49 2019][ 26.448591] scsi 1:0:222:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:49 2019][ 26.455481] scsi 1:0:222:0: serial_number( 1SJST42Z) [Thu Dec 12 07:46:49 2019][ 26.461059] scsi 1:0:222:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:49 2019][ 26.483915] mpt3sas_cm0: detecting: handle(0x00c4), sas_address(0x5000cca26b9b24c9), phy(38) [Thu Dec 12 07:46:49 2019][ 26.492353] mpt3sas_cm0: REPORT_LUNS: handle(0x00c4), retries(0) [Thu Dec 12 07:46:49 2019][ 26.498489] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c4), lun(0) [Thu Dec 12 07:46:49 2019][ 26.505117] scsi 1:0:223:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:49 2019][ 26.513504] scsi 1:0:223:0: SSP: handle(0x00c4), sas_addr(0x5000cca26b9b24c9), phy(38), device_name(0x5000cca26b9b24cb) [Thu Dec 12 07:46:49 2019][ 26.524278] scsi 1:0:223:0: enclosure logical id(0x5000ccab0405db00), slot(47) [Thu Dec 12 07:46:49 2019][ 26.531584] scsi 1:0:223:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:49 2019][ 26.538475] scsi 1:0:223:0: serial_number( 1SJSA0YZ) [Thu Dec 12 07:46:49 2019][ 26.544050] scsi 1:0:223:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:49 2019][ 26.569914] mpt3sas_cm0: detecting: handle(0x00c5), sas_address(0x5000cca26a21d741), phy(39) [Thu Dec 12 07:46:49 2019][ 26.578365] mpt3sas_cm0: 
REPORT_LUNS: handle(0x00c5), retries(0) [Thu Dec 12 07:46:49 2019][ 26.584510] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c5), lun(0) [Thu Dec 12 07:46:49 2019][ 26.591170] scsi 1:0:224:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:49 2019][ 26.599563] scsi 1:0:224:0: SSP: handle(0x00c5), sas_addr(0x5000cca26a21d741), phy(39), device_name(0x5000cca26a21d743) [Thu Dec 12 07:46:49 2019][ 26.610339] scsi 1:0:224:0: enclosure logical id(0x5000ccab0405db00), slot(48) [Thu Dec 12 07:46:49 2019][ 26.617645] scsi 1:0:224:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:49 2019][ 26.624539] scsi 1:0:224:0: serial_number( 2TGLLYED) [Thu Dec 12 07:46:49 2019][ 26.630111] scsi 1:0:224:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:49 2019][ 26.649912] mpt3sas_cm0: detecting: handle(0x00c6), sas_address(0x5000cca26a27af5d), phy(40) [Thu Dec 12 07:46:49 2019][ 26.658350] mpt3sas_cm0: REPORT_LUNS: handle(0x00c6), retries(0) [Thu Dec 12 07:46:49 2019][ 26.664492] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c6), lun(0) [Thu Dec 12 07:46:49 2019][ 26.671147] scsi 1:0:225:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:49 2019][ 26.679537] scsi 1:0:225:0: SSP: handle(0x00c6), sas_addr(0x5000cca26a27af5d), phy(40), device_name(0x5000cca26a27af5f) [Thu Dec 12 07:46:49 2019][ 26.690308] scsi 1:0:225:0: enclosure logical id(0x5000ccab0405db00), slot(49) [Thu Dec 12 07:46:49 2019][ 26.697614] scsi 1:0:225:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:49 2019][ 26.704506] scsi 1:0:225:0: serial_number( 2TGPUL5D) [Thu Dec 12 07:46:49 2019][ 26.710080] scsi 1:0:225:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:49 2019][ 26.732917] mpt3sas_cm0: detecting: handle(0x00c7), sas_address(0x5000cca2525552e5), phy(41) [Thu Dec 12 07:46:49 2019][ 26.741349] mpt3sas_cm0: REPORT_LUNS: handle(0x00c7), retries(0) [Thu Dec 12 07:46:49 2019][ 26.747488] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c7), lun(0) [Thu Dec 12 07:46:49 2019][ 26.754164] scsi 1:0:226:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:49 2019][ 26.762557] scsi 1:0:226:0: SSP: handle(0x00c7), sas_addr(0x5000cca2525552e5), phy(41), device_name(0x5000cca2525552e7) [Thu Dec 12 07:46:49 2019][ 26.773326] scsi 1:0:226:0: enclosure logical id(0x5000ccab0405db00), slot(50) [Thu Dec 12 07:46:49 2019][ 26.780633] scsi 1:0:226:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:49 2019][ 26.787524] scsi 1:0:226:0: serial_number( 7SHHXP0G) [Thu Dec 12 07:46:49 2019][ 26.793098] scsi 1:0:226:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:49 2019][ 26.812921] mpt3sas_cm0: detecting: handle(0x00c8), sas_address(0x5000cca26a26dff1), phy(42) [Thu Dec 12 07:46:49 2019][ 26.821364] mpt3sas_cm0: REPORT_LUNS: handle(0x00c8), retries(0) [Thu Dec 12 07:46:49 2019][ 26.827508] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c8), lun(0) [Thu Dec 12 07:46:49 2019][ 26.834160] scsi 1:0:227:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:49 2019][ 26.842549] scsi 1:0:227:0: SSP: handle(0x00c8), sas_addr(0x5000cca26a26dff1), phy(42), device_name(0x5000cca26a26dff3) [Thu Dec 12 07:46:49 2019][ 26.853319] scsi 1:0:227:0: enclosure logical id(0x5000ccab0405db00), slot(51) [Thu Dec 12 07:46:49 2019][ 26.860625] scsi 1:0:227:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:49 2019][ 
26.867518] scsi 1:0:227:0: serial_number( 2TGPBSYD) [Thu Dec 12 07:46:49 2019][ 26.873092] scsi 1:0:227:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:49 2019][ 26.892919] mpt3sas_cm0: detecting: handle(0x00c9), sas_address(0x5000cca26b9c5d51), phy(43) [Thu Dec 12 07:46:49 2019][ 26.901355] mpt3sas_cm0: REPORT_LUNS: handle(0x00c9), retries(0) [Thu Dec 12 07:46:49 2019][ 26.907495] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c9), lun(0) [Thu Dec 12 07:46:49 2019][ 26.914172] scsi 1:0:228:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:49 2019][ 26.922569] scsi 1:0:228:0: SSP: handle(0x00c9), sas_addr(0x5000cca26b9c5d51), phy(43), device_name(0x5000cca26b9c5d53) [Thu Dec 12 07:46:49 2019][ 26.933340] scsi 1:0:228:0: enclosure logical id(0x5000ccab0405db00), slot(52) [Thu Dec 12 07:46:49 2019][ 26.940645] scsi 1:0:228:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:49 2019][ 26.947540] scsi 1:0:228:0: serial_number( 1SJSZV5Z) [Thu Dec 12 07:46:49 2019][ 26.953112] scsi 1:0:228:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:49 2019][ 26.976513] mpt3sas_cm0: detecting: handle(0x00ca), sas_address(0x5000cca26b9602c5), phy(44) [Thu Dec 12 07:46:49 2019][ 26.984951] mpt3sas_cm0: REPORT_LUNS: handle(0x00ca), retries(0) [Thu Dec 12 07:46:49 2019][ 26.991097] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ca), lun(0) [Thu Dec 12 07:46:49 2019][ 26.997751] scsi 1:0:229:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:49 2019][ 27.006154] scsi 1:0:229:0: SSP: handle(0x00ca), sas_addr(0x5000cca26b9602c5), phy(44), device_name(0x5000cca26b9602c7) [Thu Dec 12 07:46:49 2019][ 27.016924] scsi 1:0:229:0: enclosure logical id(0x5000ccab0405db00), slot(53) [Thu Dec 12 07:46:49 2019][ 27.024228] scsi 1:0:229:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:49 2019][ 27.031121] scsi 1:0:229:0: serial_number( 1SJNHJ4Z) [Thu Dec 12 07:46:49 2019][ 27.036693] scsi 1:0:229:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:49 2019][ 27.056920] mpt3sas_cm0: detecting: handle(0x00cb), sas_address(0x5000cca252544a01), phy(45) [Thu Dec 12 07:46:49 2019][ 27.065357] mpt3sas_cm0: REPORT_LUNS: handle(0x00cb), retries(0) [Thu Dec 12 07:46:49 2019][ 27.071489] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00cb), lun(0) [Thu Dec 12 07:46:49 2019][ 27.078121] scsi 1:0:230:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:49 2019][ 27.086507] scsi 1:0:230:0: SSP: handle(0x00cb), sas_addr(0x5000cca252544a01), phy(45), device_name(0x5000cca252544a03) [Thu Dec 12 07:46:49 2019][ 27.097280] scsi 1:0:230:0: enclosure logical id(0x5000ccab0405db00), slot(54) [Thu Dec 12 07:46:49 2019][ 27.104587] scsi 1:0:230:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:49 2019][ 27.111479] scsi 1:0:230:0: serial_number( 7SHHB14G) [Thu Dec 12 07:46:49 2019][ 27.117053] scsi 1:0:230:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:49 2019][ 27.136919] mpt3sas_cm0: detecting: handle(0x00cc), sas_address(0x5000cca252559f9d), phy(46) [Thu Dec 12 07:46:49 2019][ 27.145359] mpt3sas_cm0: REPORT_LUNS: handle(0x00cc), retries(0) [Thu Dec 12 07:46:49 2019][ 27.151491] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00cc), lun(0) [Thu Dec 12 07:46:49 2019][ 27.158155] scsi 1:0:231:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:49 2019][ 
27.166545] scsi 1:0:231:0: SSP: handle(0x00cc), sas_addr(0x5000cca252559f9d), phy(46), device_name(0x5000cca252559f9f) [Thu Dec 12 07:46:49 2019][ 27.177319] scsi 1:0:231:0: enclosure logical id(0x5000ccab0405db00), slot(55) [Thu Dec 12 07:46:49 2019][ 27.184623] scsi 1:0:231:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:49 2019][ 27.191517] scsi 1:0:231:0: serial_number( 7SHJ2TDG) [Thu Dec 12 07:46:49 2019][ 27.197089] scsi 1:0:231:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:49 2019][ 27.220920] mpt3sas_cm0: detecting: handle(0x00cd), sas_address(0x5000cca25255571d), phy(47) [Thu Dec 12 07:46:49 2019][ 27.229357] mpt3sas_cm0: REPORT_LUNS: handle(0x00cd), retries(0) [Thu Dec 12 07:46:49 2019][ 27.235499] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00cd), lun(0) [Thu Dec 12 07:46:49 2019][ 27.242182] scsi 1:0:232:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:49 2019][ 27.250582] scsi 1:0:232:0: SSP: handle(0x00cd), sas_addr(0x5000cca25255571d), phy(47), device_name(0x5000cca25255571f) [Thu Dec 12 07:46:49 2019][ 27.261351] scsi 1:0:232:0: enclosure logical id(0x5000ccab0405db00), slot(56) [Thu Dec 12 07:46:49 2019][ 27.268656] scsi 1:0:232:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:49 2019][ 27.275552] scsi 1:0:232:0: serial_number( 7SHHXYRG) [Thu Dec 12 07:46:50 2019][ 27.281122] scsi 1:0:232:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:50 2019][ 27.303928] mpt3sas_cm0: detecting: handle(0x00ce), sas_address(0x5000cca26b9bf57d), phy(48) [Thu Dec 12 07:46:50 2019][ 27.312367] mpt3sas_cm0: REPORT_LUNS: handle(0x00ce), retries(0) [Thu Dec 12 07:46:50 2019][ 27.318506] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ce), lun(0) [Thu Dec 12 07:46:50 2019][ 27.325190] scsi 1:0:233:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:50 2019][ 27.333587] scsi 1:0:233:0: SSP: handle(0x00ce), sas_addr(0x5000cca26b9bf57d), phy(48), device_name(0x5000cca26b9bf57f) [Thu Dec 12 07:46:50 2019][ 27.344361] scsi 1:0:233:0: enclosure logical id(0x5000ccab0405db00), slot(57) [Thu Dec 12 07:46:50 2019][ 27.351667] scsi 1:0:233:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:50 2019][ 27.358559] scsi 1:0:233:0: serial_number( 1SJSSXUZ) [Thu Dec 12 07:46:50 2019][ 27.364132] scsi 1:0:233:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:50 2019][ 27.386927] mpt3sas_cm0: detecting: handle(0x00cf), sas_address(0x5000cca252555371), phy(49) [Thu Dec 12 07:46:50 2019][ 27.395361] mpt3sas_cm0: REPORT_LUNS: handle(0x00cf), retries(0) [Thu Dec 12 07:46:50 2019][ 27.401492] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00cf), lun(0) [Thu Dec 12 07:46:50 2019][ 27.408317] scsi 1:0:234:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:50 2019][ 27.416711] scsi 1:0:234:0: SSP: handle(0x00cf), sas_addr(0x5000cca252555371), phy(49), device_name(0x5000cca252555373) [Thu Dec 12 07:46:50 2019][ 27.427484] scsi 1:0:234:0: enclosure logical id(0x5000ccab0405db00), slot(58) [Thu Dec 12 07:46:50 2019][ 27.434790] scsi 1:0:234:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:50 2019][ 27.441684] scsi 1:0:234:0: serial_number( 7SHHXR4G) [Thu Dec 12 07:46:50 2019][ 27.447256] scsi 1:0:234:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:50 2019][ 27.469921] mpt3sas_cm0: detecting: handle(0x00d0), 
sas_address(0x5000cca25253eefd), phy(50) [Thu Dec 12 07:46:50 2019][ 27.478362] mpt3sas_cm0: REPORT_LUNS: handle(0x00d0), retries(0) [Thu Dec 12 07:46:50 2019][ 27.484494] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d0), lun(0) [Thu Dec 12 07:46:50 2019][ 27.491191] scsi 1:0:235:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:50 2019][ 27.499579] scsi 1:0:235:0: SSP: handle(0x00d0), sas_addr(0x5000cca25253eefd), phy(50), device_name(0x5000cca25253eeff) [Thu Dec 12 07:46:50 2019][ 27.510356] scsi 1:0:235:0: enclosure logical id(0x5000ccab0405db00), slot(59) [Thu Dec 12 07:46:50 2019][ 27.517660] scsi 1:0:235:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:50 2019][ 27.524554] scsi 1:0:235:0: serial_number( 7SHH4Z7G) [Thu Dec 12 07:46:50 2019][ 27.530128] scsi 1:0:235:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:50 2019][ 27.552401] mpt3sas_cm0: expander_add: handle(0x009c), parent(0x0099), sas_addr(0x5000ccab0405db7f), phys(68) [Thu Dec 12 07:46:50 2019][ 27.574408] mpt3sas_cm0: detecting: handle(0x00d1), sas_address(0x5000cca26b9cbb05), phy(42) [Thu Dec 12 07:46:50 2019][ 27.582848] mpt3sas_cm0: REPORT_LUNS: handle(0x00d1), retries(0) [Thu Dec 12 07:46:50 2019][ 27.588998] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d1), lun(0) [Thu Dec 12 07:46:50 2019][ 27.595681] scsi 1:0:236:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:50 2019][ 27.604072] scsi 1:0:236:0: SSP: handle(0x00d1), sas_addr(0x5000cca26b9cbb05), phy(42), device_name(0x5000cca26b9cbb07) [Thu Dec 12 07:46:50 2019][ 27.614843] scsi 1:0:236:0: enclosure logical id(0x5000ccab0405db00), slot(1) [Thu Dec 12 07:46:50 2019][ 27.622060] scsi 1:0:236:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:50 2019][ 27.628951] scsi 1:0:236:0: serial_number( 1SJT62MZ) [Thu Dec 12 07:46:50 2019][ 27.634525] scsi 1:0:236:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:50 2019][ 27.656926] mpt3sas_cm0: detecting: handle(0x00d2), sas_address(0x5000cca252544475), phy(43) [Thu Dec 12 07:46:50 2019][ 27.665365] mpt3sas_cm0: REPORT_LUNS: handle(0x00d2), retries(0) [Thu Dec 12 07:46:50 2019][ 27.671523] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d2), lun(0) [Thu Dec 12 07:46:50 2019][ 27.678188] scsi 1:0:237:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:50 2019][ 27.686575] scsi 1:0:237:0: SSP: handle(0x00d2), sas_addr(0x5000cca252544475), phy(43), device_name(0x5000cca252544477) [Thu Dec 12 07:46:50 2019][ 27.697348] scsi 1:0:237:0: enclosure logical id(0x5000ccab0405db00), slot(3) [Thu Dec 12 07:46:50 2019][ 27.704569] scsi 1:0:237:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:50 2019][ 27.711460] scsi 1:0:237:0: serial_number( 7SHHANPG) [Thu Dec 12 07:46:50 2019][ 27.717035] scsi 1:0:237:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:50 2019][ 27.743514] mpt3sas_cm0: detecting: handle(0x00d3), sas_address(0x5000cca26a26173d), phy(44) [Thu Dec 12 07:46:50 2019][ 27.751955] mpt3sas_cm0: REPORT_LUNS: handle(0x00d3), retries(0) [Thu Dec 12 07:46:50 2019][ 27.758124] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d3), lun(0) [Thu Dec 12 07:46:50 2019][ 27.764772] scsi 1:0:238:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:50 2019][ 27.773168] scsi 1:0:238:0: SSP: handle(0x00d3), sas_addr(0x5000cca26a26173d), phy(44), device_name(0x5000cca26a26173f) [Thu Dec 12 
07:46:50 2019][ 27.783938] scsi 1:0:238:0: enclosure logical id(0x5000ccab0405db00), slot(4) [Thu Dec 12 07:46:50 2019][ 27.791157] scsi 1:0:238:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:50 2019][ 27.798050] scsi 1:0:238:0: serial_number( 2TGNYDLD) [Thu Dec 12 07:46:50 2019][ 27.803622] scsi 1:0:238:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:50 2019][ 27.823958] mpt3sas_cm0: detecting: handle(0x00d4), sas_address(0x5000cca252544cb5), phy(45) [Thu Dec 12 07:46:50 2019][ 27.832399] mpt3sas_cm0: REPORT_LUNS: handle(0x00d4), retries(0) [Thu Dec 12 07:46:50 2019][ 27.838572] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d4), lun(0) [Thu Dec 12 07:46:50 2019][ 27.845265] scsi 1:0:239:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:50 2019][ 27.853655] scsi 1:0:239:0: SSP: handle(0x00d4), sas_addr(0x5000cca252544cb5), phy(45), device_name(0x5000cca252544cb7) [Thu Dec 12 07:46:50 2019][ 27.864428] scsi 1:0:239:0: enclosure logical id(0x5000ccab0405db00), slot(5) [Thu Dec 12 07:46:50 2019][ 27.871645] scsi 1:0:239:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:50 2019][ 27.878539] scsi 1:0:239:0: serial_number( 7SHHB6RG) [Thu Dec 12 07:46:50 2019][ 27.884111] scsi 1:0:239:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:50 2019][ 27.903928] mpt3sas_cm0: detecting: handle(0x00d5), sas_address(0x5000cca26c238691), phy(46) [Thu Dec 12 07:46:50 2019][ 27.912367] mpt3sas_cm0: REPORT_LUNS: handle(0x00d5), retries(0) [Thu Dec 12 07:46:50 2019][ 27.918505] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d5), lun(0) [Thu Dec 12 07:46:50 2019][ 27.925179] scsi 1:0:240:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:50 2019][ 27.933576] scsi 1:0:240:0: SSP: handle(0x00d5), sas_addr(0x5000cca26c238691), phy(46), device_name(0x5000cca26c238693) [Thu Dec 12 07:46:50 2019][ 27.944351] scsi 1:0:240:0: enclosure logical id(0x5000ccab0405db00), slot(6) [Thu Dec 12 07:46:50 2019][ 27.951571] scsi 1:0:240:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:50 2019][ 27.958464] scsi 1:0:240:0: serial_number( 1DGMJNWZ) [Thu Dec 12 07:46:50 2019][ 27.964038] scsi 1:0:240:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:50 2019][ 27.986929] mpt3sas_cm0: detecting: handle(0x00d6), sas_address(0x5000cca26a2ac969), phy(47) [Thu Dec 12 07:46:50 2019][ 27.995370] mpt3sas_cm0: REPORT_LUNS: handle(0x00d6), retries(0) [Thu Dec 12 07:46:50 2019][ 28.001511] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d6), lun(0) [Thu Dec 12 07:46:50 2019][ 28.008166] scsi 1:0:241:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:50 2019][ 28.016553] scsi 1:0:241:0: SSP: handle(0x00d6), sas_addr(0x5000cca26a2ac969), phy(47), device_name(0x5000cca26a2ac96b) [Thu Dec 12 07:46:50 2019][ 28.027327] scsi 1:0:241:0: enclosure logical id(0x5000ccab0405db00), slot(7) [Thu Dec 12 07:46:50 2019][ 28.034547] scsi 1:0:241:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:50 2019][ 28.041439] scsi 1:0:241:0: serial_number( 2TGSJGHD) [Thu Dec 12 07:46:50 2019][ 28.047013] scsi 1:0:241:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:50 2019][ 28.066925] mpt3sas_cm0: detecting: handle(0x00d7), sas_address(0x5000cca25253e619), phy(48) [Thu Dec 12 07:46:50 2019][ 28.075362] mpt3sas_cm0: REPORT_LUNS: handle(0x00d7), retries(0) [Thu Dec 12 
07:46:50 2019][ 28.081488] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d7), lun(0) [Thu Dec 12 07:46:50 2019][ 28.088172] scsi 1:0:242:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:50 2019][ 28.096568] scsi 1:0:242:0: SSP: handle(0x00d7), sas_addr(0x5000cca25253e619), phy(48), device_name(0x5000cca25253e61b) [Thu Dec 12 07:46:50 2019][ 28.107339] scsi 1:0:242:0: enclosure logical id(0x5000ccab0405db00), slot(8) [Thu Dec 12 07:46:50 2019][ 28.114557] scsi 1:0:242:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:50 2019][ 28.121450] scsi 1:0:242:0: serial_number( 7SHH4BWG) [Thu Dec 12 07:46:50 2019][ 28.127022] scsi 1:0:242:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:50 2019][ 28.148930] mpt3sas_cm0: detecting: handle(0x00d8), sas_address(0x5000cca252542cfd), phy(49) [Thu Dec 12 07:46:50 2019][ 28.157372] mpt3sas_cm0: REPORT_LUNS: handle(0x00d8), retries(0) [Thu Dec 12 07:46:50 2019][ 28.163505] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d8), lun(0) [Thu Dec 12 07:46:50 2019][ 28.170170] scsi 1:0:243:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:50 2019][ 28.178562] scsi 1:0:243:0: SSP: handle(0x00d8), sas_addr(0x5000cca252542cfd), phy(49), device_name(0x5000cca252542cff) [Thu Dec 12 07:46:50 2019][ 28.189333] scsi 1:0:243:0: enclosure logical id(0x5000ccab0405db00), slot(9) [Thu Dec 12 07:46:50 2019][ 28.196554] scsi 1:0:243:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:50 2019][ 28.203445] scsi 1:0:243:0: serial_number( 7SHH937G) [Thu Dec 12 07:46:50 2019][ 28.209020] scsi 1:0:243:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:50 2019][ 28.228931] mpt3sas_cm0: detecting: handle(0x00d9), sas_address(0x5000cca26a3181fd), phy(50) [Thu Dec 12 07:46:50 2019][ 28.237370] mpt3sas_cm0: REPORT_LUNS: handle(0x00d9), retries(0) [Thu Dec 12 07:46:50 2019][ 28.243497] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d9), lun(0) [Thu Dec 12 07:46:50 2019][ 28.250153] scsi 1:0:244:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Thu Dec 12 07:46:50 2019][ 28.258537] scsi 1:0:244:0: SSP: handle(0x00d9), sas_addr(0x5000cca26a3181fd), phy(50), device_name(0x5000cca26a3181ff) [Thu Dec 12 07:46:50 2019][ 28.269310] scsi 1:0:244:0: enclosure logical id(0x5000ccab0405db00), slot(10) [Thu Dec 12 07:46:50 2019][ 28.276616] scsi 1:0:244:0: enclosure level(0x0000), connector name( C0 ) [Thu Dec 12 07:46:51 2019][ 28.283509] scsi 1:0:244:0: serial_number( 2TGW71ND) [Thu Dec 12 07:46:51 2019][ 28.289083] scsi 1:0:244:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Thu Dec 12 07:46:51 2019][ 28.324739] mpt3sas_cm0: port enable: SUCCESS [Thu Dec 12 07:46:51 2019][ 28.329929] sd 1:0:2:0: [sdb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.337780] sd 1:0:2:0: [sdb] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.337781] sd 1:0:3:0: [sdc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.337783] sd 1:0:3:0: [sdc] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.350908] sd 1:0:4:0: [sdd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.350915] sd 1:0:4:0: [sdd] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364119] sd 1:0:10:0: [sdj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364121] sd 1:0:13:0: [sdm] 15628053168 512-byte 
logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364123] sd 1:0:13:0: [sdm] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364124] sd 1:0:16:0: [sdp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364126] sd 1:0:10:0: [sdj] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364127] sd 1:0:16:0: [sdp] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364138] sd 1:0:17:0: [sdq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364140] sd 1:0:17:0: [sdq] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364142] sd 1:0:14:0: [sdn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364143] sd 1:0:7:0: [sdg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364144] sd 1:0:14:0: [sdn] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364145] sd 1:0:7:0: [sdg] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364150] sd 1:0:28:0: [sdab] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364152] sd 1:0:28:0: [sdab] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364153] sd 1:0:19:0: [sds] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364156] sd 1:0:19:0: [sds] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364160] sd 1:0:15:0: [sdo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364161] sd 1:0:21:0: [sdu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364162] sd 1:0:15:0: [sdo] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364163] sd 1:0:11:0: [sdk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364163] sd 1:0:21:0: [sdu] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364164] sd 1:0:29:0: [sdac] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364165] sd 1:0:11:0: [sdk] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364167] sd 1:0:30:0: [sdad] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364169] sd 1:0:29:0: [sdac] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364170] sd 1:0:30:0: [sdad] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364172] sd 1:0:8:0: [sdh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364173] sd 1:0:27:0: [sdaa] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364174] sd 1:0:8:0: [sdh] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364175] sd 1:0:27:0: [sdaa] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364178] sd 1:0:31:0: [sdae] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364179] sd 1:0:22:0: [sdv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364181] sd 1:0:31:0: [sdae] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364181] sd 1:0:22:0: [sdv] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364186] sd 1:0:20:0: [sdt] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364187] sd 1:0:20:0: [sdt] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364189] sd 1:0:5:0: [sde] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364191] sd 1:0:5:0: [sde] 
4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364197] sd 1:0:9:0: [sdi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364198] sd 1:0:18:0: [sdr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364199] sd 1:0:9:0: [sdi] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364200] sd 1:0:18:0: [sdr] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364206] sd 1:0:26:0: [sdz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364208] sd 1:0:26:0: [sdz] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364209] sd 1:0:23:0: [sdw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364210] sd 1:0:23:0: [sdw] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364214] sd 1:0:25:0: [sdy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364215] sd 1:0:25:0: [sdy] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364223] sd 1:0:6:0: [sdf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364224] sd 1:0:6:0: [sdf] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364225] sd 1:0:12:0: [sdl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364227] sd 1:0:12:0: [sdl] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.364235] sd 1:0:24:0: [sdx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.364236] sd 1:0:24:0: [sdx] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.377464] sd 1:0:34:0: [sdah] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.377467] sd 1:0:34:0: [sdah] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.377469] sd 1:0:35:0: [sdai] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.377471] sd 1:0:35:0: [sdai] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.377532] sd 1:0:32:0: [sdaf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.377534] sd 1:0:32:0: [sdaf] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.377554] sd 1:0:33:0: [sdag] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.377557] sd 1:0:33:0: [sdag] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.385718] sd 1:0:50:0: [sdax] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.385722] sd 1:0:36:0: [sdaj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.385724] sd 1:0:50:0: [sdax] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.385725] sd 1:0:36:0: [sdaj] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.385744] sd 1:0:47:0: [sdau] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.385745] sd 1:0:38:0: [sdal] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.385746] sd 1:0:47:0: [sdau] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.385751] sd 1:0:42:0: [sdap] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.385752] sd 1:0:42:0: [sdap] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.385753] sd 1:0:38:0: [sdal] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.385757] sd 1:0:44:0: [sdar] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 
28.385759] sd 1:0:44:0: [sdar] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.385765] sd 1:0:49:0: [sdaw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.385765] sd 1:0:46:0: [sdat] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.385766] sd 1:0:49:0: [sdaw] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.385767] sd 1:0:40:0: [sdan] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.385768] sd 1:0:46:0: [sdat] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.385769] sd 1:0:40:0: [sdan] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.385776] sd 1:0:39:0: [sdam] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.385777] sd 1:0:39:0: [sdam] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.385779] sd 1:0:45:0: [sdas] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.385780] sd 1:0:45:0: [sdas] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.385782] sd 1:0:43:0: [sdaq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.385784] sd 1:0:43:0: [sdaq] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.385789] sd 1:0:41:0: [sdao] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.385790] sd 1:0:41:0: [sdao] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.391234] sd 1:0:53:0: [sdba] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.391237] sd 1:0:53:0: [sdba] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.391288] sd 1:0:57:0: [sdbe] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.391290] sd 1:0:55:0: [sdbc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.391291] sd 1:0:57:0: [sdbe] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.391292] sd 1:0:55:0: [sdbc] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.391303] sd 1:0:3:0: [sdc] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.391323] sd 1:0:54:0: [sdbb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.391324] sd 1:0:54:0: [sdbb] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.391348] sd 1:0:56:0: [sdbd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.391350] sd 1:0:56:0: [sdbd] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.391386] sd 1:0:51:0: [sday] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.391388] sd 1:0:51:0: [sday] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.391397] sd 1:0:52:0: [sdaz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.391399] sd 1:0:52:0: [sdaz] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.396752] sd 1:0:58:0: [sdbf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.396754] sd 1:0:58:0: [sdbf] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.399549] sd 1:0:4:0: [sdd] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.399592] sd 1:0:48:0: [sdav] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.399594] sd 1:0:48:0: [sdav] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.402752] sd 1:0:59:0: [sdbg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 
2019][ 28.402754] sd 1:0:59:0: [sdbg] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.404993] sd 1:0:3:0: [sdc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:51 2019][ 28.404998] sd 1:0:28:0: [sdab] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405001] sd 1:0:26:0: [sdz] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405003] sd 1:0:17:0: [sdq] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405010] sd 1:0:19:0: [sds] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405019] sd 1:0:60:0: [sdbh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.405020] sd 1:0:60:0: [sdbh] 4096-byte physical blocks [Thu Dec 12 07:46:51 2019][ 28.405026] sd 1:0:16:0: [sdp] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405027] sd 1:0:15:0: [sdo] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405036] sd 1:0:6:0: [sdf] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405038] sd 1:0:29:0: [sdac] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405042] sd 1:0:21:0: [sdu] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405046] sd 1:0:22:0: [sdv] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405050] sd 1:0:23:0: [sdw] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405055] sd 1:0:10:0: [sdj] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405061] sd 1:0:30:0: [sdad] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405071] sd 1:0:20:0: [sdt] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405073] sd 1:0:31:0: [sdae] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405076] sd 1:0:8:0: [sdh] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405083] sd 1:0:11:0: [sdk] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405093] sd 1:0:18:0: [sdr] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405098] sd 1:0:7:0: [sdg] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405103] sd 1:0:27:0: [sdaa] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405113] sd 1:0:25:0: [sdy] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405131] sd 1:0:61:0: [sdbi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:51 2019][ 28.405132] sd 1:0:14:0: [sdn] Write Protect is off [Thu Dec 12 07:46:51 2019][ 28.405135] sd 1:0:61:0: [sdbi] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.405145] sd 1:0:24:0: [sdx] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.405150] sd 1:0:5:0: [sde] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.405155] sd 1:0:12:0: [sdl] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.405158] sd 1:0:9:0: [sdi] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.406756] sd 1:0:13:0: [sdm] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.410562] sd 1:0:63:0: [sdbj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.410564] sd 1:0:63:0: [sdbj] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.410581] sd 1:0:4:0: [sdd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.410625] sd 1:0:34:0: [sdah] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.410640] sd 1:0:64:0: [sdbk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.410642] sd 1:0:64:0: [sdbk] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.410688] sd 1:0:32:0: [sdaf] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.410733] sd 1:0:33:0: [sdag] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.418750] sd 1:0:28:0: [sdab] Write cache: 
enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418756] sd 1:0:70:0: [sdbq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.418757] sd 1:0:70:0: [sdbq] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.418774] sd 1:0:13:0: [sdm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418777] sd 1:0:10:0: [sdj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418779] sd 1:0:25:0: [sdy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418783] sd 1:0:27:0: [sdaa] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418787] sd 1:0:65:0: [sdbl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.418789] sd 1:0:65:0: [sdbl] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.418790] sd 1:0:29:0: [sdac] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418795] sd 1:0:35:0: [sdai] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.418795] sd 1:0:17:0: [sdq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418802] sd 1:0:39:0: [sdam] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.418803] sd 1:0:44:0: [sdar] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.418807] sd 1:0:43:0: [sdaq] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.418810] sd 1:0:41:0: [sdao] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.418817] sd 1:0:24:0: [sdx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418818] sd 1:0:5:0: [sde] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418819] sd 1:0:7:0: [sdg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418820] sd 1:0:69:0: [sdbp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.418822] sd 1:0:69:0: [sdbp] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.418823] sd 1:0:47:0: [sdau] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.418825] sd 1:0:68:0: [sdbo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.418826] sd 1:0:68:0: [sdbo] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.418828] sd 1:0:50:0: [sdax] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.418831] sd 1:0:11:0: [sdk] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418832] sd 1:0:36:0: [sdaj] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.418837] sd 1:0:22:0: [sdv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418839] sd 1:0:15:0: [sdo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418839] sd 1:0:46:0: [sdat] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.418853] sd 1:0:23:0: [sdw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418858] sd 1:0:12:0: [sdl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418859] sd 1:0:38:0: [sdal] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.418861] sd 1:0:49:0: [sdaw] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.418870] sd 1:0:6:0: [sdf] Write cache: enabled, read cache: enabled, supports DPO 
and FUA [Thu Dec 12 07:46:52 2019][ 28.418886] sd 1:0:21:0: [sdu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418894] sd 1:0:20:0: [sdt] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418896] sd 1:0:30:0: [sdad] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418902] sd 1:0:18:0: [sdr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418906] sd 1:0:67:0: [sdbn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.418907] sd 1:0:19:0: [sds] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418908] sd 1:0:67:0: [sdbn] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.418922] sd 1:0:42:0: [sdap] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.418931] sd 1:0:26:0: [sdz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418936] sd 1:0:8:0: [sdh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418941] sd 1:0:9:0: [sdi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418943] sd 1:0:16:0: [sdp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418959] sd 1:0:14:0: [sdn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418961] sd 1:0:45:0: [sdas] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.418973] sd 1:0:31:0: [sdae] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.418987] sd 1:0:71:0: [sdbr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.418988] sd 1:0:71:0: [sdbr] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.421753] sd 1:0:40:0: [sdan] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.424352] sd 1:0:72:0: [sdbs] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.424353] sd 1:0:72:0: [sdbs] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.424365] sd 1:0:53:0: [sdba] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.424390] sd 1:0:32:0: [sdaf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.424398] sd 1:0:58:0: [sdbf] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.424407] sd 1:0:54:0: [sdbb] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.424425] sd 1:0:52:0: [sdaz] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.424460] sd 1:0:34:0: [sdah] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.424461] sd 1:0:51:0: [sday] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.424488] sd 1:0:56:0: [sdbd] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.424533] sd 1:0:55:0: [sdbc] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.424540] sd 1:0:57:0: [sdbe] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.424551] sd 1:0:66:0: [sdbm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.424552] sd 1:0:66:0: [sdbm] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.425181] sd 1:0:74:0: [sdbu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.425185] sd 1:0:74:0: [sdbu] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432694] sd 1:0:73:0: [sdbt] 15628053168 
512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432696] sd 1:0:73:0: [sdbt] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432697] sd 1:0:33:0: [sdag] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.432702] sd 1:0:39:0: [sdam] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.432705] sd 1:0:41:0: [sdao] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.432721] sd 1:0:49:0: [sdaw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.432723] sd 1:0:47:0: [sdau] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.432732] sd 1:0:45:0: [sdas] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.432749] sd 1:0:50:0: [sdax] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.432750] sd 1:0:38:0: [sdal] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.432751] sd 1:0:42:0: [sdap] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.432786] sd 1:0:93:0: [sdcn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432787] sd 1:0:93:0: [sdcn] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432817] sd 1:0:84:0: [sdce] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432819] sd 1:0:84:0: [sdce] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432824] sd 1:0:48:0: [sdav] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.432830] sd 1:0:86:0: [sdcg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432831] sd 1:0:86:0: [sdcg] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432839] sd 1:0:77:0: [sdbx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432841] sd 1:0:77:0: [sdbx] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432851] sd 1:0:43:0: [sdaq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.432853] sd 1:0:80:0: [sdca] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432855] sd 1:0:80:0: [sdca] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432866] sd 1:0:89:0: [sdcj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432867] sd 1:0:94:0: [sdco] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432868] sd 1:0:89:0: [sdcj] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432869] sd 1:0:94:0: [sdco] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432878] sd 1:0:92:0: [sdcm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432879] sd 1:0:92:0: [sdcm] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432886] sd 1:0:78:0: [sdby] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432888] sd 1:0:35:0: [sdai] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.432890] sd 1:0:78:0: [sdby] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432899] sd 1:0:44:0: [sdar] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.432902] sd 1:0:81:0: [sdcb] 
15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432904] sd 1:0:40:0: [sdan] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.432906] sd 1:0:81:0: [sdcb] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432908] sd 1:0:46:0: [sdat] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.432919] sd 1:0:85:0: [sdcf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432921] sd 1:0:85:0: [sdcf] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432924] sd 1:0:75:0: [sdbv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432926] sd 1:0:75:0: [sdbv] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432932] sd 1:0:87:0: [sdch] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432934] sd 1:0:87:0: [sdch] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432946] sd 1:0:88:0: [sdci] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432947] sd 1:0:88:0: [sdci] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432959] sd 1:0:82:0: [sdcc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432960] sd 1:0:91:0: [sdcl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432960] sd 1:0:82:0: [sdcc] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432961] sd 1:0:91:0: [sdcl] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432970] sd 1:0:83:0: [sdcd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432974] sd 1:0:83:0: [sdcd] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.432977] sd 1:0:79:0: [sdbz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.432978] sd 1:0:79:0: [sdbz] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.433037] sd 1:0:36:0: [sdaj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:52 2019][ 28.434276] sd 1:0:59:0: [sdbg] Write Protect is off [Thu Dec 12 07:46:52 2019][ 28.434294] sd 1:0:76:0: [sdbw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.434295] sd 1:0:76:0: [sdbw] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.440939] sd 1:0:90:0: [sdck] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.440941] sd 1:0:90:0: [sdck] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.440942] sd 1:0:95:0: [sdcp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:52 2019][ 28.440944] sd 1:0:95:0: [sdcp] 4096-byte physical blocks [Thu Dec 12 07:46:52 2019][ 28.440948] sd 1:0:52:0: [sdaz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.440960] sd 1:0:58:0: [sdbf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.440969] sd 1:0:55:0: [sdbc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.440994] sd 1:0:60:0: [sdbh] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.441110] sd 1:0:57:0: [sdbe] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.441139] sd 1:0:51:0: [sday] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.441152] sd 1:0:56:0: [sdbd] Write 
cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.441844] sd 1:0:106:0: [sdda] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.441846] sd 1:0:106:0: [sdda] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.442993] sd 1:0:61:0: [sdbi] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.443440] sd 1:0:99:0: [sdct] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.443442] sd 1:0:99:0: [sdct] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.443451] sd 1:0:100:0: [sdcu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.443452] sd 1:0:100:0: [sdcu] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.443461] sd 1:0:101:0: [sdcv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.443462] sd 1:0:101:0: [sdcv] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.443470] sd 1:0:102:0: [sdcw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.443471] sd 1:0:102:0: [sdcw] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.443534] sd 1:0:104:0: [sdcy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.443535] sd 1:0:104:0: [sdcy] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.443542] sd 1:0:53:0: [sdba] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.443556] sd 1:0:97:0: [sdcr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.443557] sd 1:0:97:0: [sdcr] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.443607] sd 1:0:54:0: [sdbb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.446727] sd 1:0:96:0: [sdcq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446728] sd 1:0:96:0: [sdcq] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446831] sd 1:0:64:0: [sdbk] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.446857] sd 1:0:48:0: [sdav] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.446860] sd 1:0:120:0: [sddo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446862] sd 1:0:59:0: [sdbg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.446864] sd 1:0:120:0: [sddo] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446865] sd 1:0:121:0: [sddp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446866] sd 1:0:121:0: [sddp] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446875] sd 1:0:119:0: [sddn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446876] sd 1:0:119:0: [sddn] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446879] sd 1:0:63:0: [sdbj] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.446882] sd 1:0:130:0: [sddx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446884] sd 1:0:130:0: [sddx] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446885] sd 1:0:133:0: [sdea] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446886] sd 1:0:133:0: [sdea] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446891] sd 1:0:122:0: [sddq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 
28.446893] sd 1:0:122:0: [sddq] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446895] sd 1:0:134:0: [sdeb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446896] sd 1:0:135:0: [sdec] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446897] sd 1:0:134:0: [sdeb] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446897] sd 1:0:115:0: [sddj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446898] sd 1:0:135:0: [sdec] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446899] sd 1:0:115:0: [sddj] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446909] sd 1:0:117:0: [sddl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446910] sd 1:0:117:0: [sddl] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446912] sd 1:0:113:0: [sddh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446913] sd 1:0:113:0: [sddh] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446915] sd 1:0:107:0: [sddb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446917] sd 1:0:124:0: [sddr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446919] sd 1:0:107:0: [sddb] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446922] sd 1:0:124:0: [sddr] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446923] sd 1:0:126:0: [sddt] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446925] sd 1:0:126:0: [sddt] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446926] sd 1:0:114:0: [sddi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446927] sd 1:0:114:0: [sddi] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446931] sd 1:0:112:0: [sddg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446933] sd 1:0:108:0: [sddc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446935] sd 1:0:112:0: [sddg] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446937] sd 1:0:108:0: [sddc] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446939] sd 1:0:127:0: [sddu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446940] sd 1:0:136:0: [sded] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446941] sd 1:0:136:0: [sded] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446942] sd 1:0:127:0: [sddu] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446958] sd 1:0:128:0: [sddv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446959] sd 1:0:128:0: [sddv] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446969] sd 1:0:129:0: [sddw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446970] sd 1:0:129:0: [sddw] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446977] sd 1:0:132:0: [sddz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446978] sd 1:0:132:0: [sddz] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446988] sd 1:0:137:0: [sdee] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446989] sd 1:0:137:0: [sdee] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.446993] sd 1:0:118:0: [sddm] 
15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.446994] sd 1:0:118:0: [sddm] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.447004] sd 1:0:116:0: [sddk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.447006] sd 1:0:116:0: [sddk] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.447228] sd 1:0:109:0: [sddd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.447229] sd 1:0:109:0: [sddd] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.447263] sd 1:0:111:0: [sddf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.447265] sd 1:0:111:0: [sddf] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.447284] sd 1:0:110:0: [sdde] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.447286] sd 1:0:110:0: [sdde] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.447386] sd 1:0:105:0: [sdcz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.447388] sd 1:0:105:0: [sdcz] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.452617] sd 1:0:65:0: [sdbl] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.452628] sd 1:0:67:0: [sdbn] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.452629] sd 1:0:69:0: [sdbp] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.452650] sd 1:0:103:0: [sdcx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.452652] sd 1:0:103:0: [sdcx] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.452656] sd 1:0:68:0: [sdbo] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.452659] sd 1:0:70:0: [sdbq] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.452669] sd 1:0:60:0: [sdbh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.452685] sd 1:0:71:0: [sdbr] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.452756] sd 1:0:61:0: [sdbi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.452860] sd 1:0:138:0: [sdef] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.452862] sd 1:0:138:0: [sdef] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.452955] sd 1:0:139:0: [sdeg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.452957] sd 1:0:139:0: [sdeg] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.453765] sd 1:0:125:0: [sdds] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.453767] sd 1:0:125:0: [sdds] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.453778] sd 1:0:131:0: [sddy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.453779] sd 1:0:131:0: [sddy] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.461327] sd 1:0:72:0: [sdbs] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.461406] sd 1:0:63:0: [sdbj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.461591] sd 1:0:74:0: [sdbu] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.461606] sd 1:0:140:0: [sdeh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.461608] sd 1:0:140:0: [sdeh] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.462337] sd 1:0:64:0: [sdbk] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.467241] sd 1:0:73:0: [sdbt] 
Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467254] sd 1:0:77:0: [sdbx] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467277] sd 1:0:92:0: [sdcm] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467280] sd 1:0:78:0: [sdby] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467303] sd 1:0:75:0: [sdbv] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467318] sd 1:0:65:0: [sdbl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.467327] sd 1:0:93:0: [sdcn] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467346] sd 1:0:84:0: [sdce] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467360] sd 1:0:68:0: [sdbo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.467367] sd 1:0:69:0: [sdbp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.467379] sd 1:0:81:0: [sdcb] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467382] sd 1:0:67:0: [sdbn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.467391] sd 1:0:89:0: [sdcj] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467396] sd 1:0:80:0: [sdca] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467407] sd 1:0:82:0: [sdcc] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467451] sd 1:0:70:0: [sdbq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.467457] sd 1:0:71:0: [sdbr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.467475] sd 1:0:85:0: [sdcf] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467488] sd 1:0:88:0: [sdci] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467509] sd 1:0:87:0: [sdch] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467523] sd 1:0:83:0: [sdcd] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467545] sd 1:0:86:0: [sdcg] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467594] sd 1:0:91:0: [sdcl] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467598] sd 1:0:79:0: [sdbz] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.467611] sd 1:0:141:0: [sdei] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.467612] sd 1:0:141:0: [sdei] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.467647] sd 1:0:76:0: [sdbw] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.475808] sd 1:0:106:0: [sdda] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.475818] sd 1:0:97:0: [sdcr] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.475824] sd 1:0:95:0: [sdcp] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.475868] sd 1:0:104:0: [sdcy] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.475928] sd 1:0:72:0: [sdbs] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.475959] sd 1:0:100:0: [sdcu] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.475967] sd 1:0:102:0: [sdcw] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.476062] sd 1:0:74:0: [sdbu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:53 2019][ 28.476091] sd 1:0:143:0: [sdek] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:53 2019][ 28.476092] sd 1:0:143:0: [sdek] 4096-byte physical blocks [Thu Dec 12 07:46:53 2019][ 28.476152] sd 1:0:101:0: [sdcv] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.476209] sd 1:0:90:0: [sdck] Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.476236] sd 1:0:99:0: [sdct] 
Write Protect is off [Thu Dec 12 07:46:53 2019][ 28.476255] sd 1:0:142:0: [sdej] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.476257] sd 1:0:142:0: [sdej] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.481712] sd 1:0:122:0: [sddq] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481715] sd 1:0:96:0: [sdcq] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481726] sd 1:0:120:0: [sddo] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481727] sd 1:0:135:0: [sdec] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481733] sd 1:0:134:0: [sdeb] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481744] sd 1:0:133:0: [sdea] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481745] sd 1:0:119:0: [sddn] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481748] sd 1:0:107:0: [sddb] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481751] sd 1:0:117:0: [sddl] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481753] sd 1:0:73:0: [sdbt] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.481767] sd 1:0:113:0: [sddh] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481769] sd 1:0:144:0: [sdel] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.481774] sd 1:0:144:0: [sdel] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.481775] sd 1:0:115:0: [sddj] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481786] sd 1:0:118:0: [sddm] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481787] sd 1:0:126:0: [sddt] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481791] sd 1:0:121:0: [sddp] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481794] sd 1:0:124:0: [sddr] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481797] sd 1:0:136:0: [sded] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481800] sd 1:0:75:0: [sdbv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.481806] sd 1:0:114:0: [sddi] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481815] sd 1:0:112:0: [sddg] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481825] sd 1:0:111:0: [sddf] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481834] sd 1:0:127:0: [sddu] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481850] sd 1:0:92:0: [sdcm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.481856] sd 1:0:77:0: [sdbx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.481858] sd 1:0:116:0: [sddk] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481859] sd 1:0:128:0: [sddv] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481865] sd 1:0:89:0: [sdcj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.481869] sd 1:0:81:0: [sdcb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.481879] sd 1:0:78:0: [sdby] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.481885] sd 1:0:93:0: [sdcn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.481886] sd 1:0:82:0: [sdcc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.481890] sd 1:0:80:0: [sdca] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.481891] sd 1:0:108:0: [sddc] Write Protect is off [Thu Dec 12 07:46:54 2019][ 
28.481901] sd 1:0:132:0: [sddz] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481912] sd 1:0:109:0: [sddd] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481921] sd 1:0:137:0: [sdee] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.481927] sd 1:0:88:0: [sdci] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.481931] sd 1:0:84:0: [sdce] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.481940] sd 1:0:87:0: [sdch] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.482016] sd 1:0:129:0: [sddw] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.482022] sd 1:0:105:0: [sdcz] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.482044] sd 1:0:85:0: [sdcf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.482102] sd 1:0:79:0: [sdbz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.482105] sd 1:0:130:0: [sddx] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.482116] sd 1:0:83:0: [sdcd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.482180] sd 1:0:91:0: [sdcl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.482189] sd 1:0:86:0: [sdcg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.482196] sd 1:0:110:0: [sdde] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.482209] sd 1:0:66:0: [sdbm] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.483756] sd 1:0:76:0: [sdbw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.490292] sd 1:0:146:0: [sden] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490294] sd 1:0:146:0: [sden] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490297] sd 1:0:94:0: [sdco] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.490307] sd 1:0:103:0: [sdcx] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.490351] sd 1:0:147:0: [sdeo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490353] sd 1:0:147:0: [sdeo] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490383] sd 1:0:106:0: [sdda] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.490385] sd 1:0:97:0: [sdcr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.490394] sd 1:0:138:0: [sdef] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.490406] sd 1:0:95:0: [sdcp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.490427] sd 1:0:149:0: [sdeq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490429] sd 1:0:149:0: [sdeq] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490445] sd 1:0:150:0: [sder] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490447] sd 1:0:150:0: [sder] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490455] sd 1:0:131:0: [sddy] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.490464] sd 1:0:152:0: [sdet] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490465] sd 1:0:152:0: [sdet] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490484] sd 1:0:100:0: [sdcu] Write cache: enabled, read cache: enabled, supports DPO and FUA 
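The probe messages above give each data drive 15628053168 logical blocks of 512 bytes on a 4096-byte physical sector, which the sd driver prints as 8.00 TB (decimal) and 7.27 TiB (binary), alongside its write-protect and write-cache state. A minimal sketch, assuming a Linux host with the standard sysfs block attributes and using a device name from this log purely as a placeholder, that recomputes the same capacity figures:

#!/usr/bin/env python3
"""Recompute the sd capacity line, e.g.
"15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB)",
from sysfs. Assumes /sys/block/<dev>/size is expressed in 512-byte
units, as sysfs defines it, regardless of the logical block size."""
from pathlib import Path

def capacity_report(dev: str) -> str:
    base = Path("/sys/block") / dev
    sectors = int((base / "size").read_text())                        # 512-byte units
    logical = int((base / "queue" / "logical_block_size").read_text())
    physical = int((base / "queue" / "physical_block_size").read_text())
    total = sectors * 512                                             # capacity in bytes
    blocks = total // logical                                         # logical block count
    tb = int(total / 1e12 * 100) / 100        # truncate, matching "8.00 TB" above
    tib = int(total / 2**40 * 100) / 100      # truncate, matching "7.27 TiB" above
    return (f"[{dev}] {blocks} {logical}-byte logical blocks: "
            f"({tb:.2f} TB/{tib:.2f} TiB), {physical}-byte physical blocks")

if __name__ == "__main__":
    # 15628053168 * 512 B = 8,001,563,222,016 B, i.e. 8.00 TB or 7.27 TiB
    print(capacity_report("sdei"))            # placeholder device name from the log
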
[Thu Dec 12 07:46:54 2019][ 28.490485] sd 1:0:125:0: [sdds] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.490511] sd 1:0:104:0: [sdcy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.490526] sd 1:0:151:0: [sdes] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490527] sd 1:0:151:0: [sdes] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490533] sd 1:0:153:0: [sdeu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490534] sd 1:0:153:0: [sdeu] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490592] sd 1:0:155:0: [sdew] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490593] sd 1:0:155:0: [sdew] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490605] sd 1:0:158:0: [sdez] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490606] sd 1:0:158:0: [sdez] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490622] sd 1:0:145:0: [sdem] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490623] sd 1:0:145:0: [sdem] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490627] sd 1:0:170:0: [sdfl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490628] sd 1:0:170:0: [sdfl] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490639] sd 1:0:163:0: [sdfe] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490640] sd 1:0:163:0: [sdfe] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490645] sd 1:0:168:0: [sdfj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490646] sd 1:0:168:0: [sdfj] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490647] sd 1:0:139:0: [sdeg] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.490655] sd 1:0:171:0: [sdfm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490656] sd 1:0:171:0: [sdfm] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490705] sd 1:0:156:0: [sdex] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490707] sd 1:0:156:0: [sdex] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490730] sd 1:0:37:0: [sdak] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490731] sd 1:0:37:0: [sdak] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490746] sd 1:0:162:0: [sdfd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490748] sd 1:0:162:0: [sdfd] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490797] sd 1:0:90:0: [sdck] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.490802] sd 1:0:99:0: [sdct] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.490829] sd 1:0:102:0: [sdcw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.490844] sd 1:0:101:0: [sdcv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.490873] sd 1:0:159:0: [sdfa] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490875] sd 1:0:159:0: [sdfa] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490877] sd 1:0:160:0: [sdfb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 
28.490878] sd 1:0:160:0: [sdfb] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490888] sd 1:0:167:0: [sdfi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490889] sd 1:0:167:0: [sdfi] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.490915] sd 1:0:166:0: [sdfh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.490916] sd 1:0:166:0: [sdfh] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.491242] sd 1:0:157:0: [sdey] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.491243] sd 1:0:157:0: [sdey] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.491251] sd 1:0:161:0: [sdfc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.491252] sd 1:0:161:0: [sdfc] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.491261] sd 1:0:164:0: [sdff] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.491263] sd 1:0:164:0: [sdff] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.491271] sd 1:0:169:0: [sdfk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:54 2019][ 28.491273] sd 1:0:169:0: [sdfk] 4096-byte physical blocks [Thu Dec 12 07:46:54 2019][ 28.498986] sd 1:0:122:0: [sddq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499013] sd 1:0:96:0: [sdcq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499036] sd 1:0:134:0: [sdeb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499044] sd 1:0:107:0: [sddb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499047] sd 1:0:120:0: [sddo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499049] sd 1:0:135:0: [sdec] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499056] sd 1:0:133:0: [sdea] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499090] sd 1:0:113:0: [sddh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499093] sd 1:0:117:0: [sddl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499098] sd 1:0:140:0: [sdeh] Write Protect is off [Thu Dec 12 07:46:54 2019][ 28.499109] sd 1:0:118:0: [sddm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499113] sd 1:0:115:0: [sddj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499116] sd 1:0:126:0: [sddt] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499119] sd 1:0:136:0: [sded] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499138] sd 1:0:124:0: [sddr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499146] sd 1:0:121:0: [sddp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499148] sd 1:0:112:0: [sddg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499155] sd 1:0:116:0: [sddk] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499159] sd 1:0:127:0: [sddu] Write cache: enabled, read cache: 
enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499178] sd 1:0:119:0: [sddn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499182] sd 1:0:128:0: [sddv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499183] sd 1:0:108:0: [sddc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499190] sd 1:0:132:0: [sddz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:54 2019][ 28.499199] sd 1:0:114:0: [sddi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.499205] sd 1:0:137:0: [sdee] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.499219] sd 1:0:111:0: [sddf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.499230] sd 1:0:105:0: [sdcz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.499239] sd 1:0:109:0: [sddd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.499283] sd 1:0:129:0: [sddw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.499315] sd 1:0:165:0: [sdfg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.499316] sd 1:0:165:0: [sdfg] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.499348] sd 1:0:172:0: [sdfn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.499350] sd 1:0:172:0: [sdfn] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.499352] sd 1:0:175:0: [sdfq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.499354] sd 1:0:175:0: [sdfq] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.499357] sd 1:0:178:0: [sdft] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.499360] sd 1:0:178:0: [sdft] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.499371] sd 1:0:174:0: [sdfp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.499372] sd 1:0:174:0: [sdfp] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.499373] sd 1:0:177:0: [sdfs] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.499375] sd 1:0:177:0: [sdfs] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.499376] sd 1:0:181:0: [sdfw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.499377] sd 1:0:181:0: [sdfw] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.499417] sd 1:0:130:0: [sddx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.499479] sd 1:0:110:0: [sdde] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.499500] sd 1:0:66:0: [sdbm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.499530] sd 1:0:154:0: [sdev] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.499532] sd 1:0:154:0: [sdev] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.499558] sd 1:0:180:0: [sdfv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.499560] sd 1:0:180:0: [sdfv] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.499569] sd 1:0:185:0: [sdfz] 15628053168 512-byte logical 
blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.499570] sd 1:0:185:0: [sdfz] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.499573] sd 1:0:98:0: [sdcs] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.499576] sd 1:0:98:0: [sdcs] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.499584] sd 1:0:173:0: [sdfo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.499584] sd 1:0:176:0: [sdfr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.499585] sd 1:0:173:0: [sdfo] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.499586] sd 1:0:176:0: [sdfr] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.499587] sd 1:0:179:0: [sdfu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.499588] sd 1:0:179:0: [sdfu] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.499608] sd 1:0:183:0: [sdfy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.499612] sd 1:0:183:0: [sdfy] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.499619] sd 1:0:148:0: [sdep] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.499620] sd 1:0:148:0: [sdep] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.501759] sd 1:0:187:0: [sdgb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.501760] sd 1:0:187:0: [sdgb] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.505056] sd 1:0:188:0: [sdgc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.505058] sd 1:0:188:0: [sdgc] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.505062] sd 1:0:191:0: [sdgf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.505064] sd 1:0:191:0: [sdgf] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.505066] sd 1:0:190:0: [sdge] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.505068] sd 1:0:190:0: [sdge] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.505070] sd 1:0:192:0: [sdgg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.505072] sd 1:0:192:0: [sdgg] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.505073] sd 1:0:193:0: [sdgh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.505074] sd 1:0:193:0: [sdgh] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.505076] sd 1:0:186:0: [sdga] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.505078] sd 1:0:186:0: [sdga] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.505080] sd 1:0:189:0: [sdgd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.505082] sd 1:0:189:0: [sdgd] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.505099] sd 1:0:94:0: [sdco] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.505107] sd 1:0:103:0: [sdcx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.505182] sd 1:0:138:0: [sdef] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.505230] sd 1:0:131:0: [sddy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.505237] sd 1:0:141:0: [sdei] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.505239] 
sd 1:0:125:0: [sdds] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.505290] sd 1:0:139:0: [sdeg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.505543] sd 1:0:194:0: [sdgi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.505544] sd 1:0:194:0: [sdgi] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.505588] sd 1:0:195:0: [sdgj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.505590] sd 1:0:195:0: [sdgj] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.513748] sd 1:0:196:0: [sdgk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.513750] sd 1:0:196:0: [sdgk] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.513814] sd 1:0:182:0: [sdfx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.513816] sd 1:0:182:0: [sdfx] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.513863] sd 1:0:143:0: [sdek] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.513874] sd 1:0:140:0: [sdeh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.513946] sd 1:0:142:0: [sdej] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.519873] sd 1:0:197:0: [sdgl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.519875] sd 1:0:197:0: [sdgl] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.519898] sd 1:0:198:0: [sdgm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.519900] sd 1:0:198:0: [sdgm] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.519940] sd 1:0:144:0: [sdel] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.520042] sd 1:0:141:0: [sdei] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.528758] sd 1:0:146:0: [sden] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528763] sd 1:0:147:0: [sdeo] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528769] sd 1:0:149:0: [sdeq] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528784] sd 1:0:152:0: [sdet] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528785] sd 1:0:150:0: [sder] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528810] sd 1:0:158:0: [sdez] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528811] sd 1:0:151:0: [sdes] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528816] sd 1:0:37:0: [sdak] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528823] sd 1:0:155:0: [sdew] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528825] sd 1:0:145:0: [sdem] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528831] sd 1:0:153:0: [sdeu] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528834] sd 1:0:163:0: [sdfe] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528843] sd 1:0:162:0: [sdfd] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528847] sd 1:0:170:0: [sdfl] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528859] sd 1:0:160:0: [sdfb] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528872] sd 1:0:156:0: [sdex] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528873] sd 1:0:171:0: [sdfm] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528875] sd 1:0:168:0: [sdfj] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528884] sd 1:0:143:0: [sdek] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.528891] sd 1:0:169:0: [sdfk] 
Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528893] sd 1:0:164:0: [sdff] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528898] sd 1:0:142:0: [sdej] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:55 2019][ 28.528901] sd 1:0:167:0: [sdfi] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528909] sd 1:0:159:0: [sdfa] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528915] sd 1:0:166:0: [sdfh] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528919] sd 1:0:157:0: [sdey] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.528989] sd 1:0:231:0: [sdht] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.528991] sd 1:0:231:0: [sdht] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.528992] sd 1:0:232:0: [sdhu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.528994] sd 1:0:232:0: [sdhu] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529003] sd 1:0:239:0: [sdib] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529005] sd 1:0:239:0: [sdib] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529006] sd 1:0:244:0: [sdig] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529007] sd 1:0:244:0: [sdig] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529249] sd 1:0:161:0: [sdfc] Write Protect is off [Thu Dec 12 07:46:55 2019][ 28.529302] sd 1:0:202:0: [sdgq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529304] sd 1:0:202:0: [sdgq] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529309] sd 1:0:199:0: [sdgn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529311] sd 1:0:199:0: [sdgn] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529313] sd 1:0:221:0: [sdhj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529315] sd 1:0:221:0: [sdhj] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529320] sd 1:0:216:0: [sdhe] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529322] sd 1:0:216:0: [sdhe] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529349] sd 1:0:225:0: [sdhn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529350] sd 1:0:225:0: [sdhn] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529386] sd 1:0:217:0: [sdhf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529387] sd 1:0:217:0: [sdhf] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529395] sd 1:0:218:0: [sdhg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529397] sd 1:0:218:0: [sdhg] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529404] sd 1:0:220:0: [sdhi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529406] sd 1:0:220:0: [sdhi] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529415] sd 1:0:222:0: [sdhk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529416] sd 1:0:222:0: [sdhk] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529426] sd 1:0:223:0: [sdhl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529427] sd 1:0:223:0: [sdhl] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529431] sd 1:0:205:0: [sdgt] 15628053168 512-byte logical 
blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529433] sd 1:0:206:0: [sdgu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529434] sd 1:0:205:0: [sdgt] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529435] sd 1:0:206:0: [sdgu] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529438] sd 1:0:224:0: [sdhm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529440] sd 1:0:224:0: [sdhm] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529448] sd 1:0:209:0: [sdgx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529451] sd 1:0:243:0: [sdif] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529452] sd 1:0:209:0: [sdgx] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529454] sd 1:0:243:0: [sdif] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529458] sd 1:0:210:0: [sdgy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529460] sd 1:0:210:0: [sdgy] 4096-byte physical blocks [Thu Dec 12 07:46:55 2019][ 28.529461] sd 1:0:211:0: [sdgz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:55 2019][ 28.529463] sd 1:0:211:0: [sdgz] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.529466] sd 1:0:212:0: [sdha] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.529467] sd 1:0:212:0: [sdha] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.529472] sd 1:0:213:0: [sdhb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.529474] sd 1:0:213:0: [sdhb] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.529479] sd 1:0:215:0: [sdhd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.529481] sd 1:0:215:0: [sdhd] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.529484] sd 1:0:214:0: [sdhc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.529485] sd 1:0:214:0: [sdhc] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.529487] sd 1:0:234:0: [sdhw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.529488] sd 1:0:234:0: [sdhw] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.529489] sd 1:0:233:0: [sdhv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.529491] sd 1:0:233:0: [sdhv] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.529500] sd 1:0:240:0: [sdic] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.529502] sd 1:0:240:0: [sdic] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.529515] sd 1:0:242:0: [sdie] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.529516] sd 1:0:201:0: [sdgp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.529518] sd 1:0:242:0: [sdie] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.529520] sd 1:0:201:0: [sdgp] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.529563] sd 1:0:200:0: [sdgo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.529564] sd 1:0:200:0: [sdgo] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.529572] sd 1:0:203:0: [sdgr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.529573] sd 1:0:203:0: [sdgr] 4096-byte physical blocks [Thu 
Dec 12 07:46:56 2019][ 28.529580] sd 1:0:204:0: [sdgs] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.529581] sd 1:0:204:0: [sdgs] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.529588] sd 1:0:207:0: [sdgv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.529589] sd 1:0:207:0: [sdgv] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.529596] sd 1:0:208:0: [sdgw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.529597] sd 1:0:208:0: [sdgw] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.529604] sd 1:0:237:0: [sdhz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.529605] sd 1:0:237:0: [sdhz] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.535054] sd 1:0:181:0: [sdfw] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.535061] sd 1:0:177:0: [sdfs] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.535065] sd 1:0:172:0: [sdfn] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.535075] sd 1:0:178:0: [sdft] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.535083] sd 1:0:174:0: [sdfp] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.535092] sd 1:0:185:0: [sdfz] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.535096] sd 1:0:175:0: [sdfq] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.535099] sd 1:0:180:0: [sdfv] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.535121] sd 1:0:144:0: [sdel] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.535134] sd 1:0:187:0: [sdgb] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.535149] sd 1:0:154:0: [sdev] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.535180] sd 1:0:176:0: [sdfr] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.535229] sd 1:0:165:0: [sdfg] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.535231] sd 1:0:173:0: [sdfo] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.535462] sd 1:0:179:0: [sdfu] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.535725] sd 1:0:183:0: [sdfy] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.535733] sd 1:0:148:0: [sdep] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.535803] sd 1:0:238:0: [sdia] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.535804] sd 1:0:238:0: [sdia] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.535806] sd 1:0:236:0: [sdhy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.535807] sd 1:0:236:0: [sdhy] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.535839] sd 1:0:98:0: [sdcs] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.535977] sd 1:0:241:0: [sdid] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.535978] sd 1:0:241:0: [sdid] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.537751] sd 1:0:219:0: [sdhh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.537753] sd 1:0:219:0: [sdhh] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.543980] sd 1:0:230:0: [sdhs] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.543982] sd 1:0:230:0: [sdhs] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.543983] sd 1:0:226:0: [sdho] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.543984] sd 1:0:226:0: [sdho] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.543986] 
sd 1:0:229:0: [sdhr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.543988] sd 1:0:229:0: [sdhr] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.544013] sd 1:0:146:0: [sden] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544014] sd 1:0:235:0: [sdhx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.544015] sd 1:0:235:0: [sdhx] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.544023] sd 1:0:191:0: [sdgf] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.544025] sd 1:0:147:0: [sdeo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544028] sd 1:0:190:0: [sdge] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.544032] sd 1:0:186:0: [sdga] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.544033] sd 1:0:149:0: [sdeq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544056] sd 1:0:158:0: [sdez] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544058] sd 1:0:152:0: [sdet] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544060] sd 1:0:193:0: [sdgh] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.544066] sd 1:0:150:0: [sder] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544080] sd 1:0:145:0: [sdem] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544085] sd 1:0:37:0: [sdak] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544094] sd 1:0:153:0: [sdeu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544100] sd 1:0:192:0: [sdgg] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.544116] sd 1:0:162:0: [sdfd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544118] sd 1:0:160:0: [sdfb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544123] sd 1:0:194:0: [sdgi] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.544137] sd 1:0:156:0: [sdex] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544147] sd 1:0:167:0: [sdfi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544148] sd 1:0:164:0: [sdff] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544155] sd 1:0:163:0: [sdfe] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544164] sd 1:0:155:0: [sdew] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544224] sd 1:0:166:0: [sdfh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544231] sd 1:0:157:0: [sdey] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544470] sd 1:0:159:0: [sdfa] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544492] sd 1:0:161:0: [sdfc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544498] sd 1:0:168:0: [sdfj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544510] sd 1:0:151:0: [sdes] Write cache: enabled, read cache: 
enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544516] sd 1:0:189:0: [sdgd] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.544527] sd 1:0:170:0: [sdfl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544579] sd 1:0:169:0: [sdfk] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544600] sd 1:0:171:0: [sdfm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.544649] sd 1:0:188:0: [sdgc] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.544778] sd 1:0:195:0: [sdgj] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.545021] sd 1:0:227:0: [sdhp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.545023] sd 1:0:227:0: [sdhp] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.545036] sd 1:0:228:0: [sdhq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Thu Dec 12 07:46:56 2019][ 28.545038] sd 1:0:228:0: [sdhq] 4096-byte physical blocks [Thu Dec 12 07:46:56 2019][ 28.550681] sd 1:0:182:0: [sdfx] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.550696] sd 1:0:172:0: [sdfn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.550700] sd 1:0:181:0: [sdfw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.550702] sd 1:0:196:0: [sdgk] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.550717] sd 1:0:175:0: [sdfq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.550724] sd 1:0:177:0: [sdfs] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.550725] sd 1:0:178:0: [sdft] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.550729] sd 1:0:180:0: [sdfv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.550749] sd 1:0:185:0: [sdfz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.550769] sd 1:0:187:0: [sdgb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.550771] sd 1:0:154:0: [sdev] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.550822] sd 1:0:176:0: [sdfr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.550848] sd 1:0:165:0: [sdfg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.551176] sd 1:0:179:0: [sdfu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.551259] sd 1:0:173:0: [sdfo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.551425] sd 1:0:183:0: [sdfy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.551500] sd 1:0:148:0: [sdep] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.551531] sd 1:0:98:0: [sdcs] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.557368] sd 1:0:197:0: [sdgl] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.557370] sd 1:0:186:0: [sdga] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.557372] sd 1:0:198:0: [sdgm] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.557403] sd 1:0:193:0: [sdgh] Write cache: 
enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.557458] sd 1:0:190:0: [sdge] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.557470] sd 1:0:191:0: [sdgf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.557506] sd 1:0:194:0: [sdgi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.557527] sd 1:0:192:0: [sdgg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.557737] sd 1:0:189:0: [sdgd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.557786] sd 1:0:188:0: [sdgc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.557854] sd 1:0:195:0: [sdgj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.558228] sd 1:0:174:0: [sdfp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.566634] sd 1:0:244:0: [sdig] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.566639] sd 1:0:182:0: [sdfx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.566661] sd 1:0:231:0: [sdht] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.566662] sd 1:0:196:0: [sdgk] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:56 2019][ 28.566670] sd 1:0:199:0: [sdgn] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.566674] sd 1:0:232:0: [sdhu] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.566761] sd 1:0:202:0: [sdgq] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.566775] sd 1:0:221:0: [sdhj] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.566810] sd 1:0:209:0: [sdgx] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.566909] sd 1:0:212:0: [sdha] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.566913] sd 1:0:234:0: [sdhw] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.566958] sd 1:0:201:0: [sdgp] Write Protect is off [Thu Dec 12 07:46:56 2019][ 28.567027] sd 1:0:200:0: [sdgo] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.567034] sd 1:0:204:0: [sdgs] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.567039] sd 1:0:203:0: [sdgr] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.567087] sd 1:0:240:0: [sdic] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.567098] sd 1:0:205:0: [sdgt] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.567109] sd 1:0:225:0: [sdhn] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.567173] sd 1:0:243:0: [sdif] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.567187] sd 1:0:220:0: [sdhi] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.567217] sd 1:0:222:0: [sdhk] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.567229] sd 1:0:206:0: [sdgu] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.567233] sd 1:0:213:0: [sdhb] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.567385] sd 1:0:223:0: [sdhl] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.567393] sd 1:0:208:0: [sdgw] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.567407] sd 1:0:214:0: [sdhc] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.567425] sd 1:0:237:0: [sdhz] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.567535] sd 1:0:242:0: [sdie] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.568271] sd 1:0:210:0: [sdgy] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.568282] sd 1:0:207:0: [sdgv] Write Protect is 
off [Thu Dec 12 07:46:57 2019][ 28.568290] sd 1:0:215:0: [sdhd] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.568473] sd 1:0:239:0: [sdib] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.568482] sd 1:0:233:0: [sdhv] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.569273] sd 1:0:211:0: [sdgz] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.569341] sd 1:0:217:0: [sdhf] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.575950] sd 1:0:197:0: [sdgl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.575951] sd 1:0:198:0: [sdgm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.575979] sd 1:0:216:0: [sdhe] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.575988] sd 1:0:218:0: [sdhg] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.575996] sd 1:0:224:0: [sdhm] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.576492] sd 1:0:236:0: [sdhy] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.576499] sd 1:0:241:0: [sdid] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.577071] sd 1:0:238:0: [sdia] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.577130] sd 1:0:219:0: [sdhh] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.582599] sd 1:0:230:0: [sdhs] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.582601] sd 1:0:229:0: [sdhr] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.582607] sd 1:0:226:0: [sdho] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.582612] sd 1:0:235:0: [sdhx] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.582617] sd 1:0:231:0: [sdht] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582621] sd 1:0:244:0: [sdig] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582638] sd 1:0:199:0: [sdgn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582657] sd 1:0:232:0: [sdhu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582705] sd 1:0:221:0: [sdhj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582711] sd 1:0:202:0: [sdgq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582720] sd 1:0:209:0: [sdgx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582775] sd 1:0:212:0: [sdha] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582780] sd 1:0:234:0: [sdhw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582813] sd 1:0:201:0: [sdgp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582842] sd 1:0:200:0: [sdgo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582858] sd 1:0:204:0: [sdgs] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582869] sd 1:0:203:0: [sdgr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582891] sd 1:0:240:0: [sdic] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582899] sd 1:0:205:0: [sdgt] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582909] sd 1:0:225:0: [sdhn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 
2019][ 28.582943] sd 1:0:220:0: [sdhi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582964] sd 1:0:222:0: [sdhk] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582974] sd 1:0:206:0: [sdgu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.582997] sd 1:0:213:0: [sdhb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.583047] sd 1:0:208:0: [sdgw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.583050] sd 1:0:214:0: [sdhc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.583062] sd 1:0:223:0: [sdhl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.583064] sd 1:0:237:0: [sdhz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.583093] sd 1:0:242:0: [sdie] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.583629] sd 1:0:211:0: [sdgz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.583652] sd 1:0:217:0: [sdhf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.583870] sd 1:0:207:0: [sdgv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.583882] sd 1:0:215:0: [sdhd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.584458] sd 1:0:243:0: [sdif] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.584461] sd 1:0:239:0: [sdib] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.584471] sd 1:0:233:0: [sdhv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.585665] sd 1:0:228:0: [sdhq] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.585673] sd 1:0:227:0: [sdhp] Write Protect is off [Thu Dec 12 07:46:57 2019][ 28.589410] sd 1:0:224:0: [sdhm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.589415] sd 1:0:216:0: [sdhe] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.589418] sd 1:0:218:0: [sdhg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.589606] sd 1:0:236:0: [sdhy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.589630] sd 1:0:241:0: [sdid] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.590125] sd 1:0:210:0: [sdgy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.590295] sd 1:0:238:0: [sdia] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.590354] sd 1:0:219:0: [sdhh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.598647] sd 1:0:230:0: [sdhs] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.598649] sd 1:0:235:0: [sdhx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.598652] sd 1:0:229:0: [sdhr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.598656] sd 1:0:226:0: [sdho] Write 
cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.599454] sd 1:0:228:0: [sdhq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.600542] sd 1:0:227:0: [sdhp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:57 2019][ 28.621891] sd 1:0:3:0: [sdc] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.629727] sd 1:0:15:0: [sdo] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.630487] sd 1:0:21:0: [sdu] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.636539] sd 1:0:14:0: [sdn] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.636541] sd 1:0:20:0: [sdt] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.636579] sd 1:0:7:0: [sdg] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.636672] sd 1:0:12:0: [sdl] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.636701] sd 1:0:10:0: [sdj] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.636822] sd 1:0:17:0: [sdq] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.636863] sd 1:0:25:0: [sdy] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.645560] sd 1:0:40:0: [sdan] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.645571] sd 1:0:5:0: [sde] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.645601] sd 1:0:29:0: [sdac] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.645602] sd 1:0:6:0: [sdf] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.645654] sd 1:0:32:0: [sdaf] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.645696] sd 1:0:38:0: [sdal] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.645720] sd 1:0:28:0: [sdab] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.645781] sd 1:0:8:0: [sdh] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.645797] sd 1:0:4:0: [sdd] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.645886] sd 1:0:45:0: [sdas] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.645936] sd 1:0:33:0: [sdag] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.646025] sd 1:0:18:0: [sdr] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.646039] sd 1:0:47:0: [sdau] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.652111] sd 1:0:22:0: [sdv] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.652135] sd 1:0:58:0: [sdbf] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.652152] sd 1:0:55:0: [sdbc] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.652163] sd 1:0:57:0: [sdbe] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.652172] sd 1:0:46:0: [sdat] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.652174] sd 1:0:42:0: [sdap] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.652177] sd 1:0:19:0: [sds] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.652195] sd 1:0:16:0: [sdp] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.652204] sd 1:0:35:0: [sdai] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.652208] sd 1:0:31:0: [sdae] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.652317] sd 1:0:39:0: [sdam] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.652389] sd 1:0:24:0: [sdx] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.652488] sd 1:0:9:0: [sdi] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.652493] sd 1:0:49:0: [sdaw] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.652518] sd 1:0:50:0: [sdax] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.652551] sd 1:0:41:0: [sdao] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.652563] sd 1:0:43:0: [sdaq] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.661006] sd 1:0:54:0: [sdbb] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.661134] sd 1:0:13:0: [sdm] Attached SCSI disk 
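Each drive in this capture goes through the same sequence: a capacity report, a write-protect line, the cache-flags line ("Write cache: enabled, read cache: enabled, supports DPO and FUA"), and finally "Attached SCSI disk" once the device is usable. A small sketch, under the assumption that a saved console capture keeps the "[date][ uptime] sd H:C:T:L: [sdXX] message" layout seen here (the log file name is a placeholder), that tallies those events per device so any drive that never reached the attach step stands out:

#!/usr/bin/env python3
"""Tally sd-driver events from a console capture in the format above.
The regex mirrors "sd H:C:T:L: [sdXX] <message>"; messages split across
wrapped lines are simply ignored."""
import re
import sys
from collections import defaultdict

SD_MSG = re.compile(r"sd \d+:\d+:\d+:\d+: \[(sd[a-z]+)\] ([^\[\n]*)")

def summarise(path: str) -> None:
    seen = defaultdict(set)                       # device name -> event kinds
    with open(path, errors="replace") as fh:
        for line in fh:
            for dev, msg in SD_MSG.findall(line):
                msg = msg.strip()
                if msg.startswith("Attached SCSI disk"):
                    seen[dev].add("attached")
                elif msg.startswith("Write Protect"):
                    seen[dev].add("write-protect")
                elif msg.startswith("Write cache"):
                    seen[dev].add("cache-flags")
                elif "logical blocks" in msg:
                    seen[dev].add("capacity")

    attached = [d for d, ev in seen.items() if "attached" in ev]
    pending = sorted(d for d, ev in seen.items() if "attached" not in ev)
    print(f"{len(attached)} of {len(seen)} devices attached")
    if pending:
        print("seen but not (yet) attached:", ", ".join(pending))

if __name__ == "__main__":
    summarise(sys.argv[1] if len(sys.argv) > 1 else "console.log")
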
[Thu Dec 12 07:46:57 2019][ 28.661290] sd 1:0:59:0: [sdbg] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.661307] sd 1:0:30:0: [sdad] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.661314] sd 1:0:51:0: [sday] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.661325] sd 1:0:34:0: [sdah] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.661340] sd 1:0:27:0: [sdaa] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.670104] sd 1:0:52:0: [sdaz] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.670159] sd 1:0:60:0: [sdbh] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.670244] sd 1:0:56:0: [sdbd] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.670308] sd 1:0:26:0: [sdz] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.670317] sd 1:0:48:0: [sdav] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.670893] sd 1:0:44:0: [sdar] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.671011] sd 1:0:53:0: [sdba] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.676452] sd 1:0:23:0: [sdw] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.676488] sd 1:0:61:0: [sdbi] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.682761] sd 1:0:71:0: [sdbr] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.682851] sd 1:0:36:0: [sdaj] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.682865] sd 1:0:63:0: [sdbj] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.691496] sd 1:0:11:0: [sdk] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.691841] sd 1:0:68:0: [sdbo] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.698469] sd 1:0:81:0: [sdcb] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.698822] sd 1:0:93:0: [sdcn] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.698955] sd 1:0:64:0: [sdbk] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.707422] sd 1:0:75:0: [sdbv] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.707430] sd 1:0:104:0: [sdcy] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.707437] sd 1:0:78:0: [sdby] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.707476] sd 1:0:65:0: [sdbl] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.707593] sd 1:0:89:0: [sdcj] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.707673] sd 1:0:101:0: [sdcv] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.707682] sd 1:0:115:0: [sddj] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.707800] sd 1:0:67:0: [sdbn] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.707804] sd 1:0:82:0: [sdcc] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.707831] sd 1:0:69:0: [sdbp] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.707840] sd 1:0:91:0: [sdcl] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.707853] sd 1:0:102:0: [sdcw] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.708046] sd 1:0:136:0: [sded] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.708073] sd 1:0:86:0: [sdcg] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.713962] sd 1:0:132:0: [sddz] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.713977] sd 1:0:76:0: [sdbw] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.714030] sd 1:0:137:0: [sdee] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.714064] sd 1:0:106:0: [sdda] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.714073] sd 1:0:120:0: [sddo] Attached SCSI disk [Thu Dec 12 07:46:57 2019][ 28.714101] sd 1:0:84:0: [sdce] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.714116] sd 1:0:70:0: [sdbq] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.714131] sd 1:0:80:0: [sdca] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.714162] sd 1:0:134:0: [sdeb] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 
28.714165] sd 1:0:87:0: [sdch] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.714231] sd 1:0:124:0: [sddr] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.714251] sd 1:0:135:0: [sdec] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.714324] sd 1:0:112:0: [sddg] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.714350] sd 1:0:133:0: [sdea] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.714374] sd 1:0:99:0: [sdct] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.714753] sd 1:0:100:0: [sdcu] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.722990] sd 1:0:79:0: [sdbz] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.723026] sd 1:0:126:0: [sddt] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.723041] sd 1:0:105:0: [sdcz] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.723062] sd 1:0:83:0: [sdcd] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.723096] sd 1:0:74:0: [sdbu] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.723314] sd 1:0:88:0: [sdci] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.723344] sd 1:0:95:0: [sdcp] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.723419] sd 1:0:103:0: [sdcx] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.723432] sd 1:0:110:0: [sdde] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.723438] sd 1:0:139:0: [sdeg] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.729298] sd 1:0:130:0: [sddx] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.729324] sd 1:0:128:0: [sddv] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.729339] sd 1:0:131:0: [sddy] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.729346] sd 1:0:117:0: [sddl] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.729373] sd 1:0:125:0: [sdds] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.729468] sd 1:0:77:0: [sdbx] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.729470] sd 1:0:114:0: [sddi] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.729559] sd 1:0:122:0: [sddq] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.729566] sd 1:0:129:0: [sddw] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.729604] sd 1:0:94:0: [sdco] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.729611] sd 1:0:127:0: [sddu] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.729634] sd 1:0:72:0: [sdbs] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.732769] sd 1:0:116:0: [sddk] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.738114] sd 1:0:92:0: [sdcm] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.738119] sd 1:0:85:0: [sdcf] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.738222] sd 1:0:90:0: [sdck] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.738270] sd 1:0:109:0: [sddd] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.738285] sd 1:0:107:0: [sddb] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.738329] sd 1:0:119:0: [sddn] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.738417] sd 1:0:66:0: [sdbm] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.738488] sd 1:0:113:0: [sddh] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.738502] sd 1:0:141:0: [sdei] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.738507] sd 1:0:121:0: [sddp] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.744092] sd 1:0:144:0: [sdel] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.744136] sd 1:0:140:0: [sdeh] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.744160] sd 1:0:108:0: [sddc] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.744209] sd 1:0:143:0: [sdek] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.744220] sd 1:0:156:0: [sdex] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.744237] sd 
1:0:111:0: [sddf] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.744272] sd 1:0:73:0: [sdbt] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.752608] sd 1:0:142:0: [sdej] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.752631] sd 1:0:152:0: [sdet] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.752633] sd 1:0:159:0: [sdfa] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.752640] sd 1:0:168:0: [sdfj] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.752673] sd 1:0:160:0: [sdfb] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.752690] sd 1:0:155:0: [sdew] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.752727] sd 1:0:167:0: [sdfi] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.752747] sd 1:0:171:0: [sdfm] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.752749] sd 1:0:173:0: [sdfo] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.752840] sd 1:0:151:0: [sdes] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.752841] sd 1:0:163:0: [sdfe] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.752842] sd 1:0:37:0: [sdak] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.758414] sd 1:0:172:0: [sdfn] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.758423] sd 1:0:179:0: [sdfu] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.758433] sd 1:0:147:0: [sdeo] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.758441] sd 1:0:154:0: [sdev] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.758466] sd 1:0:178:0: [sdft] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.758473] sd 1:0:170:0: [sdfl] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.758479] sd 1:0:157:0: [sdey] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.758509] sd 1:0:138:0: [sdef] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.758513] sd 1:0:181:0: [sdfw] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.758516] sd 1:0:153:0: [sdeu] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.758552] sd 1:0:169:0: [sdfk] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.758630] sd 1:0:98:0: [sdcs] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.758638] sd 1:0:148:0: [sdep] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.766877] sd 1:0:180:0: [sdfv] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.766894] sd 1:0:175:0: [sdfq] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.766903] sd 1:0:146:0: [sden] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.766916] sd 1:0:189:0: [sdgd] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.766922] sd 1:0:193:0: [sdgh] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.766926] sd 1:0:161:0: [sdfc] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.766938] sd 1:0:145:0: [sdem] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.766943] sd 1:0:118:0: [sddm] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.766965] sd 1:0:166:0: [sdfh] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.766968] sd 1:0:96:0: [sdcq] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.766998] sd 1:0:186:0: [sdga] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.767010] sd 1:0:158:0: [sdez] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.767047] sd 1:0:177:0: [sdfs] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.767050] sd 1:0:192:0: [sdgg] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.772600] sd 1:0:188:0: [sdgc] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.772603] sd 1:0:185:0: [sdfz] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.772605] sd 1:0:176:0: [sdfr] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.772608] sd 1:0:194:0: [sdgi] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.772609] sd 
1:0:97:0: [sdcr] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.772612] sd 1:0:164:0: [sdff] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.772634] sd 1:0:190:0: [sdge] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.772644] sd 1:0:165:0: [sdfg] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.772687] sd 1:0:150:0: [sder] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.772714] sd 1:0:149:0: [sdeq] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.780993] sd 1:0:196:0: [sdgk] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.781001] sd 1:0:174:0: [sdfp] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.781051] sd 1:0:162:0: [sdfd] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.781058] sd 1:0:182:0: [sdfx] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.781152] sd 1:0:195:0: [sdgj] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.781256] sd 1:0:183:0: [sdfy] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.786971] sd 1:0:201:0: [sdgp] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.786974] sd 1:0:233:0: [sdhv] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.786987] sd 1:0:213:0: [sdhb] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.787001] sd 1:0:197:0: [sdgl] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.787015] sd 1:0:240:0: [sdic] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.787017] sd 1:0:205:0: [sdgt] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.787093] sd 1:0:207:0: [sdgv] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.787102] sd 1:0:187:0: [sdgb] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.787105] sd 1:0:209:0: [sdgx] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.787141] sd 1:0:206:0: [sdgu] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.787153] sd 1:0:212:0: [sdha] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.787174] sd 1:0:214:0: [sdhc] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.787179] sd 1:0:215:0: [sdhd] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.787185] sd 1:0:242:0: [sdie] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.795429] sd 1:0:220:0: [sdhi] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.795593] sd 1:0:239:0: [sdib] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.795594] sd 1:0:191:0: [sdgf] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.795615] sd 1:0:208:0: [sdgw] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.801279] sd 1:0:232:0: [sdhu] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.801290] sd 1:0:234:0: [sdhw] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.801301] sd 1:0:200:0: [sdgo] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.801395] sd 1:0:237:0: [sdhz] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.801401] sd 1:0:222:0: [sdhk] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.801419] sd 1:0:217:0: [sdhf] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.801471] sd 1:0:241:0: [sdid] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.809725] sd 1:0:198:0: [sdgm] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.809734] sd 1:0:243:0: [sdif] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.809806] sd 1:0:210:0: [sdgy] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.809860] sd 1:0:221:0: [sdhj] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.809889] sd 1:0:204:0: [sdgs] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.809976] sd 1:0:203:0: [sdgr] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.815588] sd 1:0:238:0: [sdia] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.815597] sd 1:0:225:0: [sdhn] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.815601] sd 
1:0:219:0: [sdhh] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.815603] sd 1:0:223:0: [sdhl] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.815635] sd 1:0:244:0: [sdig] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.815702] sd 1:0:236:0: [sdhy] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.815810] sd 1:0:211:0: [sdgz] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.824121] sd 1:0:235:0: [sdhx] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.824223] sd 1:0:216:0: [sdhe] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.829877] sd 1:0:231:0: [sdht] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.829900] sd 1:0:228:0: [sdhq] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.829915] sd 1:0:199:0: [sdgn] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.829920] sd 1:0:227:0: [sdhp] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.829923] sd 1:0:226:0: [sdho] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.830006] sd 1:0:218:0: [sdhg] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.830013] sd 1:0:229:0: [sdhr] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.830019] sd 1:0:224:0: [sdhm] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.838267] sd 1:0:230:0: [sdhs] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 28.838439] sd 1:0:202:0: [sdgq] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ 36.058479] sd 1:0:2:0: [sdb] Write Protect is off [Thu Dec 12 07:46:58 2019][ 36.063717] sd 1:0:2:0: [sdb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Thu Dec 12 07:46:58 2019][ 36.084126] sd 1:0:2:0: [sdb] Attached SCSI disk [Thu Dec 12 07:46:58 2019][ OK ] Found device PERC_H330_Mini os. [Thu Dec 12 07:46:58 2019] Starting File System Check on /dev/...4-e7db-49b7-baed-d6c7905c5cdc... [Thu Dec 12 07:46:58 2019][ OK ] Started dracut initqueue hook. [Thu Dec 12 07:46:58 2019][ OK ] Reached target Remote File Systems (Pre). [Thu Dec 12 07:46:58 2019][ OK ] Reached target Remote File Systems. [Thu Dec 12 07:46:58 2019][ OK ] Started File System Check on /dev/d...4c4-e7db-49b7-baed-d6c7905c5cdc. [Thu Dec 12 07:46:58 2019] Mounting /sysroot... [Thu Dec 12 07:46:58 2019][ 36.172396] EXT4-fs (sda2): mounted filesystem with ordered data mode. Opts: (null) [Thu Dec 12 07:46:58 2019][ OK ] Mounted /sysroot. [Thu Dec 12 07:46:58 2019][ OK ] Reached target Initrd Root File System. [Thu Dec 12 07:46:58 2019] Starting Reload Configuration from the Real Root... [Thu Dec 12 07:46:58 2019][ OK ] Started Reload Configuration from the Real Root. [Thu Dec 12 07:46:58 2019][ OK ] Reached target Initrd File Systems. [Thu Dec 12 07:46:58 2019][ OK ] Reached target Initrd Default Target. [Thu Dec 12 07:46:59 2019] Starting dracut pre-pivot and cleanup hook... [Thu Dec 12 07:46:59 2019][ OK ] Started dracut pre-pivot and cleanup hook. [Thu Dec 12 07:46:59 2019] Starting Cleaning Up and Shutting Down Daemons... [Thu Dec 12 07:46:59 2019] Starting Plymouth switch root service... [Thu Dec 12 07:46:59 2019][ OK ] Stopped target Timers. [Thu Dec 12 07:46:59 2019][ OK ] Stopped Cleaning Up and Shutting Down Daemons. [Thu Dec 12 07:46:59 2019][ OK ] Stopped dracut pre-pivot and cleanup hook. [Thu Dec 12 07:46:59 2019][ OK ] Stopped target Initrd Default Target. [Thu Dec 12 07:46:59 2019][ OK ] Stopped target Remote File Systems. [Thu Dec 12 07:46:59 2019][ OK ] Stopped target Remote File Systems (Pre). [Thu Dec 12 07:46:59 2019][ OK ] Stopped target Basic System. [Thu Dec 12 07:46:59 2019][ OK ] Stopped target Slices. [Thu Dec 12 07:46:59 2019][ OK ] Stopped target Paths. 
[Thu Dec 12 07:46:59 2019][ OK ] Stopped target System Initialization. [Thu Dec 12 07:46:59 2019][ OK ] Stopped target Swap. [Thu Dec 12 07:46:59 2019] Stopping udev Kernel Device Manager... [Thu Dec 12 07:46:59 2019][ 36.481514] systemd-journald[353]: Received SIGTERM from PID 1 (systemd). [Thu Dec 12 07:46:59 2019][ OK ] Stopped target Local File Systems. [Thu Dec 12 07:46:59 2019][ OK ] Stopped Apply Kernel Variables. [Thu Dec 12 07:46:59 2019][ OK ] Stopped target Sockets. [Thu Dec 12 07:46:59 2019][ OK ] Stopped dracut initqueue hook. [Thu Dec 12 07:46:59 2019][ OK ] Stopped udev Coldplug all Devices. [Thu Dec 12 07:46:59 2019][ OK ] Stopped udev Kernel Device Manager. [Thu Dec 12 07:46:59 2019][ OK ] Stopped dracut pre-udev hook. [Thu Dec 12 07:46:59 2019][ 36.518927] SELinux: Disabled at runtime. [Thu Dec 12 07:46:59 2019][ OK ] Stopped dracut cmdline hook. [Thu Dec 12 07:46:59 2019][ OK ] Stopped Create Static Device Nodes in /dev. [Thu Dec 12 07:46:59 2019][ OK ] Stopped Create list of required sta...ce nodes for the current kernel. [Thu Dec 12 07:46:59 2019][ OK ] Closed udev Kernel Socket. [Thu Dec 12 07:46:59 2019][ OK ] Closed udev Control Socket. [Thu Dec 12 07:46:59 2019] Starting Cleanup udevd DB... [Thu Dec 12 07:46:59 2019][ OK ] Started Plymouth switch root service. [Thu Dec 12 07:46:59 2019][ OK ] Started Cleanup udevd DB. [Thu Dec 12 07:46:59 2019][ 36.564815] type=1404 audit(1576165619.054:2): selinux=0 auid=4294967295 ses=4294967295 [Thu Dec 12 07:46:59 2019][ OK ] Reached target Switch Root. [Thu Dec 12 07:46:59 2019] Starting Switch Root... [Thu Dec 12 07:46:59 2019][ 36.591713] ip_tables: (C) 2000-2006 Netfilter Core Team [Thu Dec 12 07:46:59 2019][ 36.597704] systemd[1]: Inserted module 'ip_tables' [Thu Dec 12 07:46:59 2019] [Thu Dec 12 07:46:59 2019]Welcome to CentOS Linux 7 (Core)! [Thu Dec 12 07:46:59 2019] [Thu Dec 12 07:46:59 2019][ OK ] Stopped Switch Root. [Thu Dec 12 07:46:59 2019][ OK ] Stopped Journal Service. [Thu Dec 12 07:46:59 2019] Starting Journal Service... [Thu Dec 12 07:46:59 2019][ OK ] Created slice system-selinux\x2dpol...grate\x2dlocal\x2dchanges.slice. [Thu Dec 12 07:46:59 2019][ 36.706244] EXT4-fs (sda2): re-mounted. Opts: (null) [Thu Dec 12 07:46:59 2019][ OK ] Listening on udev Kernel Socket. [Thu Dec 12 07:46:59 2019] Mounting Huge Pages File System... [Thu Dec 12 07:46:59 2019][ 36.720664] systemd-journald[5928]: Received request to flush runtime journal from PID 1 [Thu Dec 12 07:46:59 2019] Mounting Debug File System... [Thu Dec 12 07:46:59 2019][ OK ] Created slice User and Session Slice. [Thu Dec 12 07:46:59 2019][ OK ] Reached target Slices. [Thu Dec 12 07:46:59 2019][ OK ] Created slice system-serial\x2dgetty.slice. [Thu Dec 12 07:46:59 2019][ OK ] Started Forward Password Requests to Wall Directory Watch. [Thu Dec 12 07:46:59 2019][ OK ] Listening on udev Control Socket. [Thu Dec 12 07:46:59 2019][ OK ] Reached target Local Encrypted Volumes. [Thu Dec 12 07:46:59 2019] Starting Replay Read-Ahead Data... [Thu Dec 12 07:46:59 2019] Starting Create list of required st... nodes for the current kernel... [Thu Dec 12 07:46:59 2019][ OK ] Stopped target Switch Root. [Thu Dec 12 07:46:59 2019][ OK ] Stopped target Initrd Root File System. [Thu Dec 12 07:46:59 2019][ OK ] Stopped target Initrd File Systems. [Thu Dec 12 07:46:59 2019] Starting Availability of block devices... [Thu Dec 12 07:46:59 2019] Starting Read and set NIS domainname from /etc/sysconfig/network... [Thu Dec 12 07:46:59 2019][ OK ] Created slice system-getty.slice. 
[Thu Dec 12 07:46:59 2019][ OK ] Set up automount Arbitrary Executab...ats File System Automount Point. [Thu Dec 12 07:46:59 2019][ OK ] Listening on /dev/initctl Compatibility Named Pipe. [Thu Dec 12 07:46:59 2019][ 36.816429] ACPI Error: No handler for Region [SYSI] (ffff8db8a9e77a68) [IPMI] (20130517/evregion-162) [Thu Dec 12 07:46:59 2019][ OK ] Reached target RPC Port Mapper. [Thu Dec 12 07:46:59 2019][ 36.828968] ACPI Error: Region IPMI (ID=7) has no handler (20130517/exfldio-305) [Thu Dec 12 07:46:59 2019] Mounting POSIX Message Queue File System... [Thu Dec 12 07:46:59 2019][ 36.840574] ACPI Error: Method parse/execution failed [\_SB_.PMI0._GHL] (Node ffff8dc8a9e7a5a0), AE_NOT_EXIST (20130517/psparse-536) [Thu Dec 12 07:46:59 2019][ OK ] Listening on Delayed Shutdown Socket. [Thu Dec 12 07:46:59 2019] Starting Collect Read-Ahead Data... [Thu Dec 12 07:46:59 2019][ 36.859521] ACPI Error: Method parse/execution failed [\_SB_.PMI0._PMC] (Node ffff8dc8a9e7a500), AE_NOT_EXIST (20130517/psparse-536) [Thu Dec 12 07:46:59 2019][ 36.863333] ipmi message handler version 39.2 [Thu Dec 12 07:46:59 2019][ 36.863505] ACPI Exception: AE_NOT_EXIST, Evaluating _PMC (20130517/power_meter-753) [Thu Dec 12 07:46:59 2019][ OK ] Reached target Paths. [Thu Dec 12 07:46:59 2019][ OK ] Started Replay Read-Ahead Data. [Thu Dec 12 07:46:59 2019][ 36.883074] piix4_smbus 0000:00:14.0: SMBus Host Controller at 0xb00, revision 0 [Thu Dec 12 07:46:59 2019][ 36.902168] piix4_smbus 0000:00:14.0: Using register 0x2e for SMBus port selection [Thu Dec 12 07:46:59 2019][ OK ] Started Create list of required sta...ce nodes for the current kernel. [Thu Dec 12 07:46:59 2019][ 36.913609] ipmi device interface [Thu Dec 12 07:46:59 2019][ 36.918997] ccp 0000:02:00.2: 3 command queues available [Thu Dec 12 07:46:59 2019][ 36.925472] ccp 0000:02:00.2: Queue 2 can access 4 LSB regions [Thu Dec 12 07:46:59 2019][ 36.932522] ccp 0000:02:00.2: Queue 3 can access 4 LSB regions [Thu Dec 12 07:46:59 2019][ OK ] Started Availability of block devices. [Thu Dec 12 07:46:59 2019][ 36.934319] sd 0:2:0:0: Attached scsi generic sg0 type 0 [Thu Dec 12 07:46:59 2019][ 36.935096] scsi 1:0:0:0: Attached scsi generic sg1 type 13 [Thu Dec 12 07:46:59 2019][ 36.935457] scsi 1:0:1:0: Attached scsi generic sg2 type 13 [Thu Dec 12 07:46:59 2019][ 36.935821] sd 1:0:2:0: Attached scsi generic sg3 type 0 [Thu Dec 12 07:46:59 2019][ OK ] Started Collect Read-Ahead Data. [Thu Dec 12 07:46:59 2019][ 36.936117] sd 1:0:3:0: Attached scsi generic sg4 type 0 [Thu Dec 12 07:46:59 2019][ 36.936298] sd 1:0:4:0: Attached scsi generic sg5 type 0 [Thu Dec 12 07:46:59 2019][ 36.936468] sd 1:0:5:0: Attached scsi generic sg6 type 0 [Thu Dec 12 07:46:59 2019] Starting Create Static Device Nodes in /dev... [Thu Dec 12 07:46:59 2019][ 36.936623] sd 1:0:6:0: Attached scsi generic sg7 type 0 [Thu Dec 12 07:46:59 2019][ 36.936694] sd 1:0:7:0: Attached scsi generic sg8 type 0 [Thu Dec 12 07:46:59 2019][ 36.936754] sd 1:0:8:0: Attached scsi generic sg9 type 0 [Thu Dec 12 07:46:59 2019] Starting Remount Root and Kernel File Systems... [Thu Dec 12 07:46:59 2019][ 36.936797] sd 1:0:9:0: Attached scsi generic sg10 type 0 [Thu Dec 12 07:46:59 2019][ 36.936885] sd 1:0:10:0: Attached scsi generic sg11 type 0 [Thu Dec 12 07:46:59 2019][ 36.937745] sd 1:0:11:0: Attached scsi generic sg12 type 0 
[Thu Dec 12 07:46:59 2019][ 36.938193] sd 1:0:12:0: Attached scsi generic sg13 type 0 [Thu Dec 12 07:46:59 2019] Starting Apply Kernel Variables... [Thu Dec 12 07:46:59 2019][ 36.938772] sd 1:0:13:0: Attached scsi generic sg14 type 0 [Thu Dec 12 07:46:59 2019][ 36.939021] sd 1:0:14:0: Attached scsi generic sg15 type 0 [Thu Dec 12 07:46:59 2019][ OK ] Mounted Huge Pages File System. [Thu Dec 12 07:46:59 2019][ 36.939445] sd 1:0:15:0: Attached scsi generic sg16 type 0 [Thu Dec 12 07:46:59 2019][ 36.939967] sd 1:0:16:0: Attached scsi generic sg17 type 0 [Thu Dec 12 07:46:59 2019][ 36.940333] sd 1:0:17:0: Attached scsi generic sg18 type 0 [Thu Dec 12 07:46:59 2019][ 36.940833] sd 1:0:18:0: Attached scsi generic sg19 type 0 [Thu Dec 12 07:46:59 2019][ OK ] Mounted Debug File System. [Thu Dec 12 07:46:59 2019][ 36.941184] sd 1:0:19:0: Attached scsi generic sg20 type 0 [Thu Dec 12 07:46:59 2019][ 36.941423] sd 1:0:20:0: Attached scsi generic sg21 type 0 [Thu Dec 12 07:46:59 2019][ OK ] Mounted POSIX Message Queue File System. [Thu Dec 12 07:46:59 2019][ 36.941667] sd 1:0:21:0: Attached scsi generic sg22 type 0 [Thu Dec 12 07:46:59 2019][ 36.942267] sd 1:0:22:0: Attached scsi generic sg23 type 0 [Thu Dec 12 07:46:59 2019][ 36.942624] sd 1:0:23:0: Attached scsi generic sg24 type 0 [Thu Dec 12 07:46:59 2019][ 36.942925] sd 1:0:24:0: Attached scsi generic sg25 type 0 [Thu Dec 12 07:46:59 2019][ OK ] Started Journal Service. [Thu Dec 12 07:46:59 2019][ 36.943353] sd 1:0:25:0: Attached scsi generic sg26 type 0 [Thu Dec 12 07:46:59 2019][ 36.943606] sd 1:0:26:0: Attached scsi generic sg27 type 0 [Thu Dec 12 07:46:59 2019][ 36.944048] sd 1:0:27:0: Attached scsi generic sg28 type 0 [Thu Dec 12 07:46:59 2019][ 36.944482] sd 1:0:28:0: Attached scsi generic sg29 type 0 [Thu Dec 12 07:46:59 2019][ OK ] Started Read and set NIS domainname from /etc/sysconfig/network. [Thu Dec 12 07:46:59 2019][ 36.944978] sd 1:0:29:0: Attached scsi generic sg30 type 0 [Thu Dec 12 07:46:59 2019][ 36.945994] sd 1:0:30:0: Attached scsi generic sg31 type 0 [Thu Dec 12 07:46:59 2019][ 36.946346] sd 1:0:31:0: Attached scsi generic sg32 type 0 [Thu Dec 12 07:46:59 2019][ 36.946664] sd 1:0:32:0: Attached scsi generic sg33 type 0 [Thu Dec 12 07:46:59 2019][ 36.947009] sd 1:0:33:0: Attached scsi generic sg34 type 0 [Thu Dec 12 07:46:59 2019][ OK ] Started Remount Root and Kernel File Systems. [Thu Dec 12 07:46:59 2019][ 36.947265] sd 1:0:34:0: Attached scsi generic sg35 type 0 [Thu Dec 12 07:46:59 2019][ 36.947519] sd 1:0:35:0: Attached scsi generic sg36 type 0 [Thu Dec 12 07:46:59 2019][ 36.947835] sd 1:0:36:0: Attached scsi generic sg37 type 0 [Thu Dec 12 07:46:59 2019][ 36.948383] sd 1:0:37:0: Attached scsi generic sg38 type 0 [Thu Dec 12 07:46:59 2019] Starting Flush Journal to Persistent Storage... [Thu Dec 12 07:46:59 2019][ 36.948915] sd 1:0:38:0: Attached scsi generic sg39 type 0 [Thu Dec 12 07:46:59 2019][ 36.949290] sd 1:0:39:0: Attached scsi generic sg40 type 0 [Thu Dec 12 07:46:59 2019][ 36.949710] sd 1:0:40:0: Attached scsi generic sg41 type 0 
[Thu Dec 12 07:46:59 2019][ 36.950045] sd 1:0:41:0: Attached scsi generic sg42 type 0 [Thu Dec 12 07:46:59 2019] Starting Configure read-only root support... [Thu Dec 12 07:46:59 2019][ 36.950419] sd 1:0:42:0: Attached scsi generic sg43 type 0 [Thu Dec 12 07:46:59 2019][ 36.950796] sd 1:0:43:0: Attached scsi generic sg44 type 0 [Thu Dec 12 07:46:59 2019][ 36.951414] sd 1:0:44:0: Attached scsi generic sg45 type 0 [Thu Dec 12 07:46:59 2019][ 36.952009] sd 1:0:45:0: Attached scsi generic sg46 type 0 [Thu Dec 12 07:46:59 2019] Starting udev Coldplug all Devices... [Thu Dec 12 07:46:59 2019][ 36.952468] sd 1:0:46:0: Attached scsi generic sg47 type 0 [Thu Dec 12 07:46:59 2019][ 36.953167] sd 1:0:47:0: Attached scsi generic sg48 type 0 [Thu Dec 12 07:46:59 2019][ 36.953700] sd 1:0:48:0: Attached scsi generic sg49 type 0 [Thu Dec 12 07:46:59 2019][ 36.954018] sd 1:0:49:0: Attached scsi generic sg50 type 0 [Thu Dec 12 07:47:00 2019][ OK ] Started Create Static Device Nodes in /dev. [Thu Dec 12 07:47:00 2019][ 36.954815] sd 1:0:50:0: Attached scsi generic sg51 type 0 [Thu Dec 12 07:47:00 2019][ 36.955354] sd 1:0:51:0: Attached scsi generic sg52 type 0 [Thu Dec 12 07:47:00 2019][ 36.955887] sd 1:0:52:0: Attached scsi generic sg53 type 0 [Thu Dec 12 07:47:00 2019][ 36.956371] sd 1:0:53:0: Attached scsi generic sg54 type 0 [Thu Dec 12 07:47:00 2019][ OK ] Started Apply Kernel Variables. [Thu Dec 12 07:47:00 2019][ 36.957063] sd 1:0:54:0: Attached scsi generic sg55 type 0 [Thu Dec 12 07:47:00 2019][ 36.957713] sd 1:0:55:0: Attached scsi generic sg56 type 0 [Thu Dec 12 07:47:00 2019][ 36.958557] sd 1:0:56:0: Attached scsi generic sg57 type 0 [Thu Dec 12 07:47:00 2019] Starting udev Kernel Device Manager... [Thu Dec 12 07:47:00 2019][ 36.959122] sd 1:0:57:0: Attached scsi generic sg58 type 0 [Thu Dec 12 07:47:00 2019][ 36.959585] sd 1:0:58:0: Attached scsi generic sg59 type 0 [Thu Dec 12 07:47:00 2019][ 36.960198] sd 1:0:59:0: Attached scsi generic sg60 type 0 [Thu Dec 12 07:47:00 2019][ OK ] Reached target Local File Systems (Pre). [Thu Dec 12 07:47:00 2019][ 36.960998] sd 1:0:60:0: Attached scsi generic sg61 type 0 [Thu Dec 12 07:47:00 2019][ 36.961493] sd 1:0:61:0: Attached scsi generic sg62 type 0 [Thu Dec 12 07:47:00 2019][ 36.962183] scsi 1:0:62:0: Attached scsi generic sg63 type 13 [Thu Dec 12 07:47:00 2019][ 36.962811] sd 1:0:63:0: Attached scsi generic sg64 type 0 [Thu Dec 12 07:47:00 2019][ 36.962850] IPMI System Interface driver [Thu Dec 12 07:47:00 2019][ 36.962991] ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS [Thu Dec 12 07:47:00 2019][ 36.962994] ipmi_si: SMBIOS: io 0xca8 regsize 1 spacing 4 irq 10 [Thu Dec 12 07:47:00 2019][ 36.962995] ipmi_si: Adding SMBIOS-specified kcs state machine [Thu Dec 12 07:47:00 2019][ OK ] Started Configure read-only root support. [Thu Dec 12 07:47:00 2019][ 36.963298] sd 1:0:64:0: Attached scsi generic sg65 type 0 [Thu Dec 12 07:47:00 2019][ 36.963813] sd 1:0:65:0: Attached scsi generic sg66 type 0 [Thu Dec 12 07:47:00 2019][ 36.964149] sd 1:0:66:0: Attached scsi generic sg67 type 0 [Thu Dec 12 07:47:00 2019][ OK ] Started udev Kernel Device Manager. [Thu Dec 12 07:47:00 2019][ 36.965465] sd 1:0:67:0: Attached scsi generic sg68 type 0 [Thu Dec 12 07:47:00 2019][ 36.965995] sd 1:0:68:0: Attached scsi generic sg69 type 0 [Thu Dec 12 07:47:00 2019][ 36.966513] sd 1:0:69:0: Attached scsi generic sg70 type 0 [Thu Dec 12 07:47:00 2019][ 36.967048] sd 1:0:70:0: Attached scsi generic sg71 type 0 
[Thu Dec 12 07:47:00 2019][ 36.967601] sd 1:0:71:0: Attached scsi generic sg72 type 0 [Thu Dec 12 07:47:00 2019][ 36.968068] sd 1:0:72:0: Attached scsi generic sg73 type 0 [Thu Dec 12 07:47:00 2019] Starting Load/Save Random Seed... [Thu Dec 12 07:47:00 2019][ 36.968899] sd 1:0:73:0: Attached scsi generic sg74 type 0 [Thu Dec 12 07:47:00 2019][ 36.969567] sd 1:0:74:0: Attached scsi generic sg75 type 0 [Thu Dec 12 07:47:00 2019][ 36.970077] sd 1:0:75:0: Attached scsi generic sg76 type 0 [Thu Dec 12 07:47:00 2019][ 36.970456] sd 1:0:76:0: Attached scsi generic sg77 type 0 [Thu Dec 12 07:47:00 2019][ OK ] Started Load/Save Random Seed. [Thu Dec 12 07:47:00 2019][ 36.971664] sd 1:0:77:0: Attached scsi generic sg78 type 0 [Thu Dec 12 07:47:00 2019][ 36.972993] sd 1:0:78:0: Attached scsi generic sg79 type 0 [Thu Dec 12 07:47:00 2019][ 36.974257] sd 1:0:79:0: Attached scsi generic sg80 type 0 [Thu Dec 12 07:47:00 2019][ 36.974920] sd 1:0:80:0: Attached scsi generic sg81 type 0 [Thu Dec 12 07:47:00 2019][ 36.975454] sd 1:0:81:0: Attached scsi generic sg82 type 0 [Thu Dec 12 07:47:00 2019][ 36.975956] sd 1:0:82:0: Attached scsi generic sg83 type 0 [Thu Dec 12 07:47:00 2019][ OK ] Started Flush Journal to Persistent Storage. [Thu Dec 12 07:47:00 2019][ 36.978457] sd 1:0:83:0: Attached scsi generic sg84 type 0 [Thu Dec 12 07:47:00 2019][ 36.979074] sd 1:0:84:0: Attached scsi generic sg85 type 0 [Thu Dec 12 07:47:00 2019][ 36.979554] sd 1:0:85:0: Attached scsi generic sg86 type 0 [Thu Dec 12 07:47:00 2019][ 36.980240] sd 1:0:86:0: Attached scsi generic sg87 type 0 [Thu Dec 12 07:47:00 2019][ 36.982868] sd 1:0:87:0: Attached scsi generic sg88 type 0 [Thu Dec 12 07:47:00 2019][ 36.983268] sd 1:0:88:0: Attached scsi generic sg89 type 0 [Thu Dec 12 07:47:00 2019][ 36.984008] sd 1:0:89:0: Attached scsi generic sg90 type 0 [Thu Dec 12 07:47:00 2019][ 36.984642] sd 1:0:90:0: Attached scsi generic sg91 type 0 [Thu Dec 12 07:47:00 2019][ 36.985259] sd 1:0:91:0: Attached scsi generic sg92 type 0 [Thu Dec 12 07:47:00 2019][ 36.985707] sd 1:0:92:0: Attached scsi generic sg93 type 0 [Thu Dec 12 07:47:00 2019][ 36.986789] sd 1:0:93:0: Attached scsi generic sg94 type 0 [Thu Dec 12 07:47:00 2019][ 36.987397] sd 1:0:94:0: Attached scsi generic sg95 type 0 [Thu Dec 12 07:47:00 2019][ 36.989116] sd 1:0:95:0: Attached scsi generic sg96 type 0 [Thu Dec 12 07:47:00 2019][ 36.989781] sd 1:0:96:0: Attached scsi generic sg97 type 0 [Thu Dec 12 07:47:00 2019][ 36.990404] sd 1:0:97:0: Attached scsi generic sg98 type 0 [Thu Dec 12 07:47:00 2019][ 36.991381] sd 1:0:98:0: Attached scsi generic sg99 type 0 [Thu Dec 12 07:47:00 2019][ 36.993331] sd 1:0:99:0: Attached scsi generic sg100 type 0 [Thu Dec 12 07:47:00 2019][ 36.993976] sd 1:0:100:0: Attached scsi generic sg101 type 0 [Thu Dec 12 07:47:00 2019][ 36.994788] sd 1:0:101:0: Attached scsi generic sg102 type 0 [Thu Dec 12 07:47:00 2019][ 36.995402] sd 1:0:102:0: Attached scsi generic sg103 type 0 [Thu Dec 12 07:47:00 2019][ 36.996399] sd 1:0:103:0: Attached scsi generic sg104 type 0 [Thu Dec 12 07:47:00 2019][ 36.997174] sd 1:0:104:0: Attached scsi generic sg105 type 0 [Thu Dec 12 07:47:00 2019][ 36.998036] sd 1:0:105:0: Attached scsi generic sg106 type 0 [Thu Dec 12 07:47:00 2019][ 36.998675] sd 1:0:106:0: Attached scsi generic sg107 type 0 [Thu Dec 12 07:47:00 2019][ 36.999208] sd 1:0:107:0: Attached scsi generic sg108 type 0 [Thu Dec 12 07:47:00 2019][ 36.999717] sd 1:0:108:0: Attached scsi generic sg109 type 0 [Thu Dec 12 07:47:00 2019][ 37.000204] sd 1:0:109:0: 
Attached scsi generic sg110 type 0 [Thu Dec 12 07:47:00 2019][ 37.001001] sd 1:0:110:0: Attached scsi generic sg111 type 0 [Thu Dec 12 07:47:00 2019][ 37.001546] sd 1:0:111:0: Attached scsi generic sg112 type 0 [Thu Dec 12 07:47:00 2019][ 37.002245] sd 1:0:112:0: Attached scsi generic sg113 type 0 [Thu Dec 12 07:47:00 2019][ 37.002780] sd 1:0:113:0: Attached scsi generic sg114 type 0 [Thu Dec 12 07:47:00 2019][ 37.003403] sd 1:0:114:0: Attached scsi generic sg115 type 0 [Thu Dec 12 07:47:00 2019][ 37.003859] sd 1:0:115:0: Attached scsi generic sg116 type 0 [Thu Dec 12 07:47:00 2019][ 37.004480] sd 1:0:116:0: Attached scsi generic sg117 type 0 [Thu Dec 12 07:47:00 2019][ 37.004936] sd 1:0:117:0: Attached scsi generic sg118 type 0 [Thu Dec 12 07:47:00 2019][ 37.005510] sd 1:0:118:0: Attached scsi generic sg119 type 0 [Thu Dec 12 07:47:00 2019][ 37.006375] sd 1:0:119:0: Attached scsi generic sg120 type 0 [Thu Dec 12 07:47:00 2019][ 37.007019] sd 1:0:120:0: Attached scsi generic sg121 type 0 [Thu Dec 12 07:47:00 2019][ 37.007679] sd 1:0:121:0: Attached scsi generic sg122 type 0 [Thu Dec 12 07:47:00 2019][ 37.008122] sd 1:0:122:0: Attached scsi generic sg123 type 0 [Thu Dec 12 07:47:00 2019][ 37.008475] scsi 1:0:123:0: Attached scsi generic sg124 type 13 [Thu Dec 12 07:47:00 2019][ 37.011350] sd 1:0:124:0: Attached scsi generic sg125 type 0 [Thu Dec 12 07:47:00 2019][ 37.012259] sd 1:0:125:0: Attached scsi generic sg126 type 0 [Thu Dec 12 07:47:00 2019][ 37.013096] sd 1:0:126:0: Attached scsi generic sg127 type 0 [Thu Dec 12 07:47:00 2019][ 37.014817] sd 1:0:127:0: Attached scsi generic sg128 type 0 [Thu Dec 12 07:47:00 2019][ 37.015232] sd 1:0:128:0: Attached scsi generic sg129 type 0 [Thu Dec 12 07:47:00 2019][ 37.016550] sd 1:0:129:0: Attached scsi generic sg130 type 0 [Thu Dec 12 07:47:00 2019][ 37.017003] sd 1:0:130:0: Attached scsi generic sg131 type 0 [Thu Dec 12 07:47:00 2019][ 37.017410] sd 1:0:131:0: Attached scsi generic sg132 type 0 [Thu Dec 12 07:47:00 2019][ 37.017684] sd 1:0:132:0: Attached scsi generic sg133 type 0 [Thu Dec 12 07:47:00 2019][ 37.018213] sd 1:0:133:0: Attached scsi generic sg134 type 0 [Thu Dec 12 07:47:00 2019][ 37.018738] sd 1:0:134:0: Attached scsi generic sg135 type 0 [Thu Dec 12 07:47:00 2019][ 37.019243] sd 1:0:135:0: Attached scsi generic sg136 type 0 [Thu Dec 12 07:47:00 2019][ 37.019552] sd 1:0:136:0: Attached scsi generic sg137 type 0 [Thu Dec 12 07:47:00 2019][ 37.019912] sd 1:0:137:0: Attached scsi generic sg138 type 0 [Thu Dec 12 07:47:00 2019][ 37.020305] sd 1:0:138:0: Attached scsi generic sg139 type 0 [Thu Dec 12 07:47:00 2019][ 37.020772] sd 1:0:139:0: Attached scsi generic sg140 type 0 [Thu Dec 12 07:47:00 2019][ 37.021308] sd 1:0:140:0: Attached scsi generic sg141 type 0 [Thu Dec 12 07:47:00 2019][ 37.022015] sd 1:0:141:0: Attached scsi generic sg142 type 0 [Thu Dec 12 07:47:00 2019][ 37.022490] sd 1:0:142:0: Attached scsi generic sg143 type 0 [Thu Dec 12 07:47:00 2019][ 37.023941] sd 1:0:143:0: Attached scsi generic sg144 type 0 [Thu Dec 12 07:47:00 2019][ 37.024508] sd 1:0:144:0: Attached scsi generic sg145 type 0 [Thu Dec 12 07:47:00 2019][ 37.024962] sd 1:0:145:0: Attached scsi generic sg146 type 0 [Thu Dec 12 07:47:00 2019][ 37.025479] sd 1:0:146:0: Attached scsi generic sg147 type 0 [Thu Dec 12 07:47:00 2019][ 37.026153] sd 1:0:147:0: Attached scsi generic sg148 type 0 [Thu Dec 12 07:47:00 2019][ 37.027865] sd 1:0:148:0: Attached scsi generic sg149 type 0 [Thu Dec 12 07:47:00 2019][ 37.029509] sd 1:0:149:0: Attached scsi generic sg150 
type 0 [Thu Dec 12 07:47:00 2019][ 37.029938] sd 1:0:150:0: Attached scsi generic sg151 type 0 [Thu Dec 12 07:47:00 2019][ 37.030306] sd 1:0:151:0: Attached scsi generic sg152 type 0 [Thu Dec 12 07:47:00 2019][ 37.030735] sd 1:0:152:0: Attached scsi generic sg153 type 0 [Thu Dec 12 07:47:00 2019][ 37.031634] sd 1:0:153:0: Attached scsi generic sg154 type 0 [Thu Dec 12 07:47:00 2019][ 37.032651] sd 1:0:154:0: Attached scsi generic sg155 type 0 [Thu Dec 12 07:47:00 2019][ 37.034563] sd 1:0:155:0: Attached scsi generic sg156 type 0 [Thu Dec 12 07:47:00 2019][ 37.035246] sd 1:0:156:0: Attached scsi generic sg157 type 0 [Thu Dec 12 07:47:00 2019][ 37.038619] sd 1:0:157:0: Attached scsi generic sg158 type 0 [Thu Dec 12 07:47:00 2019][ 37.039195] sd 1:0:158:0: Attached scsi generic sg159 type 0 [Thu Dec 12 07:47:00 2019][ 37.039762] sd 1:0:159:0: Attached scsi generic sg160 type 0 [Thu Dec 12 07:47:00 2019][ 37.041537] sd 1:0:160:0: Attached scsi generic sg161 type 0 [Thu Dec 12 07:47:00 2019][ 37.042200] sd 1:0:161:0: Attached scsi generic sg162 type 0 [Thu Dec 12 07:47:00 2019][ 37.042658] sd 1:0:162:0: Attached scsi generic sg163 type 0 [Thu Dec 12 07:47:00 2019][ 37.043205] sd 1:0:163:0: Attached scsi generic sg164 type 0 [Thu Dec 12 07:47:00 2019][ 37.043588] sd 1:0:164:0: Attached scsi generic sg165 type 0 [Thu Dec 12 07:47:00 2019][ 37.045537] sd 1:0:165:0: Attached scsi generic sg166 type 0 [Thu Dec 12 07:47:00 2019][ 37.045905] sd 1:0:166:0: Attached scsi generic sg167 type 0 [Thu Dec 12 07:47:00 2019][ 37.048388] sd 1:0:167:0: Attached scsi generic sg168 type 0 [Thu Dec 12 07:47:00 2019][ 37.048975] sd 1:0:168:0: Attached scsi generic sg169 type 0 [Thu Dec 12 07:47:00 2019][ 37.049324] sd 1:0:169:0: Attached scsi generic sg170 type 0 [Thu Dec 12 07:47:00 2019][ 37.049718] sd 1:0:170:0: Attached scsi generic sg171 type 0 [Thu Dec 12 07:47:00 2019][ 37.051590] sd 1:0:171:0: Attached scsi generic sg172 type 0 [Thu Dec 12 07:47:00 2019][ 37.052123] sd 1:0:172:0: Attached scsi generic sg173 type 0 [Thu Dec 12 07:47:00 2019][ 37.052974] sd 1:0:173:0: Attached scsi generic sg174 type 0 [Thu Dec 12 07:47:00 2019][ 37.053620] sd 1:0:174:0: Attached scsi generic sg175 type 0 [Thu Dec 12 07:47:00 2019][ 37.054210] sd 1:0:175:0: Attached scsi generic sg176 type 0 [Thu Dec 12 07:47:00 2019][ 37.055171] sd 1:0:176:0: Attached scsi generic sg177 type 0 [Thu Dec 12 07:47:00 2019][ 37.056805] sd 1:0:177:0: Attached scsi generic sg178 type 0 [Thu Dec 12 07:47:00 2019][ 37.058321] sd 1:0:178:0: Attached scsi generic sg179 type 0 [Thu Dec 12 07:47:00 2019][ 37.059072] sd 1:0:179:0: Attached scsi generic sg180 type 0 [Thu Dec 12 07:47:00 2019][ 37.060651] sd 1:0:180:0: Attached scsi generic sg181 type 0 [Thu Dec 12 07:47:00 2019][ 37.061295] sd 1:0:181:0: Attached scsi generic sg182 type 0 [Thu Dec 12 07:47:00 2019][ 37.061942] sd 1:0:182:0: Attached scsi generic sg183 type 0 [Thu Dec 12 07:47:00 2019][ 37.062502] sd 1:0:183:0: Attached scsi generic sg184 type 0 [Thu Dec 12 07:47:00 2019][ 37.063037] scsi 1:0:184:0: Attached scsi generic sg185 type 13 [Thu Dec 12 07:47:00 2019][ 37.063584] sd 1:0:185:0: Attached scsi generic sg186 type 0 [Thu Dec 12 07:47:00 2019][ 37.069115] sd 1:0:186:0: Attached scsi generic sg187 type 0 [Thu Dec 12 07:47:00 2019][ 37.069631] sd 1:0:187:0: Attached scsi generic sg188 type 0 [Thu Dec 12 07:47:00 2019][ 37.070166] sd 1:0:188:0: Attached scsi generic sg189 type 0 [Thu Dec 12 07:47:00 2019][ 37.070926] sd 1:0:189:0: Attached scsi generic sg190 type 0 [Thu Dec 12 07:47:00 
2019][ 37.071767] sd 1:0:190:0: Attached scsi generic sg191 type 0 [Thu Dec 12 07:47:00 2019][ 37.072409] sd 1:0:191:0: Attached scsi generic sg192 type 0 [Thu Dec 12 07:47:00 2019][ 37.072871] sd 1:0:192:0: Attached scsi generic sg193 type 0 [Thu Dec 12 07:47:00 2019][ 37.077221] sd 1:0:193:0: Attached scsi generic sg194 type 0 [Thu Dec 12 07:47:00 2019][ 37.079186] sd 1:0:194:0: Attached scsi generic sg195 type 0 [Thu Dec 12 07:47:00 2019][ 37.079848] sd 1:0:195:0: Attached scsi generic sg196 type 0 [Thu Dec 12 07:47:00 2019][ 37.080391] sd 1:0:196:0: Attached scsi generic sg197 type 0 [Thu Dec 12 07:47:00 2019][ 37.080793] sd 1:0:197:0: Attached scsi generic sg198 type 0 [Thu Dec 12 07:47:00 2019][ 37.081371] sd 1:0:198:0: Attached scsi generic sg199 type 0 [Thu Dec 12 07:47:00 2019][ 37.081979] sd 1:0:199:0: Attached scsi generic sg200 type 0 [Thu Dec 12 07:47:00 2019][ 37.084148] sd 1:0:200:0: Attached scsi generic sg201 type 0 [Thu Dec 12 07:47:00 2019][ 37.084713] sd 1:0:201:0: Attached scsi generic sg202 type 0 [Thu Dec 12 07:47:00 2019][ 37.087158] sd 1:0:202:0: Attached scsi generic sg203 type 0 [Thu Dec 12 07:47:00 2019][ 37.087632] sd 1:0:203:0: Attached scsi generic sg204 type 0 [Thu Dec 12 07:47:00 2019][ 37.087986] sd 1:0:204:0: Attached scsi generic sg205 type 0 [Thu Dec 12 07:47:00 2019][ 37.089869] sd 1:0:205:0: Attached scsi generic sg206 type 0 [Thu Dec 12 07:47:00 2019][ 37.090301] sd 1:0:206:0: Attached scsi generic sg207 type 0 [Thu Dec 12 07:47:00 2019][ 37.090706] sd 1:0:207:0: Attached scsi generic sg208 type 0 [Thu Dec 12 07:47:00 2019][ 37.090990] sd 1:0:208:0: Attached scsi generic sg209 type 0 [Thu Dec 12 07:47:00 2019][ 37.091256] sd 1:0:209:0: Attached scsi generic sg210 type 0 [Thu Dec 12 07:47:00 2019][ 37.091713] sd 1:0:210:0: Attached scsi generic sg211 type 0 [Thu Dec 12 07:47:00 2019][ 37.093191] sd 1:0:211:0: Attached scsi generic sg212 type 0 [Thu Dec 12 07:47:00 2019][ 37.093496] sd 1:0:212:0: Attached scsi generic sg213 type 0 [Thu Dec 12 07:47:00 2019][ 37.093680] sd 1:0:213:0: Attached scsi generic sg214 type 0 [Thu Dec 12 07:47:00 2019][ 37.094022] sd 1:0:214:0: Attached scsi generic sg215 type 0 [Thu Dec 12 07:47:00 2019][ 37.095939] sd 1:0:215:0: Attached scsi generic sg216 type 0 [Thu Dec 12 07:47:01 2019][ 37.096423] sd 1:0:216:0: Attached scsi generic sg217 type 0 [Thu Dec 12 07:47:01 2019][ 37.097243] sd 1:0:217:0: Attached scsi generic sg218 type 0 [Thu Dec 12 07:47:01 2019][ 37.098022] sd 1:0:218:0: Attached scsi generic sg219 type 0 [Thu Dec 12 07:47:01 2019][ 37.098650] sd 1:0:219:0: Attached scsi generic sg220 type 0 [Thu Dec 12 07:47:01 2019][ 37.100715] sd 1:0:220:0: Attached scsi generic sg221 type 0 [Thu Dec 12 07:47:01 2019][ 37.102699] sd 1:0:221:0: Attached scsi generic sg222 type 0 [Thu Dec 12 07:47:01 2019][ 37.103363] sd 1:0:222:0: Attached scsi generic sg223 type 0 [Thu Dec 12 07:47:01 2019][ 37.103981] sd 1:0:223:0: Attached scsi generic sg224 type 0 [Thu Dec 12 07:47:01 2019][ 37.104658] sd 1:0:224:0: Attached scsi generic sg225 type 0 [Thu Dec 12 07:47:01 2019][ 37.104995] sd 1:0:225:0: Attached scsi generic sg226 type 0 [Thu Dec 12 07:47:01 2019][ 37.105396] sd 1:0:226:0: Attached scsi generic sg227 type 0 [Thu Dec 12 07:47:01 2019][ 37.105854] sd 1:0:227:0: Attached scsi generic sg228 type 0 [Thu Dec 12 07:47:01 2019][ 37.106478] sd 1:0:228:0: Attached scsi generic sg229 type 0 [Thu Dec 12 07:47:01 2019][ 37.106919] sd 1:0:229:0: Attached scsi generic sg230 type 0 [Thu Dec 12 07:47:01 2019][ 37.108987] sd 1:0:230:0: 
Attached scsi generic sg231 type 0 [Thu Dec 12 07:47:01 2019][ 37.109153] sd 1:0:231:0: Attached scsi generic sg232 type 0 [Thu Dec 12 07:47:01 2019][ 37.109355] sd 1:0:232:0: Attached scsi generic sg233 type 0 [Thu Dec 12 07:47:01 2019][ 37.109675] sd 1:0:233:0: Attached scsi generic sg234 type 0 [Thu Dec 12 07:47:01 2019][ 37.111228] sd 1:0:234:0: Attached scsi generic sg235 type 0 [Thu Dec 12 07:47:01 2019][ 37.113161] sd 1:0:235:0: Attached scsi generic sg236 type 0 [Thu Dec 12 07:47:01 2019][ 37.113768] sd 1:0:236:0: Attached scsi generic sg237 type 0 [Thu Dec 12 07:47:01 2019][ 37.114456] sd 1:0:237:0: Attached scsi generic sg238 type 0 [Thu Dec 12 07:47:01 2019][ 37.115814] sd 1:0:238:0: Attached scsi generic sg239 type 0 [Thu Dec 12 07:47:01 2019][ 37.116512] sd 1:0:239:0: Attached scsi generic sg240 type 0 [Thu Dec 12 07:47:01 2019][ 37.116822] device-mapper: uevent: version 1.0.3 [Thu Dec 12 07:47:01 2019][ 37.117860] sd 1:0:240:0: Attached scsi generic sg241 type 0 [Thu Dec 12 07:47:01 2019][ 37.118095] sd 1:0:241:0: Attached scsi generic sg242 type 0 [Thu Dec 12 07:47:01 2019][ 37.118113] device-mapper: ioctl: 4.37.1-ioctl (2018-04-03) initialised: dm-devel@redhat.com [Thu Dec 12 07:47:01 2019][ 37.119607] sd 1:0:242:0: Attached scsi generic sg243 type 0 [Thu Dec 12 07:47:01 2019][ 37.120254] sd 1:0:243:0: Attached scsi generic sg244 type 0 [Thu Dec 12 07:47:01 2019][ 37.120965] sd 1:0:244:0: Attached scsi generic sg245 type 0 [Thu Dec 12 07:47:01 2019][ 37.233997] ccp 0000:02:00.2: Queue 4 can access 4 LSB regions [Thu Dec 12 07:47:01 2019][ 37.233999] ccp 0000:02:00.2: Queue 0 gets LSB 4 [Thu Dec 12 07:47:01 2019][ 37.234000] ccp 0000:02:00.2: Queue 1 gets LSB 5 [Thu Dec 12 07:47:01 2019][ 37.234001] ccp 0000:02:00.2: Queue 2 gets LSB 6 [Thu Dec 12 07:47:01 2019][ 37.234093] ipmi_si IPI0001:00: ipmi_platform: probing via ACPI [Thu Dec 12 07:47:01 2019][ 37.234295] ipmi_si IPI0001:00: [io 0x0ca8] regsize 1 spacing 4 irq 10 [Thu Dec 12 07:47:01 2019][ 37.234299] ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI [Thu Dec 12 07:47:01 2019][ 37.234301] ipmi_si: Adding ACPI-specified kcs state machine [Thu Dec 12 07:47:01 2019][ 37.234846] ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca8, slave address 0x20, irq 10 [Thu Dec 12 07:47:01 2019][ OK ] Started udev Coldplug all Devices. [Thu Dec 12 07:47:01 2019] Starting Device-Mapper Multipath Device Controller... [Thu Dec 12 07:47:01 2019][ 38.525572] ccp 0000:02:00.2: enabled [Thu Dec 12 07:47:01 2019][ 38.530155] ccp 0000:03:00.1: 5 command queues available [Thu Dec 12 07:47:01 2019][ 38.539211] ccp 0000:03:00.1: Queue 0 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.545053] ccp 0000:03:00.1: Queue 1 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.547871] ipmi_si IPI0001:00: The BMC does not support setting the recv irq bit, compensating, but the BMC needs to be fixed. 
[Thu Dec 12 07:47:01 2019][ 38.555112] ipmi_si IPI0001:00: Using irq 10 [Thu Dec 12 07:47:01 2019][ 38.566661] ccp 0000:03:00.1: Queue 2 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.572536] ccp 0000:03:00.1: Queue 3 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.578394] ccp 0000:03:00.1: Queue 4 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ OK ] Started Device-Mapper Multipath Device Controller. [Thu Dec 12 07:47:01 2019][ 38.585677] ccp 0000:03:00.1: Queue 0 gets LSB 1 [Thu Dec 12 07:47:01 2019][ 38.591714] ccp 0000:03:00.1: Queue 1 gets LSB 2 [Thu Dec 12 07:47:01 2019][ 38.597727] ccp 0000:03:00.1: Queue 2 gets LSB 3 [Thu Dec 12 07:47:01 2019][ 38.603742] ccp 0000:03:00.1: Queue 3 gets LSB 4 [Thu Dec 12 07:47:01 2019][ 38.608925] ccp 0000:03:00.1: Queue 4 gets LSB 5 [Thu Dec 12 07:47:01 2019][ 38.616974] ccp 0000:03:00.1: enabled [Thu Dec 12 07:47:01 2019][ 38.620943] ccp 0000:41:00.2: 3 command queues available [Thu Dec 12 07:47:01 2019][ 38.626445] ccp 0000:41:00.2: Queue 2 can access 4 LSB regions [Thu Dec 12 07:47:01 2019][ 38.627248] ipmi_si IPI0001:00: Found new BMC (man_id: 0x0002a2, prod_id: 0x0100, dev_id: 0x20) [Thu Dec 12 07:47:01 2019][ 38.641026] ccp 0000:41:00.2: Queue 3 can access 4 LSB regions [Thu Dec 12 07:47:01 2019][ 38.646876] ccp 0000:41:00.2: Queue 4 can access 4 LSB regions [Thu Dec 12 07:47:01 2019][ 38.652725] ccp 0000:41:00.2: Queue 0 gets LSB 4 [Thu Dec 12 07:47:01 2019][ 38.657359] ccp 0000:41:00.2: Queue 1 gets LSB 5 [Thu Dec 12 07:47:01 2019][ 38.661989] ccp 0000:41:00.2: Queue 2 gets LSB 6 [Thu Dec 12 07:47:01 2019][ 38.668935] ccp 0000:41:00.2: enabled [Thu Dec 12 07:47:01 2019][ 38.672896] ccp 0000:42:00.1: 5 command queues available [Thu Dec 12 07:47:01 2019][ 38.678310] ccp 0000:42:00.1: Queue 0 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.684158] ccp 0000:42:00.1: Queue 1 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.690008] ccp 0000:42:00.1: Queue 2 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.695869] ccp 0000:42:00.1: Queue 3 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.701706] ccp 0000:42:00.1: Queue 4 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.707539] ccp 0000:42:00.1: Queue 0 gets LSB 1 [Thu Dec 12 07:47:01 2019][ 38.712158] ccp 0000:42:00.1: Queue 1 gets LSB 2 [Thu Dec 12 07:47:01 2019][ 38.713526] ipmi_si IPI0001:00: IPMI kcs interface initialized [Thu Dec 12 07:47:01 2019][ 38.722628] ccp 0000:42:00.1: Queue 2 gets LSB 3 [Thu Dec 12 07:47:01 2019][ 38.727257] ccp 0000:42:00.1: Queue 3 gets LSB 4 [Thu Dec 12 07:47:01 2019][ 38.731883] ccp 0000:42:00.1: Queue 4 gets LSB 5 [Thu Dec 12 07:47:01 2019][ 38.740116] ccp 0000:42:00.1: enabled [Thu Dec 12 07:47:01 2019][ 38.744074] ccp 0000:85:00.2: 3 command queues available [Thu Dec 12 07:47:01 2019][ 38.749620] ccp 0000:85:00.2: Queue 2 can access 4 LSB regions [Thu Dec 12 07:47:01 2019][ 38.755534] ccp 0000:85:00.2: Queue 3 can access 4 LSB regions [Thu Dec 12 07:47:01 2019][ 38.761500] ccp 0000:85:00.2: Queue 4 can access 4 LSB regions [Thu Dec 12 07:47:01 2019][ 38.767541] ccp 0000:85:00.2: Queue 0 gets LSB 4 [Thu Dec 12 07:47:01 2019][ 38.772252] ccp 0000:85:00.2: Queue 1 gets LSB 5 [Thu Dec 12 07:47:01 2019][ 38.777029] ccp 0000:85:00.2: Queue 2 gets LSB 6 [Thu Dec 12 07:47:01 2019][ 38.782961] ccp 0000:85:00.2: enabled [Thu Dec 12 07:47:01 2019][ 38.787025] ccp 0000:86:00.1: 5 command queues available [Thu Dec 12 07:47:01 2019][ 38.792486] ccp 0000:86:00.1: Queue 0 can access 7 LSB regions [Thu Dec 12 07:47:01 
2019][ 38.798362] ccp 0000:86:00.1: Queue 1 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.804245] ccp 0000:86:00.1: Queue 2 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.810105] ccp 0000:86:00.1: Queue 3 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.815980] ccp 0000:86:00.1: Queue 4 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.821871] ccp 0000:86:00.1: Queue 0 gets LSB 1 [Thu Dec 12 07:47:01 2019][ 38.826534] ccp 0000:86:00.1: Queue 1 gets LSB 2 [Thu Dec 12 07:47:01 2019][ 38.831200] ccp 0000:86:00.1: Queue 2 gets LSB 3 [Thu Dec 12 07:47:01 2019][ 38.835890] ccp 0000:86:00.1: Queue 3 gets LSB 4 [Thu Dec 12 07:47:01 2019][ 38.840559] ccp 0000:86:00.1: Queue 4 gets LSB 5 [Thu Dec 12 07:47:01 2019][ 38.849752] ccp 0000:86:00.1: enabled [Thu Dec 12 07:47:01 2019][ 38.853728] ccp 0000:c2:00.2: 3 command queues available [Thu Dec 12 07:47:01 2019][ 38.859301] ccp 0000:c2:00.2: Queue 2 can access 4 LSB regions [Thu Dec 12 07:47:01 2019][ 38.865143] ccp 0000:c2:00.2: Queue 3 can access 4 LSB regions [Thu Dec 12 07:47:01 2019][ 38.870981] ccp 0000:c2:00.2: Queue 4 can access 4 LSB regions [Thu Dec 12 07:47:01 2019][ 38.876829] ccp 0000:c2:00.2: Queue 0 gets LSB 4 [Thu Dec 12 07:47:01 2019][ 38.881460] ccp 0000:c2:00.2: Queue 1 gets LSB 5 [Thu Dec 12 07:47:01 2019][ 38.886088] ccp 0000:c2:00.2: Queue 2 gets LSB 6 [Thu Dec 12 07:47:01 2019][ 38.897883] ccp 0000:c2:00.2: enabled [Thu Dec 12 07:47:01 2019][ 38.901818] ccp 0000:c3:00.1: 5 command queues available [Thu Dec 12 07:47:01 2019][ 38.907298] ccp 0000:c3:00.1: Queue 0 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.913140] ccp 0000:c3:00.1: Queue 1 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.918994] ccp 0000:c3:00.1: Queue 2 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.924842] ccp 0000:c3:00.1: Queue 3 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.930691] ccp 0000:c3:00.1: Queue 4 can access 7 LSB regions [Thu Dec 12 07:47:01 2019][ 38.936543] ccp 0000:c3:00.1: Queue 0 gets LSB 1 [Thu Dec 12 07:47:01 2019][ 38.941177] ccp 0000:c3:00.1: Queue 1 gets LSB 2 [Thu Dec 12 07:47:01 2019][ 38.945811] ccp 0000:c3:00.1: Queue 2 gets LSB 3 [Thu Dec 12 07:47:01 2019][ 38.950441] ccp 0000:c3:00.1: Queue 3 gets LSB 4 [Thu Dec 12 07:47:01 2019][ 38.955071] ccp 0000:c3:00.1: Queue 4 gets LSB 5 [Thu Dec 12 07:47:01 2019][ 38.969041] ccp 0000:c3:00.1: enabled [Thu Dec 12 07:47:07 2019][ 44.611193] device-mapper: multipath service-time: version 0.3.0 loaded [Thu Dec 12 07:47:07 2019][ 44.741374] input: PC Speaker as /devices/platform/pcspkr/input/input2 [Thu Dec 12 07:47:07 2019][ OK ] Found device /dev/ttyS0. [Thu Dec 12 07:47:07 2019][ 44.809911] cryptd: max_cpu_qlen set to 1000 [Thu Dec 12 07:47:07 2019][ 44.825898] AVX2 version of gcm_enc/dec engaged. [Thu Dec 12 07:47:07 2019][ 44.830517] AES CTR mode by8 optimization enabled [Thu Dec 12 07:47:07 2019][ 44.837919] alg: No test for __gcm-aes-aesni (__driver-gcm-aes-aesni) [Thu Dec 12 07:47:07 2019][ 44.844474] alg: No test for __generic-gcm-aes-aesni (__driver-generic-gcm-aes-aesni) [Thu Dec 12 07:47:07 2019][ 44.897377] kvm: Nested Paging enabled [Thu Dec 12 07:47:07 2019][ 44.903836] MCE: In-kernel MCE decoding enabled. [Thu Dec 12 07:47:07 2019][ 44.911732] AMD64 EDAC driver v3.4.0 [Thu Dec 12 07:47:07 2019][ 44.915332] EDAC amd64: DRAM ECC enabled. [Thu Dec 12 07:47:07 2019][ 44.919347] EDAC amd64: F17h detected (node 0). 
[Thu Dec 12 07:47:07 2019][ 44.923937] EDAC amd64: MC: 0: 0MB 1: 0MB [Thu Dec 12 07:47:07 2019][ 44.928645] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Thu Dec 12 07:47:07 2019][ 44.933352] EDAC amd64: MC: 4: 0MB 5: 0MB [Thu Dec 12 07:47:07 2019][ 44.938066] EDAC amd64: MC: 6: 0MB 7: 0MB [Thu Dec 12 07:47:07 2019][ 44.942786] EDAC amd64: MC: 0: 0MB 1: 0MB [Thu Dec 12 07:47:07 2019][ 44.947497] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Thu Dec 12 07:47:07 2019][ 44.952211] EDAC amd64: MC: 4: 0MB 5: 0MB [Thu Dec 12 07:47:07 2019][ 44.956926] EDAC amd64: MC: 6: 0MB 7: 0MB [Thu Dec 12 07:47:07 2019][ 44.961641] EDAC amd64: using x8 syndromes. [Thu Dec 12 07:47:07 2019][ 44.965835] EDAC amd64: MCT channel count: 2 [Thu Dec 12 07:47:07 2019][ 44.970255] EDAC MC0: Giving out device to 'amd64_edac' 'F17h': DEV 0000:00:18.3 [Thu Dec 12 07:47:07 2019][ 44.977664] EDAC amd64: DRAM ECC enabled. [Thu Dec 12 07:47:07 2019][ 44.981680] EDAC amd64: F17h detected (node 1). [Thu Dec 12 07:47:07 2019][ 44.986265] EDAC amd64: MC: 0: 0MB 1: 0MB [Thu Dec 12 07:47:07 2019][ 44.990977] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Thu Dec 12 07:47:07 2019][ 44.995694] EDAC amd64: MC: 4: 0MB 5: 0MB [Thu Dec 12 07:47:07 2019][ 45.000405] EDAC amd64: MC: 6: 0MB 7: 0MB [Thu Dec 12 07:47:07 2019][ 45.005117] EDAC amd64: MC: 0: 0MB 1: 0MB [Thu Dec 12 07:47:07 2019][ 45.009827] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Thu Dec 12 07:47:07 2019][ 45.014544] EDAC amd64: MC: 4: 0MB 5: 0MB [Thu Dec 12 07:47:07 2019][ 45.019257] EDAC amd64: MC: 6: 0MB 7: 0MB [Thu Dec 12 07:47:07 2019][ 45.023972] EDAC amd64: using x8 syndromes. [Thu Dec 12 07:47:07 2019][ 45.028165] EDAC amd64: MCT channel count: 2 [Thu Dec 12 07:47:07 2019][ 45.032582] EDAC MC1: Giving out device to 'amd64_edac' 'F17h': DEV 0000:00:19.3 [Thu Dec 12 07:47:07 2019][ 45.039986] EDAC amd64: DRAM ECC enabled. [Thu Dec 12 07:47:07 2019][ 45.044000] EDAC amd64: F17h detected (node 2). [Thu Dec 12 07:47:07 2019][ 45.048589] EDAC amd64: MC: 0: 0MB 1: 0MB [Thu Dec 12 07:47:07 2019][ 45.053300] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Thu Dec 12 07:47:07 2019][ 45.058015] EDAC amd64: MC: 4: 0MB 5: 0MB [Thu Dec 12 07:47:07 2019][ 45.062727] EDAC amd64: MC: 6: 0MB 7: 0MB [Thu Dec 12 07:47:07 2019][ 45.067438] EDAC amd64: MC: 0: 0MB 1: 0MB [Thu Dec 12 07:47:07 2019][ 45.072150] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Thu Dec 12 07:47:07 2019][ 45.076865] EDAC amd64: MC: 4: 0MB 5: 0MB [Thu Dec 12 07:47:07 2019][ 45.081578] EDAC amd64: MC: 6: 0MB 7: 0MB [Thu Dec 12 07:47:07 2019][ 45.086284] EDAC amd64: using x8 syndromes. [Thu Dec 12 07:47:07 2019][ 45.090470] EDAC amd64: MCT channel count: 2 [Thu Dec 12 07:47:07 2019][ 45.094864] EDAC MC2: Giving out device to 'amd64_edac' 'F17h': DEV 0000:00:1a.3 [Thu Dec 12 07:47:07 2019][ 45.102263] EDAC amd64: DRAM ECC enabled. [Thu Dec 12 07:47:07 2019][ 45.106279] EDAC amd64: F17h detected (node 3). [Thu Dec 12 07:47:07 2019][ 45.110867] EDAC amd64: MC: 0: 0MB 1: 0MB [Thu Dec 12 07:47:07 2019][ 45.115581] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Thu Dec 12 07:47:07 2019][ 45.120293] EDAC amd64: MC: 4: 0MB 5: 0MB [Thu Dec 12 07:47:07 2019][ 45.125007] EDAC amd64: MC: 6: 0MB 7: 0MB [Thu Dec 12 07:47:07 2019][ 45.129717] EDAC amd64: MC: 0: 0MB 1: 0MB [Thu Dec 12 07:47:07 2019][ 45.134429] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Thu Dec 12 07:47:07 2019][ 45.139144] EDAC amd64: MC: 4: 0MB 5: 0MB [Thu Dec 12 07:47:07 2019][ 45.143860] EDAC amd64: MC: 6: 0MB 7: 0MB [Thu Dec 12 07:47:07 2019][ 45.148570] EDAC amd64: using x8 syndromes. 
[Thu Dec 12 07:47:07 2019][ 45.152759] EDAC amd64: MCT channel count: 2 [Thu Dec 12 07:47:07 2019][ 45.157164] EDAC MC3: Giving out device to 'amd64_edac' 'F17h': DEV 0000:00:1b.3 [Thu Dec 12 07:47:07 2019][ 45.164578] EDAC PCI0: Giving out device to module 'amd64_edac' controller 'EDAC PCI controller': DEV '0000:00:18.0' (POLLED) [Thu Dec 12 07:47:08 2019][ 45.922337] dcdbas dcdbas: Dell Systems Management Base Driver (version 5.6.0-3.3) [Thu Dec 12 07:47:08 2019][ 46.031170] ses 1:0:0:0: Attached Enclosure device [Thu Dec 12 07:47:08 2019][ 46.036036] ses 1:0:1:0: Attached Enclosure device [Thu Dec 12 07:47:08 2019][ 46.040924] ses 1:0:62:0: Attached Enclosure device [Thu Dec 12 07:47:08 2019][ 46.045867] ses 1:0:123:0: Attached Enclosure device [Thu Dec 12 07:47:08 2019][ 46.050917] ses 1:0:184:0: Attached Enclosure device [Thu Dec 12 07:47:51 2019][ OK ] Found device PERC_H330_Mini EFI\x20System\x20Partition. [Thu Dec 12 07:47:51 2019] Mounting /boot/efi... [Thu Dec 12 07:47:51 2019][ 89.203497] Adding 4194300k swap on /dev/sda3. Priority:-2 extents:1 across:4194300k FS [Thu Dec 12 07:47:51 2019][ OK ] Found device PERC_H330_Mini 3. [Thu Dec 12 07:47:51 2019][ 89.216906] FAT-fs (sda1): Volume was not properly unmounted. Some data may be corrupt. Please run fsck. [Thu Dec 12 07:47:51 2019] Activating swap /dev/disk/by-uuid/4...7-253b-4b35-98bd-0ebd94f347e5... [Thu Dec 12 07:47:51 2019][ OK ] Activated swap /dev/disk/by-uuid/401ce0e7-253b-4b35-98bd-0ebd94f347e5. [Thu Dec 12 07:47:51 2019][ OK ] Reached target Swap. [Thu Dec 12 07:47:51 2019][ OK ] Mounted /boot/efi. [Thu Dec 12 07:47:51 2019][ OK ] Reached target Local File Systems. [Thu Dec 12 07:47:51 2019] Starting Import network configuration from initramfs... [Thu Dec 12 07:47:51 2019][ 89.254833] type=1305 audit(1576165671.743:3): audit_pid=49120 old=0 auid=4294967295 ses=4294967295 res=1 [Thu Dec 12 07:47:51 2019] Starting Tell Plymouth To Write Out Runtime Data... [Thu Dec 12 07:47:51 2019] Starting Preprocess NFS configuration... [Thu Dec 12 07:47:51 2019][ 89.276647] RPC: Registered named UNIX socket transport module. [Thu Dec 12 07:47:51 2019][ 89.282697] RPC: Registered udp transport module. [Thu Dec 12 07:47:52 2019][ 89.288789] RPC: Registered tcp transport module. [Thu Dec 12 07:47:52 2019][ 89.294881] RPC: Registered tcp NFSv4.1 backchannel transport module. [Thu Dec 12 07:47:52 2019][ OK ] Started Preprocess NFS configuration. [Thu Dec 12 07:47:52 2019][ OK ] Started Tell Plymouth To Write Out Runtime Data. [Thu Dec 12 07:47:52 2019][ OK ] Started Import network configuration from initramfs. [Thu Dec 12 07:47:52 2019] Starting Create Volatile Files and Directories... [Thu Dec 12 07:47:52 2019][ OK ] Started Create Volatile Files and Directories. [Thu Dec 12 07:47:52 2019] Starting Security Auditing Service... [Thu Dec 12 07:47:52 2019] Mounting RPC Pipe File System... [Thu Dec 12 07:47:52 2019][ OK ] Mounted RPC Pipe File System. [Thu Dec 12 07:47:52 2019][ OK ] Reached target rpc_pipefs.target. [Thu Dec 12 07:47:52 2019][ OK ] Started Security Auditing Service. [Thu Dec 12 07:47:52 2019] Starting Update UTMP about System Boot/Shutdown... [Thu Dec 12 07:47:52 2019][ OK ] Started Update UTMP about System Boot/Shutdown. [Thu Dec 12 07:47:52 2019][ OK ] Reached target System Initialization. [Thu Dec 12 07:47:52 2019][ OK ] Listening on D-Bus System Message Bus Socket. [Thu Dec 12 07:47:52 2019][ OK ] Listening on RPCbind Server Activation Socket. [Thu Dec 12 07:47:52 2019][ OK ] Reached target Sockets. 
[Thu Dec 12 07:47:52 2019][ OK ] Reached target Basic System. [Thu Dec 12 07:47:52 2019] Starting Dump dmesg to /var/log/dmesg... [Thu Dec 12 07:47:52 2019] Starting NTP client/server... [Thu Dec 12 07:47:52 2019][ OK ] Started irqbalance daemon. [Thu Dec 12 07:47:52 2019] Starting Resets System Activity Logs... [Thu Dec 12 07:47:52 2019] Starting Authorization Manager... [Thu Dec 12 07:47:52 2019] Starting Systems Management Event Management... [Thu Dec 12 07:47:52 2019] Starting Systems Management Device Drivers... [Thu Dec 12 07:47:52 2019] Starting Load CPU microcode update... [Thu Dec 12 07:47:52 2019][ OK ] Started D-Bus System Message Bus. [Thu Dec 12 07:47:52 2019][ OK ] Started Self Monitoring and Reporting Technology (SMART) Daemon. [Thu Dec 12 07:47:52 2019] Starting GSSAPI Proxy Daemon... [Thu Dec 12 07:47:52 2019] Starting System Security Services Daemon... [Thu Dec 12 07:47:52 2019] Starting Software RAID monitoring and management... [Thu Dec 12 07:47:52 2019] Starting RPC bind service... [Thu Dec 12 07:47:52 2019] Starting openibd - configure Mellanox devices... [Thu Dec 12 07:47:52 2019][ OK ] Started Daily Cleanup of Temporary Directories. [Thu Dec 12 07:47:52 2019][ OK ] Reached target Timers. [Thu Dec 12 07:47:52 2019][ OK ] Started Systems Management Event Management. [Thu Dec 12 07:47:52 2019][ OK ] Started Software RAID monitoring and management. [Thu Dec 12 07:47:52 2019][ OK ] Started Authorization Manager. [Thu Dec 12 07:47:52 2019][ OK ] Started RPC bind service. [Thu Dec 12 07:47:52 2019][ OK ] Started GSSAPI Proxy Daemon. [Thu Dec 12 07:47:52 2019][ OK ] Started Resets System Activity Logs. [Thu Dec 12 07:47:52 2019][ OK ] Reached target NFS client services. [Thu Dec 12 07:47:52 2019][ OK ] Started Dump dmesg to /var/log/dmesg. [Thu Dec 12 07:47:52 2019][ OK ] Started Load CPU microcode update. [Thu Dec 12 07:47:52 2019][ OK ] Started NTP client/server. [Thu Dec 12 07:47:52 2019][ OK ] Started System Security Services Daemon. [Thu Dec 12 07:47:52 2019][ OK ] Reached target User and Group Name Lookups. [Thu Dec 12 07:47:52 2019] Starting Login Service... [Thu Dec 12 07:47:52 2019][ OK ] Started Login Service. [Thu Dec 12 07:47:52 2019][ 89.933427] mlx5_core 0000:01:00.0: slow_pci_heuristic:5575:(pid 49399): Max link speed = 100000, PCI BW = 126016 [Thu Dec 12 07:47:52 2019][ 89.943754] mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0) [Thu Dec 12 07:47:52 2019][ 89.952049] mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0) [Thu Dec 12 07:47:52 2019][ OK ] Created slice system-mlnx_interface_mgr.slice. [Thu Dec 12 07:47:52 2019][ OK ] Started mlnx_interface_mgr - configure ib0. [Thu Dec 12 07:47:52 2019][ OK ] Started openibd - configure Mellanox devices. [Thu Dec 12 07:47:52 2019] Starting LSB: Bring up/down networking... [Thu Dec 12 07:47:52 2019][ OK ] Reached target Remote File Systems (Pre). 
[Thu Dec 12 07:48:00 2019][ 90.536681] IPv6: ADDRCONF(NETDEV_UP): em1: link is not ready [Thu Dec 12 07:48:00 2019][ 94.109264] tg3 0000:81:00.0 em1: Link is up at 1000 Mbps, full duplex [Thu Dec 12 07:48:00 2019][ 94.115819] tg3 0000:81:00.0 em1: Flow control is on for TX and on for RX [Thu Dec 12 07:48:00 2019][ 94.122627] tg3 0000:81:00.0 em1: EEE is enabled [Thu Dec 12 07:48:00 2019][ 94.127280] IPv6: ADDRCONF(NETDEV_CHANGE): em1: link becomes ready [Thu Dec 12 07:48:00 2019][ 94.891924] IPv6: ADDRCONF(NETDEV_UP): ib0: link is not ready [Thu Dec 12 07:48:00 2019][ 95.178052] IPv6: ADDRCONF(NETDEV_CHANGE): ib0: link becomes ready [Thu Dec 12 07:48:02 2019][ OK ] Started LSB: Bring up/down networking. [Thu Dec 12 07:48:02 2019][ OK ] Reached target Network. [Thu Dec 12 07:48:02 2019] Starting Dynamic System Tuning Daemon... [Thu Dec 12 07:48:02 2019][ 99.308408] FS-Cache: Loaded [Thu Dec 12 07:48:02 2019][ OK ] Reached target Network is Online. [Thu Dec 12 07:48:02 2019] Starting Notify NFS peers of a restart... [Thu Dec 12 07:48:02 2019] Starting Collectd statistics daemon... [Thu Dec 12 07:48:02 2019] Mounting /share... [Thu Dec 12 07:48:02 2019] Starting OpenSSH server daemon... [Thu Dec 12 07:48:02 2019] Starting Postfix Mail Transport Agent... [Thu Dec 12 07:48:02 2019] Starting System Logging Service... [Thu Dec 12 07:48:02 2019][ 99.338586] FS-Cache: Netfs 'nfs' registered for caching [Thu Dec 12 07:48:02 2019][ OK ] Started Notify NFS peers of a restart. [Thu Dec 12 07:48:02 2019][ 99.348331] Key type dns_resolver registered [Thu Dec 12 07:48:02 2019][ OK ] Started Collectd statistics daemon. [Thu Dec 12 07:48:02 2019][ OK ] Started System Logging Service. [Thu Dec 12 07:48:02 2019] Starting xcat service on compute no...script and update node status... [Thu Dec 12 07:48:02 2019][ OK ] Started OpenSSH server daemon. [Thu Dec 12 07:48:02 2019][ 99.377021] NFS: Registering the id_resolver key type [Thu Dec 12 07:48:02 2019][ 99.383481] Key type id_resolver registered [Thu Dec 12 07:48:02 2019][ 99.389056] Key type id_legacy registered [Thu Dec 12 07:48:02 2019][ OK ] Started xcat service on compute nod...otscript and update node status. [Thu Dec 12 07:48:02 2019][ OK ] Mounted /share. [Thu Dec 12 07:48:02 2019][ OK ] Reached target Remote File Systems. [Thu Dec 12 07:48:02 2019] Starting Crash recovery kernel arming... [Thu Dec 12 07:48:02 2019] Starting Permit User Sessions... [Thu Dec 12 07:48:02 2019][ OK ] Started Permit User Sessions. [Thu Dec 12 07:48:02 2019] Starting Terminate Plymouth Boot Screen... [Thu Dec 12 07:48:02 2019] Starting Wait for Plymouth Boot Screen to Quit... [Thu Dec 12 07:48:02 2019][ OK ] Started Command Scheduler. [Thu Dec 12 07:48:02 2019][ OK ] Started Lookout metrics collector.
[Thu Dec 12 07:48:08 2019] [Thu Dec 12 07:48:08 2019]CentOS Linux 7 (Core) [Thu Dec 12 07:48:08 2019]Kernel 3.10.0-957.27.2.el7_lustre.pl2.x86_64 on an x86_64 [Thu Dec 12 07:48:08 2019] [Thu Dec 12 07:48:08 2019]fir-io8-s1 login: [-- root@localhost detached -- Thu Dec 12 07:48:54 2019] [-- root@localhost attached -- Thu Dec 12 07:49:05 2019] [ 162.984151] LNet: HW NUMA nodes: 4, HW CPU cores: 48, npartitions: 4 [Thu Dec 12 07:49:05 2019][ 162.991711] alg: No test for adler32 (adler32-zlib) [Thu Dec 12 07:49:06 2019][ 163.792143] Lustre: Lustre: Build Version: 2.12.3_4_g142b4d4 [Thu Dec 12 07:49:06 2019][ 163.897622] LNet: 63409:0:(config.c:1627:lnet_inet_enumerate()) lnet: Ignoring interface em2: it's down [Thu Dec 12 07:49:06 2019][ 163.907421] LNet: Using FastReg for registration [Thu Dec 12 07:49:06 2019][ 163.924649] LNet: Added LNI 10.0.10.115@o2ib7 [8/256/0/180] [Thu Dec 12 07:50:53 2019][ 270.897846] LNetError: 63453:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Timed out tx: active_txs, 0 seconds [Thu Dec 12 07:50:53 2019][ 270.908017] LNetError: 63453:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Timed out RDMA with 10.0.10.53@o2ib7 (76): c: 8, oc: 0, rc: 8 [Thu Dec 12 07:51:30 2019][ 307.898023] LNetError: 63453:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Timed out tx: active_txs, 0 seconds [Thu Dec 12 07:51:30 2019][ 307.908193] LNetError: 63453:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Skipped 1 previous similar message [Thu Dec 12 07:51:30 2019][ 307.918277] LNetError: 63453:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Timed out RDMA with 10.0.10.201@o2ib7 (6): c: 7, oc: 0, rc: 8 [Thu Dec 12 07:51:30 2019][ 307.930265] LNetError: 63453:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Skipped 1 previous similar message [Thu Dec 12 07:51:31 2019][ 308.898022] LNetError: 63453:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Timed out tx: active_txs, 1 seconds [Thu Dec 12 07:51:31 2019][ 308.908187] LNetError: 63453:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Skipped 1 previous similar message [Thu Dec 12 07:51:31 2019][ 308.918282] LNetError: 63453:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Timed out RDMA with 10.0.10.204@o2ib7 (7): c: 7, oc: 0, rc: 8 [Thu Dec 12 07:51:31 2019][ 308.930271] LNetError: 63453:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Skipped 2 previous similar messages [Thu Dec 12 07:51:32 2019][ 309.939186] LNetError: 63463:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.203@o2ib7 added to recovery queue. Health = 900 [Thu Dec 12 07:53:35 2019][ 432.685184] md: md8 stopped. 
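Editor's note: the LNetError lines above ("Timed out tx" / "Timed out RDMA with <nid>") follow a fixed format, so a saved copy of this console log can be summarized to see which LNet peers keep timing out. A minimal Python sketch, assuming the console capture has been saved to a plain-text file (the filename is only an example):

    import re
    from collections import Counter

    # Matches lines such as:
    #   "Timed out RDMA with 10.0.10.53@o2ib7 (76): c: 8, oc: 0, rc: 8"
    TIMEOUT_RE = re.compile(r"Timed out RDMA with (\S+@o2ib\d+)")

    def count_rdma_timeouts(path):
        """Count how often each peer NID appears in an RDMA timeout message."""
        peers = Counter()
        with open(path, errors="replace") as log:
            for line in log:
                match = TIMEOUT_RE.search(line)
                if match:
                    peers[match.group(1)] += 1
        return peers

    if __name__ == "__main__":
        for nid, hits in count_rdma_timeouts("console-fir-io8-s1.log").most_common():
            print(f"{nid}\t{hits}")

The peers that dominate the count here (10.0.10.53, 10.0.10.201, 10.0.10.204 on o2ib7) are the ones worth examining on the fabric side.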
[Thu Dec 12 07:53:35 2019][ 432.695879] async_tx: api initialized (async) [Thu Dec 12 07:53:35 2019][ 432.702195] xor: automatically using best checksumming function: [Thu Dec 12 07:53:35 2019][ 432.717634] avx : 9596.000 MB/sec [Thu Dec 12 07:53:35 2019][ 432.749632] raid6: sse2x1 gen() 6097 MB/s [Thu Dec 12 07:53:35 2019][ 432.770633] raid6: sse2x2 gen() 11332 MB/s [Thu Dec 12 07:53:35 2019][ 432.791632] raid6: sse2x4 gen() 12957 MB/s [Thu Dec 12 07:53:35 2019][ 432.812635] raid6: avx2x1 gen() 14257 MB/s [Thu Dec 12 07:53:35 2019][ 432.833634] raid6: avx2x2 gen() 18847 MB/s [Thu Dec 12 07:53:35 2019][ 432.854634] raid6: avx2x4 gen() 18843 MB/s [Thu Dec 12 07:53:35 2019][ 432.858908] raid6: using algorithm avx2x2 gen() (18847 MB/s) [Thu Dec 12 07:53:35 2019][ 432.864567] raid6: using avx2x2 recovery algorithm [Thu Dec 12 07:53:35 2019][ 432.886250] md/raid:md8: not clean -- starting background reconstruction [Thu Dec 12 07:53:35 2019][ 432.893061] md/raid:md8: device dm-32 operational as raid disk 0 [Thu Dec 12 07:53:35 2019][ 432.899072] md/raid:md8: device dm-18 operational as raid disk 9 [Thu Dec 12 07:53:35 2019][ 432.905087] md/raid:md8: device dm-17 operational as raid disk 8 [Thu Dec 12 07:53:35 2019][ 432.911101] md/raid:md8: device dm-81 operational as raid disk 7 [Thu Dec 12 07:53:35 2019][ 432.917116] md/raid:md8: device dm-5 operational as raid disk 6 [Thu Dec 12 07:53:35 2019][ 432.923044] md/raid:md8: device dm-41 operational as raid disk 5 [Thu Dec 12 07:53:35 2019][ 432.929057] md/raid:md8: device dm-74 operational as raid disk 4 [Thu Dec 12 07:53:35 2019][ 432.935065] md/raid:md8: device dm-28 operational as raid disk 3 [Thu Dec 12 07:53:35 2019][ 432.941080] md/raid:md8: device dm-27 operational as raid disk 2 [Thu Dec 12 07:53:35 2019][ 432.947096] md/raid:md8: device dm-33 operational as raid disk 1 [Thu Dec 12 07:53:35 2019][ 432.954192] md/raid:md8: raid level 6 active with 10 out of 10 devices, algorithm 2 [Thu Dec 12 07:53:35 2019][ 432.983214] md8: detected capacity change from 0 to 64011422924800 [Thu Dec 12 07:53:35 2019][ 432.989479] md: resync of RAID array md8 [Thu Dec 12 07:53:35 2019][ 433.001692] md: md6 stopped. [Thu Dec 12 07:53:35 2019][ 433.013278] md/raid:md6: not clean -- starting background reconstruction [Thu Dec 12 07:53:35 2019][ 433.020108] md/raid:md6: device dm-39 operational as raid disk 0 [Thu Dec 12 07:53:35 2019][ 433.026120] md/raid:md6: device dm-14 operational as raid disk 9 [Thu Dec 12 07:53:35 2019][ 433.032137] md/raid:md6: device dm-12 operational as raid disk 8 [Thu Dec 12 07:53:35 2019][ 433.038154] md/raid:md6: device dm-77 operational as raid disk 7 [Thu Dec 12 07:53:35 2019][ 433.044165] md/raid:md6: device dm-1 operational as raid disk 6 [Thu Dec 12 07:53:35 2019][ 433.050095] md/raid:md6: device dm-79 operational as raid disk 5 [Thu Dec 12 07:53:35 2019][ 433.056112] md/raid:md6: device dm-86 operational as raid disk 4 [Thu Dec 12 07:53:35 2019][ 433.062131] md/raid:md6: device dm-13 operational as raid disk 3 [Thu Dec 12 07:53:35 2019][ 433.068146] md/raid:md6: device dm-3 operational as raid disk 2 [Thu Dec 12 07:53:35 2019][ 433.074073] md/raid:md6: device dm-30 operational as raid disk 1 [Thu Dec 12 07:53:35 2019][ 433.080864] md/raid:md6: raid level 6 active with 10 out of 10 devices, algorithm 2 [Thu Dec 12 07:53:35 2019][ 433.110036] md6: detected capacity change from 0 to 64011422924800 [Thu Dec 12 07:53:35 2019][ 433.116338] md: resync of RAID array md6 [Thu Dec 12 07:53:35 2019][ 433.135133] md: md10 stopped. 
[Thu Dec 12 07:53:35 2019][ 433.147669] md/raid:md10: device dm-36 operational as raid disk 0 [Thu Dec 12 07:53:35 2019][ 433.153789] md/raid:md10: device dm-22 operational as raid disk 9 [Thu Dec 12 07:53:35 2019][ 433.159900] md/raid:md10: device dm-21 operational as raid disk 8 [Thu Dec 12 07:53:35 2019][ 433.166008] md/raid:md10: device dm-9 operational as raid disk 7 [Thu Dec 12 07:53:35 2019][ 433.172028] md/raid:md10: device dm-8 operational as raid disk 6 [Thu Dec 12 07:53:35 2019][ 433.178049] md/raid:md10: device dm-0 operational as raid disk 5 [Thu Dec 12 07:53:35 2019][ 433.184065] md/raid:md10: device dm-82 operational as raid disk 4 [Thu Dec 12 07:53:35 2019][ 433.190170] md/raid:md10: device dm-89 operational as raid disk 3 [Thu Dec 12 07:53:35 2019][ 433.196281] md/raid:md10: device dm-85 operational as raid disk 2 [Thu Dec 12 07:53:35 2019][ 433.202392] md/raid:md10: device dm-37 operational as raid disk 1 [Thu Dec 12 07:53:35 2019][ 433.209820] md/raid:md10: raid level 6 active with 10 out of 10 devices, algorithm 2 [Thu Dec 12 07:53:35 2019][ 433.239523] md10: detected capacity change from 0 to 64011422924800 [Thu Dec 12 07:53:35 2019][ 433.256228] md: md2 stopped. [Thu Dec 12 07:53:35 2019][ 433.269197] md/raid:md2: not clean -- starting background reconstruction [Thu Dec 12 07:53:35 2019][ 433.276036] md/raid:md2: device dm-67 operational as raid disk 0 [Thu Dec 12 07:53:35 2019][ 433.282053] md/raid:md2: device dm-55 operational as raid disk 9 [Thu Dec 12 07:53:35 2019][ 433.288072] md/raid:md2: device dm-54 operational as raid disk 8 [Thu Dec 12 07:53:35 2019][ 433.294088] md/raid:md2: device dm-49 operational as raid disk 7 [Thu Dec 12 07:53:35 2019][ 433.300102] md/raid:md2: device dm-116 operational as raid disk 6 [Thu Dec 12 07:53:36 2019][ 433.306209] md/raid:md2: device dm-114 operational as raid disk 5 [Thu Dec 12 07:53:36 2019][ 433.312306] md/raid:md2: device dm-111 operational as raid disk 4 [Thu Dec 12 07:53:36 2019][ 433.318406] md/raid:md2: device dm-63 operational as raid disk 3 [Thu Dec 12 07:53:36 2019][ 433.324435] md/raid:md2: device dm-62 operational as raid disk 2 [Thu Dec 12 07:53:36 2019][ 433.330462] md/raid:md2: device dm-68 operational as raid disk 1 [Thu Dec 12 07:53:36 2019][ 433.337226] md/raid:md2: raid level 6 active with 10 out of 10 devices, algorithm 2 [Thu Dec 12 07:53:36 2019][ 433.367024] md2: detected capacity change from 0 to 64011422924800 [Thu Dec 12 07:53:36 2019][ 433.373350] md: resync of RAID array md2 [Thu Dec 12 07:53:36 2019][ 433.407315] md: md0 stopped. 
[Thu Dec 12 07:53:36 2019][ 433.422778] md/raid:md0: device dm-104 operational as raid disk 0 [Thu Dec 12 07:53:36 2019][ 433.428885] md/raid:md0: device dm-53 operational as raid disk 9 [Thu Dec 12 07:53:36 2019][ 433.434905] md/raid:md0: device dm-112 operational as raid disk 8 [Thu Dec 12 07:53:36 2019][ 433.441014] md/raid:md0: device dm-119 operational as raid disk 7 [Thu Dec 12 07:53:36 2019][ 433.447121] md/raid:md0: device dm-47 operational as raid disk 6 [Thu Dec 12 07:53:36 2019][ 433.453140] md/raid:md0: device dm-93 operational as raid disk 5 [Thu Dec 12 07:53:36 2019][ 433.459158] md/raid:md0: device dm-113 operational as raid disk 4 [Thu Dec 12 07:53:36 2019][ 433.465261] md/raid:md0: device dm-102 operational as raid disk 3 [Thu Dec 12 07:53:36 2019][ 433.471370] md/raid:md0: device dm-48 operational as raid disk 2 [Thu Dec 12 07:53:36 2019][ 433.477379] md/raid:md0: device dm-65 operational as raid disk 1 [Thu Dec 12 07:53:36 2019][ 433.484439] md/raid:md0: raid level 6 active with 10 out of 10 devices, algorithm 2 [Thu Dec 12 07:53:36 2019][ 433.513691] md0: detected capacity change from 0 to 64011422924800 [Thu Dec 12 07:53:36 2019][ 433.536630] md: md4 stopped. [Thu Dec 12 07:53:36 2019][ 433.555791] md/raid:md4: device dm-71 operational as raid disk 0 [Thu Dec 12 07:53:36 2019][ 433.561806] md/raid:md4: device dm-59 operational as raid disk 9 [Thu Dec 12 07:53:36 2019][ 433.567830] md/raid:md4: device dm-58 operational as raid disk 8 [Thu Dec 12 07:53:36 2019][ 433.573861] md/raid:md4: device dm-51 operational as raid disk 7 [Thu Dec 12 07:53:36 2019][ 433.579884] md/raid:md4: device dm-50 operational as raid disk 6 [Thu Dec 12 07:53:36 2019][ 433.585902] md/raid:md4: device dm-106 operational as raid disk 5 [Thu Dec 12 07:53:36 2019][ 433.592014] md/raid:md4: device dm-46 operational as raid disk 4 [Thu Dec 12 07:53:36 2019][ 433.598038] md/raid:md4: device dm-92 operational as raid disk 3 [Thu Dec 12 07:53:36 2019][ 433.604064] md/raid:md4: device dm-91 operational as raid disk 2 [Thu Dec 12 07:53:36 2019][ 433.610081] md/raid:md4: device dm-72 operational as raid disk 1 [Thu Dec 12 07:53:36 2019][ 433.616812] md/raid:md4: raid level 6 active with 10 out of 10 devices, algorithm 2 [Thu Dec 12 07:53:36 2019][ 433.646210] md4: detected capacity change from 0 to 64011422924800 [Thu Dec 12 07:53:36 2019][ 433.671439] LDISKFS-fs warning (device md8): ldiskfs_multi_mount_protect:321: MMP interval 42 higher than expected, please wait. [Thu Dec 12 07:53:36 2019][ 433.671439] [Thu Dec 12 07:53:36 2019][ 433.750433] LDISKFS-fs warning (device md6): ldiskfs_multi_mount_protect:321: MMP interval 42 higher than expected, please wait. [Thu Dec 12 07:53:36 2019][ 433.750433] [Thu Dec 12 07:53:36 2019][ 433.963398] LDISKFS-fs warning (device md10): ldiskfs_multi_mount_protect:321: MMP interval 42 higher than expected, please wait. [Thu Dec 12 07:53:36 2019][ 433.963398] [Thu Dec 12 07:53:37 2019][ 434.344854] LDISKFS-fs warning (device md2): ldiskfs_multi_mount_protect:321: MMP interval 42 higher than expected, please wait. [Thu Dec 12 07:53:37 2019][ 434.344854] [Thu Dec 12 07:53:37 2019][ 434.394289] LDISKFS-fs warning (device md0): ldiskfs_multi_mount_protect:321: MMP interval 42 higher than expected, please wait. [Thu Dec 12 07:53:37 2019][ 434.394289] [Thu Dec 12 07:53:37 2019][ 434.536309] md: md8: resync done. [Thu Dec 12 07:53:37 2019][ 434.593186] LDISKFS-fs warning (device md4): ldiskfs_multi_mount_protect:321: MMP interval 42 higher than expected, please wait. 
[Thu Dec 12 07:53:37 2019][ 434.593186] [Thu Dec 12 07:53:39 2019][ 436.310995] md: md6: resync done. [Thu Dec 12 07:53:45 2019][ 442.797007] md: md2: resync done. [Thu Dec 12 07:54:18 2019][ 475.756079] LDISKFS-fs (md8): file extents enabled, maximum tree depth=5 [Thu Dec 12 07:54:18 2019][ 475.835014] LDISKFS-fs (md6): file extents enabled, maximum tree depth=5 [Thu Dec 12 07:54:18 2019][ 476.063046] LDISKFS-fs (md10): file extents enabled, maximum tree depth=5 [Thu Dec 12 07:54:19 2019][ 476.467057] LDISKFS-fs (md2): file extents enabled, maximum tree depth=5 [Thu Dec 12 07:54:19 2019][ 476.515026] LDISKFS-fs (md0): file extents enabled, maximum tree depth=5 [Thu Dec 12 07:54:19 2019][ 476.695058] LDISKFS-fs (md4): file extents enabled, maximum tree depth=5 [Thu Dec 12 07:54:21 2019][ 478.375468] LDISKFS-fs (md2): recovery complete [Thu Dec 12 07:54:21 2019][ 478.382631] LDISKFS-fs (md2): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Thu Dec 12 07:54:21 2019][ 479.041873] LustreError: 137-5: fir-OST0058_UUID: not available for connect from 10.9.116.18@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Thu Dec 12 07:54:21 2019][ 479.271319] Lustre: fir-OST0056: Not available for connect from 10.9.115.2@o2ib4 (not set up) [Thu Dec 12 07:54:22 2019][ 479.319553] LDISKFS-fs (md6): recovery complete [Thu Dec 12 07:54:22 2019][ 479.335044] LDISKFS-fs (md6): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Thu Dec 12 07:54:22 2019][ 479.568964] LustreError: 137-5: fir-OST0054_UUID: not available for connect from 10.8.7.7@o2ib6 (no target). If you are running an HA pair check that the target is mounted on the other server. [Thu Dec 12 07:54:22 2019][ 479.586077] LustreError: Skipped 46 previous similar messages [Thu Dec 12 07:54:22 2019][ 479.592405] LDISKFS-fs (md0): recovery complete [Thu Dec 12 07:54:22 2019][ 479.616129] LDISKFS-fs (md0): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Thu Dec 12 07:54:22 2019][ 479.648058] LDISKFS-fs (md8): recovery complete [Thu Dec 12 07:54:22 2019][ 479.664024] LDISKFS-fs (md8): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Thu Dec 12 07:54:22 2019][ 479.815598] Lustre: fir-OST0056: Not available for connect from 10.9.107.10@o2ib4 (not set up) [Thu Dec 12 07:54:22 2019][ 479.824208] Lustre: Skipped 16 previous similar messages [Thu Dec 12 07:54:22 2019][ 479.880341] LDISKFS-fs (md10): recovery complete [Thu Dec 12 07:54:22 2019][ 479.895930] LDISKFS-fs (md10): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Thu Dec 12 07:54:23 2019][ 480.385562] Lustre: fir-OST0056: Imperative Recovery enabled, recovery window shrunk from 300-900 down to 150-900 [Thu Dec 12 07:54:23 2019][ 480.454637] Lustre: fir-OST0056: Will be in recovery for at least 2:30, or until 1286 clients reconnect [Thu Dec 12 07:54:23 2019][ 480.464069] Lustre: fir-OST0056: Connection restored to (at 10.9.104.23@o2ib4) [Thu Dec 12 07:54:23 2019][ 480.602348] LustreError: 137-5: fir-OST0054_UUID: not available for connect from 10.9.106.30@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. 
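Editor's note: each md array assembled above logs one "device dm-NN operational as raid disk K" line per member and then a "raid level 6 active with 10 out of 10 devices" summary, so the assembly can be cross-checked offline. A sketch under the same saved-log assumption (filename illustrative):

    import re
    from collections import defaultdict

    MEMBER_RE = re.compile(r"md/raid:(md\d+): device (\S+) operational as raid disk (\d+)")
    ACTIVE_RE = re.compile(r"md/raid:(md\d+): raid level 6 active with (\d+) out of (\d+) devices")

    def check_md_assembly(path):
        """Compare per-member lines against each array's 'raid level 6 active' summary."""
        members = defaultdict(dict)   # array name -> {slot: dm device}
        summary = {}                  # array name -> (active, total)
        with open(path, errors="replace") as log:
            for line in log:
                m = MEMBER_RE.search(line)
                if m:
                    members[m.group(1)][int(m.group(3))] = m.group(2)
                m = ACTIVE_RE.search(line)
                if m:
                    summary[m.group(1)] = (int(m.group(2)), int(m.group(3)))
        for array in sorted(summary):
            active, total = summary[array]
            seen = len(members[array])
            state = "OK" if active == total == seen else "CHECK"
            print(f"{array}: {active}/{total} active, {seen} member lines -> {state}")

    check_md_assembly("console-fir-io8-s1.log")

For this boot all six arrays (md0, md2, md4, md6, md8, md10) report 10 of 10 members, so the later recovery trouble is not a local RAID problem.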
[Thu Dec 12 07:54:23 2019][ 480.619719] LustreError: Skipped 119 previous similar messages [Thu Dec 12 07:54:23 2019][ 480.654149] LDISKFS-fs (md4): recovery complete [Thu Dec 12 07:54:23 2019][ 480.669190] LDISKFS-fs (md4): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Thu Dec 12 07:54:23 2019][ 481.049827] Lustre: fir-OST0056: Connection restored to (at 10.9.105.20@o2ib4) [Thu Dec 12 07:54:23 2019][ 481.053070] Lustre: fir-OST0054: Imperative Recovery enabled, recovery window shrunk from 300-900 down to 150-900 [Thu Dec 12 07:54:23 2019][ 481.053072] Lustre: Skipped 1 previous similar message [Thu Dec 12 07:54:23 2019][ 481.072596] Lustre: Skipped 35 previous similar messages [Thu Dec 12 07:54:23 2019][ 481.129834] Lustre: fir-OST0054: Will be in recovery for at least 2:30, or until 1298 clients reconnect [Thu Dec 12 07:54:23 2019][ 481.139232] Lustre: Skipped 1 previous similar message [Thu Dec 12 07:54:24 2019][ 481.306472] Lustre: fir-OST005c: Not available for connect from 10.8.21.24@o2ib6 (not set up) [Thu Dec 12 07:54:24 2019][ 481.315005] Lustre: Skipped 11 previous similar messages [Thu Dec 12 07:54:24 2019][ 482.071019] Lustre: fir-OST0058: Imperative Recovery enabled, recovery window shrunk from 300-900 down to 150-900 [Thu Dec 12 07:54:24 2019][ 482.081300] Lustre: Skipped 2 previous similar messages [Thu Dec 12 07:54:24 2019][ 482.090211] Lustre: fir-OST0058: Connection restored to (at 10.8.18.17@o2ib6) [Thu Dec 12 07:54:24 2019][ 482.097446] Lustre: Skipped 128 previous similar messages [Thu Dec 12 07:54:26 2019][ 483.706991] Lustre: fir-OST0056: Denying connection for new client e93aa557-365a-4 (at 10.8.22.24@o2ib6), waiting for 1286 known clients (91 recovered, 9 in progress, and 0 evicted) to recover in 2:26 [Thu Dec 12 07:54:26 2019][ 484.092519] Lustre: fir-OST0058: Connection restored to (at 10.8.21.29@o2ib6) [Thu Dec 12 07:54:26 2019][ 484.099751] Lustre: Skipped 377 previous similar messages [Thu Dec 12 07:54:28 2019][ 486.009541] LustreError: 137-5: fir-OST0055_UUID: not available for connect from 10.9.107.9@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Thu Dec 12 07:54:28 2019][ 486.026818] LustreError: Skipped 103 previous similar messages [Thu Dec 12 07:54:29 2019][ 486.711415] Lustre: fir-OST0056: Denying connection for new client 7e5bcac9-70c5-4 (at 10.8.23.26@o2ib6), waiting for 1286 known clients (165 recovered, 21 in progress, and 0 evicted) to recover in 2:23 [Thu Dec 12 07:54:30 2019][ 488.096716] Lustre: fir-OST005c: Connection restored to (at 10.9.102.49@o2ib4) [Thu Dec 12 07:54:30 2019][ 488.104033] Lustre: Skipped 705 previous similar messages [Thu Dec 12 07:54:36 2019][ 494.144743] Lustre: fir-OST0056: Denying connection for new client ba579da8-c6cc-4 (at 10.8.22.17@o2ib6), waiting for 1286 known clients (546 recovered, 44 in progress, and 0 evicted) to recover in 2:16 [Thu Dec 12 07:54:38 2019][ 496.123537] Lustre: fir-OST0054: Connection restored to (at 10.9.0.4@o2ib4) [Thu Dec 12 07:54:38 2019][ 496.130594] Lustre: Skipped 2730 previous similar messages [Thu Dec 12 07:54:41 2019][ 498.931984] LNetError: 63470:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.211@o2ib7 added to recovery queue. 
Health = 900 [Thu Dec 12 07:54:41 2019][ 498.944936] LNetError: 63470:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) Skipped 7 previous similar messages [Thu Dec 12 07:54:41 2019][ 499.005937] LustreError: 137-5: fir-OST0057_UUID: not available for connect from 10.9.107.9@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Thu Dec 12 07:54:51 2019][ 508.795280] Lustre: fir-OST0056: Denying connection for new client e93aa557-365a-4 (at 10.8.22.24@o2ib6), waiting for 1286 known clients (882 recovered, 76 in progress, and 0 evicted) to recover in 2:01 [Thu Dec 12 07:54:51 2019][ 508.813258] Lustre: Skipped 1 previous similar message [Thu Dec 12 07:54:54 2019][ 512.008643] LustreError: 137-5: fir-OST005d_UUID: not available for connect from 10.9.107.9@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Thu Dec 12 07:54:54 2019][ 512.025942] LustreError: Skipped 1 previous similar message [Thu Dec 12 07:54:54 2019][ 512.155075] Lustre: fir-OST005a: Connection restored to (at 10.8.19.2@o2ib6) [Thu Dec 12 07:54:54 2019][ 512.162215] Lustre: Skipped 1557 previous similar messages [Thu Dec 12 07:54:58 2019][ 516.120838] Lustre: fir-OST0056: Denying connection for new client 42a69631-cdb5-4 (at 10.8.22.22@o2ib6), waiting for 1286 known clients (896 recovered, 95 in progress, and 0 evicted) to recover in 1:54 [Thu Dec 12 07:54:58 2019][ 516.138861] Lustre: Skipped 1 previous similar message [Thu Dec 12 07:55:16 2019][ 533.317727] Lustre: fir-OST0056: Denying connection for new client 26dc458c-0aaf-4 (at 10.9.113.13@o2ib4), waiting for 1286 known clients (968 recovered, 121 in progress, and 0 evicted) to recover in 1:37 [Thu Dec 12 07:55:16 2019][ 533.335877] Lustre: Skipped 4 previous similar messages [Thu Dec 12 07:55:23 2019][ 541.012180] Lustre: fir-OST0054: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnected, waiting for 1298 clients in recovery for 1:30 [Thu Dec 12 07:55:23 2019][ 541.012233] LustreError: 137-5: fir-OST0057_UUID: not available for connect from 10.9.107.9@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Thu Dec 12 07:55:23 2019][ 541.012235] LustreError: Skipped 1 previous similar message [Thu Dec 12 07:55:26 2019][ 544.155714] Lustre: fir-OST0054: Connection restored to (at 10.9.108.69@o2ib4) [Thu Dec 12 07:55:26 2019][ 544.163043] Lustre: Skipped 1547 previous similar messages [Thu Dec 12 07:55:41 2019][ 558.405141] Lustre: fir-OST0056: Denying connection for new client 26dc458c-0aaf-4 (at 10.9.113.13@o2ib4), waiting for 1286 known clients (1063 recovered, 168 in progress, and 0 evicted) to recover in 1:12 [Thu Dec 12 07:55:41 2019][ 558.423381] Lustre: Skipped 8 previous similar messages [Thu Dec 12 07:55:43 2019][ 561.019196] Lustre: fir-OST0056: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnected, waiting for 1286 clients in recovery for 1:09 [Thu Dec 12 07:56:04 2019][ 582.021658] Lustre: fir-OST0058: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnected, waiting for 1298 clients in recovery for 0:50 [Thu Dec 12 07:56:04 2019][ 582.033914] Lustre: Skipped 1 previous similar message [Thu Dec 12 07:56:05 2019][ 583.021615] LustreError: 137-5: fir-OST005d_UUID: not available for connect from 10.9.107.9@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. 
[Thu Dec 12 07:56:05 2019][ 583.038904] LustreError: Skipped 2 previous similar messages [Thu Dec 12 07:56:13 2019][ 591.233902] Lustre: fir-OST0056: Denying connection for new client 6ee05204-fe40-4 (at 10.9.113.12@o2ib4), waiting for 1286 known clients (1075 recovered, 176 in progress, and 0 evicted) to recover in 0:39 [Thu Dec 12 07:56:13 2019][ 591.252153] Lustre: Skipped 15 previous similar messages [Thu Dec 12 07:56:46 2019][ 624.025680] Lustre: fir-OST005e: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnected, waiting for 1298 clients in recovery for 0:07 [Thu Dec 12 07:56:46 2019][ 624.037962] Lustre: fir-OST005e: Connection restored to (at 10.9.107.9@o2ib4) [Thu Dec 12 07:56:46 2019][ 624.045190] Lustre: Skipped 425 previous similar messages [Thu Dec 12 07:56:53 2019][ 630.455376] Lustre: fir-OST0056: recovery is timed out, evict stale exports [Thu Dec 12 07:56:53 2019][ 630.462583] Lustre: fir-OST0056: disconnecting 35 stale clients [Thu Dec 12 07:56:53 2019][ 630.468931] Lustre: 66504:0:(ldlm_lib.c:1765:extend_recovery_timer()) fir-OST0056: extended recovery timer reaching hard limit: 900, extend: 1 [Thu Dec 12 07:56:53 2019][ 631.130575] Lustre: fir-OST0054: recovery is timed out, evict stale exports [Thu Dec 12 07:56:53 2019][ 631.137547] Lustre: Skipped 1 previous similar message [Thu Dec 12 07:56:53 2019][ 631.142945] Lustre: fir-OST0054: disconnecting 33 stale clients [Thu Dec 12 07:56:53 2019][ 631.148901] Lustre: Skipped 1 previous similar message [Thu Dec 12 07:56:53 2019][ 631.154491] Lustre: 66562:0:(ldlm_lib.c:1765:extend_recovery_timer()) fir-OST0054: extended recovery timer reaching hard limit: 900, extend: 1 [Thu Dec 12 07:56:53 2019][ 631.167276] Lustre: 66562:0:(ldlm_lib.c:1765:extend_recovery_timer()) Skipped 532 previous similar messages [Thu Dec 12 07:56:59 2019][ 636.899648] LNetError: 63453:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Timed out tx: tx_queue, 0 seconds [Thu Dec 12 07:56:59 2019][ 636.909647] LNetError: 63453:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Skipped 5 previous similar messages [Thu Dec 12 07:56:59 2019][ 636.919816] LNetError: 63453:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Timed out RDMA with 10.0.10.209@o2ib7 (5): c: 0, oc: 0, rc: 8 [Thu Dec 12 07:56:59 2019][ 636.931799] LNetError: 63453:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Skipped 4 previous similar messages [Thu Dec 12 07:56:59 2019][ 636.941780] LNetError: 63453:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Thu Dec 12 07:57:06 2019][ 643.956688] LNetError: 63470:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.209@o2ib7 added to recovery queue. Health = 900 [Thu Dec 12 07:57:06 2019][ 643.969674] LustreError: 66504:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 0(16777216) req@ffff8dd742357050 x1649530969645792/t0(21478004433) o4->1c192c26-6a2d-8fff-8f45-c6fac242e547@10.9.104.15@o2ib4:313/0 lens 488/448 e 7 to 0 dl 1576166248 ref 1 fl Complete:/4/0 rc 0/0 [Thu Dec 12 07:57:06 2019][ 643.995991] Lustre: fir-OST0056: Bulk IO write error with 1c192c26-6a2d-8fff-8f45-c6fac242e547 (at 10.9.104.15@o2ib4), client will retry: rc = -110 [Thu Dec 12 07:57:14 2019][ 652.032404] LustreError: 137-5: fir-OST0059_UUID: not available for connect from 10.9.107.9@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. 
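Editor's note: while recovery runs, the "Denying connection for new client ... waiting for N known clients (X recovered, Y in progress, and Z evicted) to recover in M:SS" lines above act as a progress counter per target. A small sketch to pull those tuples out of a saved copy of this log (filename illustrative):

    import re

    # e.g. "fir-OST0056: Denying connection for new client ..., waiting for 1286 known
    #       clients (896 recovered, 95 in progress, and 0 evicted) to recover in 1:54"
    PROGRESS_RE = re.compile(
        r"(fir-OST[0-9a-f]+): Denying connection .* waiting for (\d+) known clients "
        r"\((\d+) recovered, (\d+) in progress, and (\d+) evicted\) to recover in (\d+:\d+)"
    )

    def recovery_progress(path):
        """Yield (target, known, recovered, in_progress, evicted, time_left) per message."""
        with open(path, errors="replace") as log:
            for line in log:
                m = PROGRESS_RE.search(line)
                if m:
                    target, known, rec, prog, evic, left = m.groups()
                    yield target, int(known), int(rec), int(prog), int(evic), left

    for row in recovery_progress("console-fir-io8-s1.log"):
        print(row)

Plotted over time, these tuples show recovery on fir-OST0056 stalling around 1076 of 1286 clients, which is what eventually leads to the manual aborts below.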
[Thu Dec 12 07:57:14 2019][ 652.049702] LustreError: Skipped 4 previous similar messages [Thu Dec 12 07:57:21 2019][ 658.757614] Lustre: fir-OST0056: Denying connection for new client 26dc458c-0aaf-4 (at 10.9.113.13@o2ib4), waiting for 1286 known clients (1076 recovered, 175 in progress, and 35 evicted) to recover in 12:01 [Thu Dec 12 07:57:21 2019][ 658.776051] Lustre: Skipped 31 previous similar messages [Thu Dec 12 07:57:28 2019][ 666.031763] Lustre: fir-OST005e: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnected, waiting for 1298 clients in recovery for 11:55 [Thu Dec 12 07:58:10 2019][ 708.053355] Lustre: fir-OST005e: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnected, waiting for 1298 clients in recovery for 11:13 [Thu Dec 12 07:59:07 2019][ 765.055251] Lustre: fir-OST005e: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnected, waiting for 1298 clients in recovery for 10:16 [Thu Dec 12 07:59:07 2019][ 765.067623] Lustre: fir-OST005e: Connection restored to (at 10.9.107.9@o2ib4) [Thu Dec 12 07:59:07 2019][ 765.074851] Lustre: Skipped 2 previous similar messages [Thu Dec 12 07:59:30 2019][ 787.780112] Lustre: fir-OST0056: Denying connection for new client 7e5bcac9-70c5-4 (at 10.8.23.26@o2ib6), waiting for 1286 known clients (1076 recovered, 175 in progress, and 35 evicted) to recover in 9:52 [Thu Dec 12 07:59:30 2019][ 787.798352] Lustre: Skipped 61 previous similar messages [Thu Dec 12 08:00:03 2019][ 821.065861] Lustre: fir-OST005e: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnected, waiting for 1298 clients in recovery for 9:20 [Thu Dec 12 08:00:03 2019][ 821.065886] LustreError: 137-5: fir-OST005d_UUID: not available for connect from 10.9.107.9@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Thu Dec 12 08:00:03 2019][ 821.065888] LustreError: Skipped 10 previous similar messages [Thu Dec 12 08:00:03 2019][ 821.101847] Lustre: 66616:0:(ldlm_lib.c:1765:extend_recovery_timer()) fir-OST005e: extended recovery timer reaching hard limit: 900, extend: 1 [Thu Dec 12 08:00:03 2019][ 821.114679] Lustre: 66616:0:(ldlm_lib.c:1765:extend_recovery_timer()) Skipped 1557 previous similar messages [Thu Dec 12 08:04:04 2019][ 1061.839183] Lustre: fir-OST0056: Denying connection for new client 2f9d93e2-ec9b-6d68-8edf-66d4312ccfe3 (at 10.9.116.8@o2ib4), waiting for 1286 known clients (1076 recovered, 175 in progress, and 35 evicted) to recover in 5:18 [Thu Dec 12 08:04:04 2019][ 1061.859237] Lustre: Skipped 123 previous similar messages [Thu Dec 12 08:04:31 2019][ 1089.195831] LustreError: 137-5: fir-OST0055_UUID: not available for connect from 10.9.107.9@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Thu Dec 12 08:04:31 2019][ 1089.213116] LustreError: Skipped 16 previous similar messages [Thu Dec 12 08:07:42 2019][ 1280.241334] Lustre: fir-OST005c: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnected, waiting for 1298 clients in recovery for 1:41 [Thu Dec 12 08:07:43 2019][ 1280.253627] Lustre: fir-OST005c: Connection restored to (at 10.9.107.9@o2ib4) [Thu Dec 12 08:07:43 2019][ 1280.260856] Lustre: Skipped 1 previous similar message [Thu Dec 12 08:09:26 2019][ 1384.262693] Lustre: fir-OST005c: Recovery already passed deadline 0:02. If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST005c abort_recovery. [Thu Dec 12 08:09:31 2019][ 1389.272356] Lustre: fir-OST005a: Recovery already passed deadline 0:08. 
If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST005a abort_recovery. [Thu Dec 12 08:09:39 2019][ 1396.268916] Lustre: fir-OST005a: Recovery already passed deadline 0:15. If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST005a abort_recovery. [Thu Dec 12 08:09:41 2019][ 1398.286105] Lustre: fir-OST0056: Recovery already passed deadline 0:17. If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST0056 abort_recovery. [Thu Dec 12 08:09:45 2019][ 1403.077403] Lustre: fir-OST005a: Recovery already passed deadline 0:22. If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST005a abort_recovery. [Thu Dec 12 08:09:55 2019][ 1412.286369] Lustre: fir-OST0058: Recovery already passed deadline 0:30. If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST0058 abort_recovery. [Thu Dec 12 08:09:55 2019][ 1412.302442] Lustre: Skipped 2 previous similar messages [Thu Dec 12 08:10:15 2019][ 1433.084183] Lustre: fir-OST005c: Recovery already passed deadline 0:51. If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST005c abort_recovery. [Thu Dec 12 08:10:15 2019][ 1433.100255] Lustre: Skipped 1 previous similar message [Thu Dec 12 08:10:32 2019] [Thu Dec 12 08:10:32 2019]CentOS Linux 7 (Core) [Thu Dec 12 08:10:32 2019]Kernel 3.10.0-957.27.2.el7_lustre.pl2.x86_64 on an x86_64 [Thu Dec 12 08:10:32 2019] [Thu Dec 12 08:10:32 2019]fir-io8-s1 login: [ 1475.298604] Lustre: fir-OST005c: Recovery already passed deadline 1:33. If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST005c abort_recovery.
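Editor's note: the messages above spell out the manual escape hatch, and the "ofd_iocontrol ... aborting recovery" lines that follow show it being applied to each stuck target. A sketch of scripting that same command, assuming it is run as root on this OSS and that the target list is simply the set of OSTs named above; aborting recovery evicts whatever clients never reconnected:

    import subprocess

    # Targets that reported "Recovery already passed deadline" on this server; adjust as needed.
    STUCK_TARGETS = ["fir-OST0054", "fir-OST0056", "fir-OST0058",
                     "fir-OST005a", "fir-OST005c", "fir-OST005e"]

    def abort_recovery(target):
        """Run the command quoted in the console message: lctl --device <target> abort_recovery."""
        subprocess.run(["lctl", "--device", target, "abort_recovery"], check=True)

    if __name__ == "__main__":
        for target in STUCK_TARGETS:
            abort_recovery(target)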
[Thu Dec 12 08:10:58 2019][ 1475.314675] Lustre: Skipped 4 previous similar messages [-- root@localhost detached -- Thu Dec 12 08:10:59 2019] [Thu Dec 12 08:11:28 2019][ 1506.231104] LustreError: 68347:0:(ofd_obd.c:1331:ofd_iocontrol()) fir-OST0054: aborting recovery [Thu Dec 12 08:11:28 2019][ 1506.239940] LustreError: 68347:0:(ldlm_lib.c:2605:target_stop_recovery_thread()) fir-OST0054: Aborting recovery [Thu Dec 12 08:11:28 2019][ 1506.250072] Lustre: 66562:0:(ldlm_lib.c:2056:target_recovery_overseer()) recovery is aborted, evict exports in recovery [Thu Dec 12 08:11:28 2019][ 1506.261110] Lustre: fir-OST0054: disconnecting 1 stale clients [Thu Dec 12 08:11:28 2019][ 1506.266949] Lustre: Skipped 3 previous similar messages [Thu Dec 12 08:11:29 2019][ 1506.425827] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3071596 to 0x1800000400:3071617 [Thu Dec 12 08:11:29 2019][ 1506.429783] LustreError: 68395:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 6034fa4f-5b0b-b211-f21f-00bf5fe7933a claims 847872 GRANT, real grant 0 [Thu Dec 12 08:11:29 2019][ 1506.433027] Lustre: 66562:0:(ldlm_lib.c:2550:target_recovery_thread()) too long recovery - read logs [Thu Dec 12 08:11:29 2019][ 1506.433069] Lustre: fir-OST0054: Recovery over after 17:06, of 1298 clients 1264 recovered and 34 were evicted. [Thu Dec 12 08:11:29 2019][ 1506.433072] LustreError: dumping log to /tmp/lustre-log.1576167089.66562 [Thu Dec 12 08:11:29 2019][ 1506.476294] Lustre: fir-OST0054: deleting orphan objects from 0x0:27458083 to 0x0:27458177 [Thu Dec 12 08:11:29 2019][ 1506.476649] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785705 to 0x1800000401:11785729 [Thu Dec 12 08:11:29 2019][ 1506.476669] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1118148 to 0x1800000402:1118209 [Thu Dec 12 08:11:30 2019][ 1507.751285] LustreError: 68755:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli dc9dbc94-e0be-4 claims 208896 GRANT, real grant 0 [Thu Dec 12 08:11:30 2019][ 1507.763492] LustreError: 68755:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 36 previous similar messages [Thu Dec 12 08:11:32 2019][ 1509.959005] LustreError: 68683:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 7eab513e-39ec-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 08:11:35 2019][ 1512.735181] LustreError: 68744:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli f7b0d72c-696c-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 08:11:35 2019][ 1512.747268] LustreError: 68744:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 8 previous similar messages [Thu Dec 12 08:11:39 2019][ 1517.004935] LustreError: 68587:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 08:11:39 2019][ 1517.017080] LustreError: 68587:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 4 previous similar messages [Thu Dec 12 08:11:40 2019][ 1517.721942] Lustre: 66660:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576167089/real 1576167089] req@ffff8dc7e1e34800 x1652729573300464/t0(0) o104->fir-OST0054@10.9.114.7@o2ib4:15/16 lens 296/224 e 0 to 1 dl 1576167100 ref 1 fl Rpc:X/0/ffffffff rc 0/-1 [Thu Dec 12 08:11:47 2019][ 1525.026682] LustreError: 68944:0:(ofd_obd.c:1331:ofd_iocontrol()) fir-OST0056: aborting recovery [Thu Dec 12 08:11:47 2019][ 1525.035483] LustreError: 68944:0:(ldlm_lib.c:2605:target_stop_recovery_thread()) fir-OST0056: Aborting recovery [Thu Dec 12 08:11:47 2019][ 1525.045595] Lustre: 
66504:0:(ldlm_lib.c:2056:target_recovery_overseer()) recovery is aborted, evict exports in recovery [Thu Dec 12 08:11:47 2019][ 1525.056625] Lustre: fir-OST0056: disconnecting 175 stale clients [Thu Dec 12 08:11:47 2019][ 1525.064555] LustreError: 66504:0:(ldlm_lib.c:1634:abort_lock_replay_queue()) @@@ aborted: req@ffff8db7fc14b850 x1649559505428464/t0(0) o101->fe16bc49-4bbe-dc30-a069-fee92bf3e984@10.9.104.23@o2ib4:430/0 lens 328/0 e 12 to 0 dl 1576167120 ref 1 fl Complete:/40/ffffffff rc 0/-1 [Thu Dec 12 08:11:47 2019][ 1525.135666] Lustre: 66504:0:(ldlm_lib.c:2046:target_recovery_overseer()) fir-OST0056 recovery is aborted by hard timeout [Thu Dec 12 08:11:48 2019][ 1525.305009] Lustre: 66504:0:(ldlm_lib.c:2550:target_recovery_thread()) too long recovery - read logs [Thu Dec 12 08:11:48 2019][ 1525.314190] Lustre: fir-OST0056: Recovery over after 17:24, of 1286 clients 1076 recovered and 210 were evicted. [Thu Dec 12 08:11:48 2019][ 1525.314193] LustreError: dumping log to /tmp/lustre-log.1576167107.66504 [Thu Dec 12 08:11:51 2019][ 1528.748998] Lustre: 66660:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576167100/real 1576167100] req@ffff8dc7e1e34800 x1652729573300464/t0(0) o104->fir-OST0054@10.9.114.7@o2ib4:15/16 lens 296/224 e 0 to 1 dl 1576167111 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Thu Dec 12 08:11:58 2019][ 1536.171753] LustreError: 69003:0:(ofd_obd.c:1331:ofd_iocontrol()) fir-OST0058: aborting recovery [Thu Dec 12 08:11:58 2019][ 1536.180551] LustreError: 69003:0:(ldlm_lib.c:2605:target_stop_recovery_thread()) fir-OST0058: Aborting recovery [Thu Dec 12 08:11:58 2019][ 1536.190677] Lustre: 66643:0:(ldlm_lib.c:2056:target_recovery_overseer()) recovery is aborted, evict exports in recovery [Thu Dec 12 08:11:58 2019][ 1536.201491] Lustre: 66643:0:(ldlm_lib.c:2056:target_recovery_overseer()) Skipped 1 previous similar message [Thu Dec 12 08:11:58 2019][ 1536.211473] Lustre: fir-OST0058: disconnecting 2 stale clients [Thu Dec 12 08:11:59 2019][ 1536.363145] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3086758 to 0x1900000402:3086785 [Thu Dec 12 08:11:59 2019][ 1536.367912] LustreError: 68753:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0058: cli 8e3bf475-0833-510a-ec9f-d8743c1caa75 claims 28672 GRANT, real grant 0 [Thu Dec 12 08:11:59 2019][ 1536.369253] Lustre: 66643:0:(ldlm_lib.c:2550:target_recovery_thread()) too long recovery - read logs [Thu Dec 12 08:11:59 2019][ 1536.369294] Lustre: fir-OST0058: Recovery over after 17:35, of 1298 clients 1259 recovered and 39 were evicted. 
[Thu Dec 12 08:11:59 2019][ 1536.369296] LustreError: dumping log to /tmp/lustre-log.1576167119.66643 [Thu Dec 12 08:11:59 2019][ 1536.412735] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797795 to 0x1900000401:11797825 [Thu Dec 12 08:11:59 2019][ 1536.412810] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1125126 to 0x1900000400:1125153 [Thu Dec 12 08:11:59 2019][ 1536.424125] Lustre: fir-OST0058: deleting orphan objects from 0x0:27507140 to 0x0:27507201 [Thu Dec 12 08:12:02 2019][ 1539.776052] Lustre: 66660:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576167111/real 1576167111] req@ffff8dc7e1e34800 x1652729573300464/t0(0) o104->fir-OST0054@10.9.114.7@o2ib4:15/16 lens 296/224 e 0 to 1 dl 1576167122 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Thu Dec 12 08:12:08 2019][ 1545.311193] Lustre: fir-OST0058: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnecting [Thu Dec 12 08:12:13 2019][ 1550.732345] LustreError: 69140:0:(ofd_obd.c:1331:ofd_iocontrol()) fir-OST005a: aborting recovery [Thu Dec 12 08:12:13 2019][ 1550.741142] LustreError: 69140:0:(ldlm_lib.c:2605:target_stop_recovery_thread()) fir-OST005a: Aborting recovery [Thu Dec 12 08:12:13 2019][ 1550.751264] Lustre: 66534:0:(ldlm_lib.c:2056:target_recovery_overseer()) recovery is aborted, evict exports in recovery [Thu Dec 12 08:12:13 2019][ 1550.762316] Lustre: fir-OST005a: disconnecting 3 stale clients [Thu Dec 12 08:12:13 2019][ 1550.803102] Lustre: 66660:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576167122/real 1576167122] req@ffff8dc7e1e34800 x1652729573300464/t0(0) o104->fir-OST0054@10.9.114.7@o2ib4:15/16 lens 296/224 e 0 to 1 dl 1576167133 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Thu Dec 12 08:12:13 2019][ 1550.983469] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842499 to 0x1980000401:11842529 [Thu Dec 12 08:12:13 2019][ 1550.983522] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3090791 to 0x1980000402:3090817 [Thu Dec 12 08:12:13 2019][ 1550.989009] Lustre: 66534:0:(ldlm_lib.c:2550:target_recovery_thread()) too long recovery - read logs [Thu Dec 12 08:12:13 2019][ 1550.989052] Lustre: fir-OST005a: Recovery over after 17:50, of 1298 clients 1262 recovered and 36 were evicted. 
[Thu Dec 12 08:12:13 2019][ 1550.989055] LustreError: dumping log to /tmp/lustre-log.1576167133.66534 [Thu Dec 12 08:12:13 2019][ 1551.004148] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126630 to 0x1980000400:1126689 [Thu Dec 12 08:12:13 2019][ 1551.004227] Lustre: fir-OST005a: deleting orphan objects from 0x0:27562533 to 0x0:27562561 [Thu Dec 12 08:12:16 2019][ 1554.206040] LustreError: 68554:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005a: cli 9e1b197d-ddc8-abff-2dfd-496f87079e39 claims 28672 GRANT, real grant 0 [Thu Dec 12 08:12:16 2019][ 1554.220274] LustreError: 68554:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 122 previous similar messages [Thu Dec 12 08:12:24 2019][ 1561.830158] Lustre: 66660:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576167133/real 1576167133] req@ffff8dc7e1e34800 x1652729573300464/t0(0) o104->fir-OST0054@10.9.114.7@o2ib4:15/16 lens 296/224 e 0 to 1 dl 1576167144 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Thu Dec 12 08:12:25 2019][ 1562.892611] LustreError: 69150:0:(ofd_obd.c:1331:ofd_iocontrol()) fir-OST005c: aborting recovery [Thu Dec 12 08:12:25 2019][ 1562.901401] LustreError: 69150:0:(ldlm_lib.c:2605:target_stop_recovery_thread()) fir-OST005c: Aborting recovery [Thu Dec 12 08:12:25 2019][ 1562.911500] Lustre: 66589:0:(ldlm_lib.c:2056:target_recovery_overseer()) recovery is aborted, evict exports in recovery [Thu Dec 12 08:12:25 2019][ 1562.922548] Lustre: fir-OST005c: disconnecting 2 stale clients [Thu Dec 12 08:12:25 2019][ 1563.156924] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3049796 to 0x1a00000401:3049825 [Thu Dec 12 08:12:25 2019][ 1563.162591] Lustre: 66589:0:(ldlm_lib.c:2550:target_recovery_thread()) too long recovery - read logs [Thu Dec 12 08:12:25 2019][ 1563.162654] LustreError: dumping log to /tmp/lustre-log.1576167145.66589 [Thu Dec 12 08:12:25 2019][ 1563.162656] Lustre: fir-OST005c: Recovery over after 18:01, of 1298 clients 1262 recovered and 36 were evicted. 
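Editor's note: each aborted target then prints a one-line summary ("Recovery over after H:MM, of N clients X recovered and Y were evicted"), which is the number worth reporting afterwards. A sketch that collects those summaries from a saved copy of this log (filename illustrative):

    import re

    SUMMARY_RE = re.compile(
        r"(fir-OST[0-9a-f]+): Recovery over after (\d+:\d+), "
        r"of (\d+) clients (\d+) recovered and (\d+) were evicted"
    )

    def recovery_summaries(path):
        """Yield (target, duration, total_clients, recovered, evicted) per summary line."""
        with open(path, errors="replace") as log:
            for line in log:
                m = SUMMARY_RE.search(line)
                if m:
                    target, duration, total, recovered, evicted = m.groups()
                    yield target, duration, int(total), int(recovered), int(evicted)

    for target, duration, total, recovered, evicted in recovery_summaries("console-fir-io8-s1.log"):
        print(f"{target}: done in {duration}, {recovered}/{total} recovered, {evicted} evicted")

For this boot the summaries range from 17:06 to 18:24 of recovery per target, with 34 to 210 clients evicted.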
[Thu Dec 12 08:12:26 2019][ 1563.274080] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000400:11653444 to 0x1a00000400:11653505 [Thu Dec 12 08:12:26 2019][ 1563.274267] Lustre: fir-OST005c: deleting orphan objects from 0x0:27192804 to 0x0:27192833 [Thu Dec 12 08:12:35 2019][ 1572.857213] Lustre: 66660:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576167144/real 1576167144] req@ffff8dc7e1e34800 x1652729573300464/t0(0) o104->fir-OST0054@10.9.114.7@o2ib4:15/16 lens 296/224 e 0 to 1 dl 1576167155 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Thu Dec 12 08:12:35 2019][ 1572.884596] Lustre: 66660:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 1 previous similar message [Thu Dec 12 08:12:37 2019][ 1574.312949] Lustre: fir-OST005c: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnecting [Thu Dec 12 08:12:43 2019][ 1580.313596] Lustre: fir-OST005a: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnecting [Thu Dec 12 08:12:48 2019][ 1585.525007] LustreError: 69210:0:(ofd_obd.c:1331:ofd_iocontrol()) fir-OST005e: aborting recovery [Thu Dec 12 08:12:48 2019][ 1585.533797] LustreError: 69210:0:(ldlm_lib.c:2605:target_stop_recovery_thread()) fir-OST005e: Aborting recovery [Thu Dec 12 08:12:48 2019][ 1585.543930] Lustre: 66616:0:(ldlm_lib.c:2056:target_recovery_overseer()) recovery is aborted, evict exports in recovery [Thu Dec 12 08:12:48 2019][ 1585.554959] Lustre: fir-OST005e: disconnecting 10 stale clients [Thu Dec 12 08:12:48 2019][ 1585.728153] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3083078 to 0x1a80000402:3083105 [Thu Dec 12 08:12:48 2019][ 1585.735139] Lustre: 66616:0:(ldlm_lib.c:2550:target_recovery_thread()) too long recovery - read logs [Thu Dec 12 08:12:48 2019][ 1585.735174] Lustre: fir-OST005e: Recovery over after 18:24, of 1298 clients 1251 recovered and 47 were evicted. [Thu Dec 12 08:12:48 2019][ 1585.735177] LustreError: dumping log to /tmp/lustre-log.1576167168.66616 [Thu Dec 12 08:12:48 2019][ 1585.739441] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122883 to 0x1a80000401:1122913 [Thu Dec 12 08:12:48 2019][ 1585.739531] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800899 to 0x1a80000400:11800929 [Thu Dec 12 08:12:48 2019][ 1585.739671] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467811 to 0x0:27467841 [Thu Dec 12 08:12:48 2019][ 1586.230699] LustreError: 68442:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005e: cli 622221be-23ea-af5e-7f4e-5cb8fd169641 claims 28672 GRANT, real grant 0 [Thu Dec 12 08:12:48 2019][ 1586.244623] LustreError: 68442:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 102 previous similar messages [Thu Dec 12 08:12:53 2019][ 1590.904297] LNetError: 63453:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Timed out tx: tx_queue, 0 seconds [Thu Dec 12 08:12:53 2019][ 1590.914296] LNetError: 63453:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Skipped 1 previous similar message [Thu Dec 12 08:12:53 2019][ 1590.924378] LNetError: 63453:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Timed out RDMA with 10.0.10.212@o2ib7 (0): c: 0, oc: 1, rc: 7 [Thu Dec 12 08:12:53 2019][ 1590.936369] LNetError: 63453:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Skipped 1 previous similar message [Thu Dec 12 08:12:53 2019][ 1590.946508] LNetError: 63453:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. 
Health = 900 [Thu Dec 12 08:12:53 2019][ 1590.958604] LNetError: 63453:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.212@o2ib7 added to recovery queue. Health = 900 [Thu Dec 12 08:12:53 2019][ 1590.971631] LNetError: 63453:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) Skipped 3 previous similar messages [Thu Dec 12 08:12:55 2019][ 1592.738320] Lustre: 69232:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for sent delay: [sent 1576167168/real 0] req@ffff8db7f9390480 x1652729573456224/t0(0) o104->fir-OST0054@10.9.108.36@o2ib4:15/16 lens 296/224 e 0 to 1 dl 1576167175 ref 2 fl Rpc:X/0/ffffffff rc 0/-1 [Thu Dec 12 08:12:55 2019][ 1592.764879] Lustre: 69232:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 3 previous similar messages [Thu Dec 12 08:12:58 2019][ 1595.318138] Lustre: fir-OST0056: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnecting [Thu Dec 12 08:13:01 2019][ 1598.983304] LNetError: 63470:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.212@o2ib7 added to recovery queue. Health = 900 [Thu Dec 12 08:13:01 2019][ 1598.983320] LustreError: 68646:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 3994043(5042619) req@ffff8dc7d6aa8850 x1648697232366288/t0(0) o4->b2911fad-dd6c-1241-cacc-189af7d29c2b@10.9.109.11@o2ib4:323/0 lens 488/448 e 0 to 0 dl 1576167768 ref 1 fl Interpret:/2/0 rc 0/0 [Thu Dec 12 08:13:01 2019][ 1598.983457] Lustre: fir-OST005e: Bulk IO write error with b2911fad-dd6c-1241-cacc-189af7d29c2b (at 10.9.109.11@o2ib4), client will retry: rc = -110 [Thu Dec 12 08:13:05 2019][ 1602.316455] Lustre: fir-OST005e: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnecting [Thu Dec 12 08:13:19 2019][ 1616.318297] Lustre: fir-OST005c: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnecting [Thu Dec 12 08:13:20 2019][ 1617.318252] LustreError: 137-5: fir-OST0057_UUID: not available for connect from 10.9.107.9@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. 
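Editor's note: one client NID (10.9.107.9@o2ib4) keeps reappearing in "Client ... reconnecting" / "reconnected" messages across several targets, which is easier to spot when those messages are tallied per NID. A sketch under the same saved-log assumption:

    import re
    from collections import Counter

    # Matches "Client <uuid> (at <nid>) reconnected, ..." and "... reconnecting"
    RECONNECT_RE = re.compile(r"Client \S+ \(at (\S+@o2ib\d+)\) reconnect")

    def reconnect_counts(path):
        """Count reconnect messages per client NID."""
        counts = Counter()
        with open(path, errors="replace") as log:
            for line in log:
                m = RECONNECT_RE.search(line)
                if m:
                    counts[m.group(1)] += 1
        return counts

    for nid, hits in reconnect_counts("console-fir-io8-s1.log").most_common(10):
        print(f"{nid}\t{hits}")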
[Thu Dec 12 08:13:20 2019][ 1617.335697] LustreError: Skipped 34 previous similar messages [Thu Dec 12 08:13:30 2019][ 1627.894456] Lustre: 66660:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576167199/real 1576167199] req@ffff8dc7e1e34800 x1652729573300464/t0(0) o104->fir-OST0054@10.9.114.7@o2ib4:15/16 lens 296/224 e 0 to 1 dl 1576167210 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Thu Dec 12 08:13:30 2019][ 1627.921728] Lustre: 66660:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 6 previous similar messages [Thu Dec 12 08:13:40 2019][ 1637.318613] Lustre: fir-OST0056: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnecting [Thu Dec 12 08:13:40 2019][ 1637.326887] Lustre: Skipped 1 previous similar message [Thu Dec 12 08:14:03 2019][ 1660.931582] LustreError: 66660:0:(ldlm_lockd.c:681:ldlm_handle_ast_error()) ### client (nid 10.9.114.7@o2ib4) failed to reply to blocking AST (req@ffff8dc7e1e34800 x1652729573300464 status 0 rc -110), evict it ns: filter-fir-OST0054_UUID lock: ffff8dc77febd340/0xbbe356b78b1761c9 lrc: 4/0,0 mode: PW/PW res: [0x1800000401:0xb3d37e:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x60000400010020 nid: 10.9.114.7@o2ib4 remote: 0xd13bb5bd1ce002 expref: 5 pid: 66562 timeout: 1799 lvb_type: 0 [Thu Dec 12 08:14:03 2019][ 1660.978013] LustreError: 138-a: fir-OST0054: A client on nid 10.9.114.7@o2ib4 was evicted due to a lock blocking callback time out: rc -110 [Thu Dec 12 08:14:03 2019][ 1660.990573] LustreError: 66407:0:(ldlm_lockd.c:256:expired_lock_main()) ### lock callback timer expired after 154s: evicting client at 10.9.114.7@o2ib4 ns: filter-fir-OST0054_UUID lock: ffff8dc77febd340/0xbbe356b78b1761c9 lrc: 3/0,0 mode: PW/PW res: [0x1800000401:0xb3d37e:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x60000400010020 nid: 10.9.114.7@o2ib4 remote: 0xd13bb5bd1ce002 expref: 6 pid: 66562 timeout: 0 lvb_type: 0 [Thu Dec 12 08:14:16 2019][ 1674.089876] LustreError: 68853:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0056: cli cff3451c-e996-8c2a-4369-5e4a58059d6b claims 1581056 GRANT, real grant 8192 [Thu Dec 12 08:14:16 2019][ 1674.104209] LustreError: 68853:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 25 previous similar messages [Thu Dec 12 08:14:17 2019][ 1674.353278] Lustre: fir-OST005a: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnecting [Thu Dec 12 08:14:17 2019][ 1674.361550] Lustre: Skipped 2 previous similar messages [Thu Dec 12 08:14:36 2019][ 1694.012733] Lustre: 68707:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576167265/real 1576167265] req@ffff8dc7e1e33a80 x1652729573400080/t0(0) o104->fir-OST005a@10.9.114.7@o2ib4:15/16 lens 296/224 e 0 to 1 dl 1576167276 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Thu Dec 12 08:14:36 2019][ 1694.039987] Lustre: 68707:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 9 previous similar messages [Thu Dec 12 08:14:47 2019][ 1705.049804] LustreError: 68707:0:(ldlm_lockd.c:681:ldlm_handle_ast_error()) ### client (nid 10.9.114.7@o2ib4) failed to reply to blocking AST (req@ffff8dc7e1e33a80 x1652729573400080 status 0 rc -110), evict it ns: filter-fir-OST005a_UUID lock: ffff8db79020de80/0xbbe356b78b17580d lrc: 4/0,0 mode: PW/PW res: [0x1980000401:0xb4b1c5:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x60000400010020 nid: 10.9.114.7@o2ib4 remote: 0xd13bb5bd1d56f4 expref: 6 pid: 66534 timeout: 1843 
lvb_type: 0 [Thu Dec 12 08:14:47 2019][ 1705.096229] LustreError: 138-a: fir-OST005a: A client on nid 10.9.114.7@o2ib4 was evicted due to a lock blocking callback time out: rc -110 [Thu Dec 12 08:14:47 2019][ 1705.108842] LustreError: 66407:0:(ldlm_lockd.c:256:expired_lock_main()) ### lock callback timer expired after 154s: evicting client at 10.9.114.7@o2ib4 ns: filter-fir-OST005a_UUID lock: ffff8db79020de80/0xbbe356b78b17580d lrc: 3/0,0 mode: PW/PW res: [0x1980000401:0xb4b1c5:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x60000400010020 nid: 10.9.114.7@o2ib4 remote: 0xd13bb5bd1d56f4 expref: 7 pid: 66534 timeout: 0 lvb_type: 0 [Thu Dec 12 08:15:16 2019][ 1733.336853] Lustre: fir-OST0054: haven't heard from client 2ad23320-be63-65d3-0dd9-5fdcdbfd3811 (at 10.9.109.48@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd727040400, cur 1576167316 expire 1576167166 last 1576167089 [Thu Dec 12 08:15:34 2019][ 1751.324905] Lustre: fir-OST0056: haven't heard from client c5c1b41e-f3c0-c589-3194-4020139c27d9 (at 10.8.17.26@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd73d1ac000, cur 1576167334 expire 1576167184 last 1576167107 [Thu Dec 12 08:15:34 2019][ 1751.346625] Lustre: Skipped 46 previous similar messages [Thu Dec 12 08:15:42 2019][ 1759.655044] LustreError: 66407:0:(ldlm_lockd.c:256:expired_lock_main()) ### lock callback timer expired after 99s: evicting client at 10.9.105.22@o2ib4 ns: filter-fir-OST0056_UUID lock: ffff8da75631e9c0/0xbbe356b78b186620 lrc: 3/0,0 mode: PW/PW res: [0x1a1439f:0x0:0x0].0x0 rrc: 12 type: EXT [0->48204947455] (req 24102674432->24102678527) flags: 0x60000400020020 nid: 10.9.105.22@o2ib4 remote: 0xb1804e7d1e42c6f6 expref: 58 pid: 68368 timeout: 1759 lvb_type: 0 [Thu Dec 12 08:15:45 2019][ 1762.382459] Lustre: fir-OST0058: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnecting [Thu Dec 12 08:15:45 2019][ 1762.390730] Lustre: Skipped 5 previous similar messages [Thu Dec 12 08:15:46 2019][ 1763.329477] Lustre: fir-OST0058: haven't heard from client 84451726-da5e-16d9-ee63-43bfe8a9f835 (at 10.8.27.27@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd72522a000, cur 1576167346 expire 1576167196 last 1576167119 [Thu Dec 12 08:15:46 2019][ 1763.351188] Lustre: Skipped 37 previous similar messages [Thu Dec 12 08:16:00 2019][ 1777.326151] Lustre: fir-OST005a: haven't heard from client 2ad23320-be63-65d3-0dd9-5fdcdbfd3811 (at 10.9.109.48@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd72862dc00, cur 1576167360 expire 1576167210 last 1576167133 [Thu Dec 12 08:16:00 2019][ 1777.347938] Lustre: Skipped 46 previous similar messages [Thu Dec 12 08:16:12 2019][ 1789.337913] Lustre: fir-OST005c: haven't heard from client 2ad23320-be63-65d3-0dd9-5fdcdbfd3811 (at 10.9.109.48@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd72677b800, cur 1576167372 expire 1576167222 last 1576167145 [Thu Dec 12 08:16:12 2019][ 1789.359754] Lustre: Skipped 48 previous similar messages [Thu Dec 12 08:16:16 2019][ 1793.397307] Lustre: fir-OST005a: Connection restored to (at 10.9.107.9@o2ib4) [Thu Dec 12 08:16:16 2019][ 1793.404554] Lustre: Skipped 230 previous similar messages [Thu Dec 12 08:16:32 2019][ 1809.324320] Lustre: fir-OST0054: haven't heard from client 75b6516e-d912-63bd-698a-8f68fc05bdf0 (at 10.9.110.15@o2ib4) in 217 seconds. I think it's dead, and I am evicting it. 
exp ffff8dd72763e800, cur 1576167392 expire 1576167242 last 1576167175 [Thu Dec 12 08:16:32 2019][ 1809.346142] Lustre: Skipped 50 previous similar messages [Thu Dec 12 08:16:50 2019][ 1827.330280] Lustre: fir-OST0056: haven't heard from client 47a03cfb-676d-980c-969a-709c7ed465f5 (at 10.9.115.12@o2ib4) in 226 seconds. I think it's dead, and I am evicting it. exp ffff8dd7414d6000, cur 1576167410 expire 1576167260 last 1576167184 [Thu Dec 12 08:16:50 2019][ 1827.352130] Lustre: Skipped 46 previous similar messages [Thu Dec 12 08:17:02 2019][ 1839.565959] LustreError: 68771:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli b3fcac94-6a72-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 08:17:02 2019][ 1839.578102] LustreError: 68771:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 12 previous similar messages [Thu Dec 12 08:17:28 2019][ 1865.329569] Lustre: fir-OST005c: haven't heard from client 88641f57-d703-33e9-d11c-f77135f2b59e (at 10.9.116.11@o2ib4) in 211 seconds. I think it's dead, and I am evicting it. exp ffff8dd7261ac000, cur 1576167448 expire 1576167298 last 1576167237 [Thu Dec 12 08:17:28 2019][ 1865.351409] Lustre: Skipped 8 previous similar messages [Thu Dec 12 08:17:55 2019][ 1892.416040] Lustre: fir-OST0056: Client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) reconnecting [Thu Dec 12 08:17:55 2019][ 1892.424311] Lustre: Skipped 10 previous similar messages [Thu Dec 12 08:18:37 2019][ 1934.319082] Lustre: fir-OST005a: haven't heard from client 115600dd-e8d0-f526-46d3-125e5e3a170e (at 10.9.117.8@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd7284c4c00, cur 1576167517 expire 1576167367 last 1576167290 [Thu Dec 12 08:18:37 2019][ 1934.340821] Lustre: Skipped 18 previous similar messages [Thu Dec 12 08:21:15 2019][ 2092.906536] LNetError: 63453:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Timed out tx: tx_queue, 0 seconds [Thu Dec 12 08:21:15 2019][ 2092.916539] LNetError: 63453:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Timed out RDMA with 10.0.10.54@o2ib7 (31): c: 0, oc: 0, rc: 8 [Thu Dec 12 08:21:15 2019][ 2092.928802] LNetError: 63462:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.54@o2ib7 added to recovery queue. Health = 900 [Thu Dec 12 08:21:15 2019][ 2092.941699] LNetError: 63453:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Thu Dec 12 08:21:16 2019][ 2093.906544] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.54@o2ib7: 1 seconds [Thu Dec 12 08:21:20 2019][ 2098.079581] LNetError: 67995:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Thu Dec 12 08:21:20 2019][ 2098.091620] LNetError: 67995:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 1 previous similar message [Thu Dec 12 08:21:23 2019][ 2101.151752] LustreError: 68583:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 08:21:23 2019][ 2101.163919] LustreError: 68583:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 12 previous similar messages [Thu Dec 12 08:21:26 2019][ 2103.906589] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.54@o2ib7: 0 seconds [Thu Dec 12 08:21:26 2019][ 2103.916679] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 2 previous similar messages [Thu Dec 12 08:21:26 2019][ 2103.925992] LNetError: 63453:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. 
Health = 900 [Thu Dec 12 08:21:26 2019][ 2103.938039] Lustre: 63554:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has failed due to network error: [sent 1576167670/real 1576167686] req@ffff8dd4c579ec00 x1652729574901872/t0(0) o400->fir-MDT0003-lwp-OST005c@10.0.10.54@o2ib7:12/10 lens 224/224 e 0 to 1 dl 1576168426 ref 1 fl Rpc:eXN/0/ffffffff rc 0/-1 [Thu Dec 12 08:21:26 2019][ 2103.938050] Lustre: fir-MDT0003-lwp-OST0054: Connection to fir-MDT0003 (at 10.0.10.54@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Thu Dec 12 08:21:26 2019][ 2103.982785] Lustre: 63554:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 2 previous similar messages [Thu Dec 12 08:21:28 2019][ 2105.906593] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.54@o2ib7: 1 seconds [Thu Dec 12 08:21:28 2019][ 2105.916677] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 1 previous similar message [Thu Dec 12 08:21:32 2019][ 2110.079644] LNetError: 67995:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Thu Dec 12 08:21:32 2019][ 2110.091636] LNetError: 67995:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 1 previous similar message [Thu Dec 12 08:21:32 2019][ 2110.101585] Lustre: fir-MDT0003-lwp-OST0056: Connection to fir-MDT0003 (at 10.0.10.54@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Thu Dec 12 08:21:32 2019][ 2110.117570] Lustre: Skipped 4 previous similar messages [Thu Dec 12 08:21:33 2019][ 2110.906614] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.54@o2ib7: 0 seconds [Thu Dec 12 08:21:33 2019][ 2110.916698] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 1 previous similar message [Thu Dec 12 08:21:43 2019][ 2120.906657] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.54@o2ib7: 0 seconds [Thu Dec 12 08:21:43 2019][ 2120.916758] LNetError: 63453:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Thu Dec 12 08:21:43 2019][ 2120.928819] LNetError: 63453:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 1 previous similar message [Thu Dec 12 08:21:55 2019][ 2132.906718] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.54@o2ib7: 0 seconds [Thu Dec 12 08:22:01 2019][ 2139.087772] LNetError: 67995:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Thu Dec 12 08:22:01 2019][ 2139.099765] LNetError: 67995:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 4 previous similar messages [Thu Dec 12 08:22:36 2019][ 2174.079936] LNetError: 67995:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Thu Dec 12 08:22:36 2019][ 2174.091930] LNetError: 67995:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 2 previous similar messages [Thu Dec 12 08:22:44 2019][ 2181.614084] Lustre: fir-OST005a: haven't heard from client d833ee08-9e03-4 (at 10.9.107.9@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff8dd728b30800, cur 1576167764 expire 1576167614 last 1576167537 [Thu Dec 12 08:22:44 2019][ 2181.633991] Lustre: Skipped 40 previous similar messages [Thu Dec 12 08:23:11 2019][ 2208.907068] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.54@o2ib7: 2 seconds [Thu Dec 12 08:24:56 2019][ 2313.907577] LNetError: 63453:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Timed out tx: tx_queue, 0 seconds [Thu Dec 12 08:24:56 2019][ 2313.917575] LNetError: 63453:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Timed out RDMA with 10.0.10.54@o2ib7 (6): c: 0, oc: 0, rc: 8 [Thu Dec 12 08:24:56 2019][ 2313.929735] LNetError: 63453:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Thu Dec 12 08:24:56 2019][ 2313.941735] LNetError: 63453:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 12 previous similar messages [Thu Dec 12 08:25:52 2019][ 2369.907841] LNetError: 63453:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Timed out tx: tx_queue, 0 seconds [Thu Dec 12 08:25:52 2019][ 2369.917844] LNetError: 63453:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Timed out RDMA with 10.0.10.54@o2ib7 (21): c: 0, oc: 0, rc: 8 [Thu Dec 12 08:26:29 2019][ 2406.411555] Lustre: fir-OST0054: Connection restored to 6f9f9911-d6c4-4 (at 10.9.116.19@o2ib4) [Thu Dec 12 08:26:29 2019][ 2406.420174] Lustre: Skipped 59 previous similar messages [Thu Dec 12 08:26:37 2019][ 2414.908063] LNetError: 63453:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Timed out tx: tx_queue, 0 seconds [Thu Dec 12 08:26:37 2019][ 2414.918067] LNetError: 63453:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Timed out RDMA with 10.0.10.54@o2ib7 (6): c: 0, oc: 0, rc: 8 [Thu Dec 12 08:28:47 2019][ 2544.322536] Lustre: fir-OST0056: haven't heard from client a1acf167-afde-6f5a-879d-1a7c0814f282 (at 10.9.117.21@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd7414fb400, cur 1576168127 expire 1576167977 last 1576167900 [Thu Dec 12 08:28:47 2019][ 2544.344329] Lustre: Skipped 11 previous similar messages [Thu Dec 12 08:30:02 2019][ 2620.018227] LustreError: 68603:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 08:30:02 2019][ 2620.030314] LustreError: 68603:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 24 previous similar messages [Thu Dec 12 08:30:17 2019][ 2634.738304] LustreError: 167-0: fir-MDT0003-lwp-OST005a: This client was evicted by fir-MDT0003; in progress operations using this service will fail. 
[Thu Dec 12 08:31:13 2019][ 2690.423834] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785733 to 0x1800000401:11785761 [Thu Dec 12 08:31:13 2019][ 2690.423850] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000400:11653444 to 0x1a00000400:11653537 [Thu Dec 12 08:31:13 2019][ 2690.423868] Lustre: fir-OST0056: deleting orphan objects from 0x1880000400:11777327 to 0x1880000400:11777377 [Thu Dec 12 08:31:13 2019][ 2690.423875] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797830 to 0x1900000401:11797857 [Thu Dec 12 08:31:13 2019][ 2690.424074] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800931 to 0x1a80000400:11800961 [Thu Dec 12 08:31:13 2019][ 2690.424267] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842531 to 0x1980000401:11842561 [Thu Dec 12 08:38:00 2019][ 3097.753372] Lustre: fir-OST0054: Connection restored to (at 10.9.104.42@o2ib4) [Thu Dec 12 08:38:00 2019][ 3097.760689] Lustre: Skipped 88 previous similar messages [Thu Dec 12 08:38:51 2019][ 3148.325929] Lustre: fir-OST0056: haven't heard from client dec5062c-f101-0dc5-128b-72e40bd60a5a (at 10.9.112.12@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dc7f1b58000, cur 1576168731 expire 1576168581 last 1576168504 [Thu Dec 12 08:38:51 2019][ 3148.347748] Lustre: Skipped 23 previous similar messages [Thu Dec 12 08:40:15 2019][ 3233.109051] LustreError: 68467:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 08:40:15 2019][ 3233.121136] LustreError: 68467:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 43 previous similar messages [Thu Dec 12 08:48:10 2019][ 3707.758810] Lustre: fir-OST0054: Connection restored to (at 10.9.116.14@o2ib4) [Thu Dec 12 08:48:10 2019][ 3707.766124] Lustre: Skipped 134 previous similar messages [Thu Dec 12 08:50:24 2019][ 3842.183742] LustreError: 68603:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 08:50:24 2019][ 3842.195826] LustreError: 68603:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 36 previous similar messages [Thu Dec 12 08:57:51 2019][ 4288.358555] Lustre: fir-OST005e: haven't heard from client 295209bb-0224-d868-bd7c-cd75c3b19a1c (at 10.8.18.20@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff8dd725e7e000, cur 1576169871 expire 1576169721 last 1576169644 [Thu Dec 12 08:57:51 2019][ 4288.380264] Lustre: Skipped 17 previous similar messages [Thu Dec 12 09:00:05 2019][ 4423.266024] Lustre: fir-OST0054: Connection restored to 022acf30-b33d-ab48-4fa4-ec70c96ae93e (at 10.9.114.2@o2ib4) [Thu Dec 12 09:00:05 2019][ 4423.276399] Lustre: Skipped 193 previous similar messages [Thu Dec 12 09:00:41 2019][ 4458.842467] LustreError: 68763:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 32768 GRANT, real grant 0 [Thu Dec 12 09:00:41 2019][ 4458.854549] LustreError: 68763:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 32 previous similar messages [Thu Dec 12 09:10:44 2019][ 5062.143917] LustreError: 68484:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 09:10:44 2019][ 5062.155994] LustreError: 68484:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 28 previous similar messages [Thu Dec 12 09:11:45 2019][ 5122.445826] Lustre: fir-OST0054: Connection restored to (at 10.8.27.10@o2ib6) [Thu Dec 12 09:11:45 2019][ 5122.453071] Lustre: Skipped 22 previous similar messages [Thu Dec 12 09:21:26 2019][ 5703.647055] LustreError: 68486:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli b3fcac94-6a72-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 09:21:26 2019][ 5703.659140] LustreError: 68486:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 41 previous similar messages [Thu Dec 12 09:22:44 2019][ 5781.337646] Lustre: fir-OST005e: haven't heard from client 75167b5d-e2d7-d704-ea07-95d8feb377a6 (at 10.9.102.1@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd725accc00, cur 1576171364 expire 1576171214 last 1576171137 [Thu Dec 12 09:22:44 2019][ 5781.359346] Lustre: Skipped 5 previous similar messages [Thu Dec 12 09:24:00 2019][ 5857.342953] Lustre: fir-OST0056: haven't heard from client 135543df-9fa8-fe17-ef67-a6cd12881d1d (at 10.8.7.19@o2ib6) in 151 seconds. I think it's dead, and I am evicting it. exp ffff8dd73d5e9800, cur 1576171440 expire 1576171290 last 1576171289 [Thu Dec 12 09:24:00 2019][ 5857.364620] Lustre: Skipped 5 previous similar messages [Thu Dec 12 09:29:51 2019][ 6208.366592] Lustre: fir-OST005e: haven't heard from client 3fa61b7b-3364-0c3e-efb9-55ce1343c799 (at 10.8.23.34@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd725acbc00, cur 1576171791 expire 1576171641 last 1576171564 [Thu Dec 12 09:29:51 2019][ 6208.388361] Lustre: Skipped 5 previous similar messages [Thu Dec 12 09:31:44 2019][ 6322.066335] LustreError: 68582:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 09:31:44 2019][ 6322.078658] LustreError: 68582:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 36 previous similar messages [Thu Dec 12 09:32:12 2019][ 6349.514748] Lustre: fir-OST0054: Connection restored to 295209bb-0224-d868-bd7c-cd75c3b19a1c (at 10.8.18.20@o2ib6) [Thu Dec 12 09:32:12 2019][ 6349.525096] Lustre: Skipped 18 previous similar messages [Thu Dec 12 09:40:11 2019][ 6828.343094] Lustre: fir-OST005c: haven't heard from client ee8a8d10-65c2-ae96-bc67-9f6bae32e110 (at 10.8.18.18@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff8dd7265e0400, cur 1576172411 expire 1576172261 last 1576172184 [Thu Dec 12 09:40:11 2019][ 6828.364803] Lustre: Skipped 5 previous similar messages [Thu Dec 12 09:42:36 2019][ 6973.876832] LustreError: 68705:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 09:42:36 2019][ 6973.888912] LustreError: 68705:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 37 previous similar messages [Thu Dec 12 09:51:09 2019][ 7486.343746] Lustre: fir-OST005a: haven't heard from client ef78dfe0-80b9-391e-81c2-9236655a36fe (at 10.9.103.59@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8da7e8097800, cur 1576173069 expire 1576172919 last 1576172842 [Thu Dec 12 09:51:09 2019][ 7486.365538] Lustre: Skipped 5 previous similar messages [Thu Dec 12 09:52:48 2019][ 7586.055385] LustreError: 68628:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 09:52:48 2019][ 7586.067501] LustreError: 68628:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 21 previous similar messages [Thu Dec 12 09:54:14 2019][ 7671.648505] Lustre: fir-OST0054: Connection restored to (at 10.9.104.7@o2ib4) [Thu Dec 12 09:54:14 2019][ 7671.655739] Lustre: Skipped 5 previous similar messages [Thu Dec 12 09:55:05 2019][ 7722.721066] Lustre: fir-OST0054: Connection restored to (at 10.9.106.13@o2ib4) [Thu Dec 12 09:55:05 2019][ 7722.728398] Lustre: Skipped 11 previous similar messages [Thu Dec 12 10:01:26 2019][ 8104.191623] Lustre: fir-OST0054: Connection restored to (at 10.9.102.1@o2ib4) [Thu Dec 12 10:01:26 2019][ 8104.198850] Lustre: Skipped 5 previous similar messages [Thu Dec 12 10:02:57 2019][ 8194.784648] LustreError: 68476:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0056: cli cff3451c-e996-8c2a-4369-5e4a58059d6b claims 1576960 GRANT, real grant 0 [Thu Dec 12 10:02:57 2019][ 8194.798728] LustreError: 68476:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 19 previous similar messages [Thu Dec 12 10:04:56 2019][ 8313.887057] Lustre: fir-OST0054: Connection restored to (at 10.8.23.34@o2ib6) [Thu Dec 12 10:04:56 2019][ 8313.894334] Lustre: Skipped 5 previous similar messages [Thu Dec 12 10:06:33 2019][ 8410.351186] Lustre: fir-OST005e: haven't heard from client 227d7a25-50be-a469-9b6d-83846499cd76 (at 10.8.27.14@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff8dd725b7bc00, cur 1576173993 expire 1576173843 last 1576173766 [Thu Dec 12 10:06:33 2019][ 8410.372928] Lustre: Skipped 5 previous similar messages [Thu Dec 12 10:06:37 2019][ 8414.343539] Lustre: fir-OST0054: Connection restored to (at 10.8.20.19@o2ib6) [Thu Dec 12 10:06:37 2019][ 8414.350772] Lustre: Skipped 11 previous similar messages [Thu Dec 12 10:13:34 2019][ 8832.069773] LustreError: 68792:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0058: cli 262affea-6f08-6e05-c2e8-d629eeb38f83 claims 28672 GRANT, real grant 0 [Thu Dec 12 10:13:34 2019][ 8832.083694] LustreError: 68792:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 24 previous similar messages [Thu Dec 12 10:16:13 2019][ 8991.048011] Lustre: fir-OST0054: Connection restored to (at 10.8.18.18@o2ib6) [Thu Dec 12 10:16:13 2019][ 8991.055244] Lustre: Skipped 5 previous similar messages [Thu Dec 12 10:23:44 2019][ 9441.967162] LustreError: 68598:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 10:23:44 2019][ 9441.979285] LustreError: 68598:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 32 previous similar messages [Thu Dec 12 10:26:37 2019][ 9614.360455] Lustre: fir-OST0054: haven't heard from client 4ddd9e11-580e-5fd9-690c-d09be6f90077 (at 10.9.101.42@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd727269400, cur 1576175197 expire 1576175047 last 1576174970 [Thu Dec 12 10:26:37 2019][ 9614.382274] Lustre: Skipped 17 previous similar messages [Thu Dec 12 10:34:06 2019][10063.548504] LustreError: 68628:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0056: cli cff3451c-e996-8c2a-4369-5e4a58059d6b claims 10018816 GRANT, real grant 0 [Thu Dec 12 10:34:06 2019][10063.562669] LustreError: 68628:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 37 previous similar messages [Thu Dec 12 10:41:27 2019][10505.253115] Lustre: fir-OST0054: Connection restored to (at 10.8.27.14@o2ib6) [Thu Dec 12 10:41:27 2019][10505.260347] Lustre: Skipped 11 previous similar messages [Thu Dec 12 10:44:37 2019][10694.981078] LustreError: 68486:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 10:44:37 2019][10694.993162] LustreError: 68486:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 25 previous similar messages [Thu Dec 12 10:49:39 2019][10997.255989] Lustre: fir-OST0054: Connection restored to (at 10.9.116.14@o2ib4) [Thu Dec 12 10:49:39 2019][10997.263302] Lustre: Skipped 10 previous similar messages [Thu Dec 12 10:50:23 2019][11040.365999] Lustre: fir-OST0058: haven't heard from client 5d110741-f52f-a556-c0fd-775bc1eebbda (at 10.9.105.33@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff8dd72469c800, cur 1576176623 expire 1576176473 last 1576176396 [Thu Dec 12 10:50:23 2019][11040.387787] Lustre: Skipped 59 previous similar messages [Thu Dec 12 10:51:09 2019][11086.777130] Lustre: fir-OST0054: Connection restored to (at 10.8.26.36@o2ib6) [Thu Dec 12 10:51:09 2019][11086.784366] Lustre: Skipped 11 previous similar messages [Thu Dec 12 10:53:45 2019][11242.746304] Lustre: fir-OST0056: Connection restored to ec6f0728-3f9f-b5fd-43eb-c07bc3da43b2 (at 10.9.117.12@o2ib4) [Thu Dec 12 10:53:45 2019][11242.746305] Lustre: fir-OST0054: Connection restored to ec6f0728-3f9f-b5fd-43eb-c07bc3da43b2 (at 10.9.117.12@o2ib4) [Thu Dec 12 10:53:45 2019][11242.746308] Lustre: Skipped 6 previous similar messages [Thu Dec 12 10:53:45 2019][11242.772437] Lustre: Skipped 4 previous similar messages [Thu Dec 12 10:54:46 2019][11303.416049] LustreError: 68539:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 10:54:46 2019][11303.428133] LustreError: 68539:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 15 previous similar messages [Thu Dec 12 11:05:00 2019][11917.498631] LustreError: 68441:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 11:05:00 2019][11917.510715] LustreError: 68441:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 16 previous similar messages [Thu Dec 12 11:13:47 2019][12444.372777] Lustre: fir-OST0058: haven't heard from client e8e18d90-dcac-7195-a7b7-bbaf10be70ce (at 10.9.103.52@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd725100400, cur 1576178027 expire 1576177877 last 1576177800 [Thu Dec 12 11:13:47 2019][12444.394570] Lustre: Skipped 5 previous similar messages [Thu Dec 12 11:15:45 2019][12562.589399] LustreError: 68569:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 11:15:45 2019][12562.601487] LustreError: 68569:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 17 previous similar messages [Thu Dec 12 11:20:55 2019][12873.038311] Lustre: fir-OST0054: Connection restored to (at 10.9.105.33@o2ib4) [Thu Dec 12 11:20:55 2019][12873.045637] Lustre: Skipped 40 previous similar messages [Thu Dec 12 11:25:49 2019][13167.024338] LustreError: 68476:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 11:25:49 2019][13167.036436] LustreError: 68476:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 16 previous similar messages [Thu Dec 12 11:36:32 2019][13809.411126] LustreError: 68569:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 11:36:32 2019][13809.423207] LustreError: 68569:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 14 previous similar messages [Thu Dec 12 11:40:56 2019][14074.156301] Lustre: fir-OST0054: Connection restored to (at 10.9.103.52@o2ib4) [Thu Dec 12 11:40:56 2019][14074.163635] Lustre: Skipped 5 previous similar messages [Thu Dec 12 11:43:38 2019][14235.376015] Lustre: fir-OST0058: haven't heard from client c804f06b-97c0-205b-aa77-e2392ade35bd (at 10.8.7.7@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff8dd7252a6400, cur 1576179818 expire 1576179668 last 1576179591 [Thu Dec 12 11:43:38 2019][14235.397572] Lustre: Skipped 5 previous similar messages [Thu Dec 12 11:46:41 2019][14419.093265] LustreError: 68485:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli b3fcac94-6a72-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 11:46:41 2019][14419.105398] LustreError: 68485:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 16 previous similar messages [Thu Dec 12 11:46:59 2019][14436.375865] Lustre: fir-OST0056: haven't heard from client 1d444526-0c94-9229-34be-9d214c0c6bbd (at 10.9.101.46@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8da7f336b000, cur 1576180019 expire 1576179869 last 1576179792 [Thu Dec 12 11:46:59 2019][14436.397715] Lustre: Skipped 5 previous similar messages [Thu Dec 12 11:51:07 2019][14684.376134] Lustre: fir-OST005c: haven't heard from client 7126efc2-9676-1db9-94d0-ae09c1520697 (at 10.9.101.26@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd726152c00, cur 1576180267 expire 1576180117 last 1576180040 [Thu Dec 12 11:51:07 2019][14684.397945] Lustre: Skipped 5 previous similar messages [Thu Dec 12 11:57:36 2019][15073.800909] LustreError: 68745:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 11:57:36 2019][15073.812996] LustreError: 68745:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 22 previous similar messages [Thu Dec 12 12:05:19 2019][15536.379457] Lustre: fir-OST0056: haven't heard from client 02eb8135-4034-bcb2-8df8-77d00506e76a (at 10.8.7.15@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd73d5ed000, cur 1576181119 expire 1576180969 last 1576180892 [Thu Dec 12 12:05:19 2019][15536.401103] Lustre: Skipped 5 previous similar messages [Thu Dec 12 12:08:07 2019][15704.652511] LustreError: 68461:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 12:08:07 2019][15704.664617] LustreError: 68461:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 14 previous similar messages [Thu Dec 12 12:12:50 2019][15987.370247] Lustre: fir-OST0054: Connection restored to (at 10.8.7.7@o2ib6) [Thu Dec 12 12:12:50 2019][15987.377305] Lustre: Skipped 5 previous similar messages [Thu Dec 12 12:18:44 2019][16341.902916] LustreError: 68471:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 12:18:44 2019][16341.914999] LustreError: 68471:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 20 previous similar messages [Thu Dec 12 12:24:43 2019][16700.983383] Lustre: fir-OST0054: Connection restored to 1d444526-0c94-9229-34be-9d214c0c6bbd (at 10.9.101.46@o2ib4) [Thu Dec 12 12:24:43 2019][16700.993820] Lustre: Skipped 5 previous similar messages [Thu Dec 12 12:29:00 2019][16957.649677] LustreError: 68720:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 12:29:00 2019][16957.661767] LustreError: 68720:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 15 previous similar messages [Thu Dec 12 12:30:31 2019][17048.925797] Lustre: fir-OST0054: Connection restored to 7126efc2-9676-1db9-94d0-ae09c1520697 (at 10.9.101.26@o2ib4) [Thu Dec 12 12:30:31 2019][17048.936231] Lustre: Skipped 5 previous similar messages [Thu Dec 12 12:34:07 2019][17264.966179] Lustre: fir-OST0054: Connection restored to (at 10.8.7.15@o2ib6) [Thu Dec 12 12:34:07 2019][17264.973321] Lustre: Skipped 5 
previous similar messages [Thu Dec 12 12:34:58 2019][17316.225739] Lustre: fir-OST0054: Connection restored to (at 10.8.22.31@o2ib6) [Thu Dec 12 12:34:58 2019][17316.232963] Lustre: Skipped 5 previous similar messages [Thu Dec 12 12:37:05 2019][17442.974940] LNetError: 63453:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Timed out tx: tx_queue, 1 seconds [Thu Dec 12 12:37:05 2019][17442.984945] LNetError: 63453:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Timed out RDMA with 10.0.10.52@o2ib7 (32): c: 0, oc: 0, rc: 8 [Thu Dec 12 12:37:05 2019][17442.997144] LNetError: 63461:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.52@o2ib7 added to recovery queue. Health = 900 [Thu Dec 12 12:37:05 2019][17443.010045] LNetError: 63453:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Thu Dec 12 12:37:05 2019][17443.022068] LNetError: 63453:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 2 previous similar messages [Thu Dec 12 12:37:06 2019][17443.974949] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.52@o2ib7: 2 seconds [Thu Dec 12 12:37:06 2019][17443.985034] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 12 previous similar messages [Thu Dec 12 12:37:12 2019][17449.974978] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.52@o2ib7: 1 seconds [Thu Dec 12 12:37:12 2019][17449.985138] Lustre: 63554:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has failed due to network error: [sent 1576183019/real 1576183032] req@ffff8dd211e44c80 x1652729606699888/t0(0) o400->fir-MDT0001-lwp-OST005a@10.0.10.52@o2ib7:12/10 lens 224/224 e 0 to 1 dl 1576183775 ref 1 fl Rpc:eXN/0/ffffffff rc 0/-1 [Thu Dec 12 12:37:12 2019][17450.013894] Lustre: 63554:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 4 previous similar messages [Thu Dec 12 12:37:12 2019][17450.023649] Lustre: fir-MDT0001-lwp-OST005a: Connection to fir-MDT0001 (at 10.0.10.52@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Thu Dec 12 12:37:17 2019][17454.708058] Lustre: fir-MDT0001-lwp-OST0054: Connection to fir-MDT0001 (at 10.0.10.52@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Thu Dec 12 12:37:17 2019][17454.724045] Lustre: Skipped 1 previous similar message [Thu Dec 12 12:37:22 2019][17459.732041] LNetError: 73902:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Thu Dec 12 12:37:22 2019][17459.744063] LNetError: 73902:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 4 previous similar messages [Thu Dec 12 12:37:22 2019][17459.754134] Lustre: fir-MDT0001-lwp-OST0056: Connection to fir-MDT0001 (at 10.0.10.52@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Thu Dec 12 12:37:22 2019][17459.770136] Lustre: Skipped 1 previous similar message [Thu Dec 12 12:37:23 2019][17460.975027] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.52@o2ib7: 0 seconds [Thu Dec 12 12:37:23 2019][17460.985112] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 2 previous similar messages [Thu Dec 12 12:37:56 2019][17493.700196] LNetError: 73902:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. 
Health = 900 [Thu Dec 12 12:37:56 2019][17493.712243] LNetError: 73902:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 6 previous similar messages [Thu Dec 12 12:39:03 2019][17560.975472] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.52@o2ib7: 0 seconds [Thu Dec 12 12:39:03 2019][17560.985554] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 4 previous similar messages [Thu Dec 12 12:39:03 2019][17560.994867] LNetError: 63453:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Thu Dec 12 12:39:03 2019][17561.006860] LNetError: 63453:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 5 previous similar messages [Thu Dec 12 12:39:36 2019][17593.975642] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.52@o2ib7: 0 seconds [Thu Dec 12 12:39:36 2019][17593.985726] LNet: 63453:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 7 previous similar messages [Thu Dec 12 12:39:43 2019][17600.397844] Lustre: fir-OST005c: haven't heard from client fir-MDT0001-mdtlov_UUID (at 10.0.10.52@o2ib7) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd7273b7400, cur 1576183183 expire 1576183033 last 1576182956 [Thu Dec 12 12:39:43 2019][17600.418544] Lustre: Skipped 5 previous similar messages [Thu Dec 12 12:39:50 2019][17607.508634] LustreError: 68539:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 12:39:50 2019][17607.520716] LustreError: 68539:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 14 previous similar messages [Thu Dec 12 12:39:55 2019][17612.420609] Lustre: fir-OST005a: haven't heard from client fir-MDT0001-mdtlov_UUID (at 10.0.10.52@o2ib7) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8da7e94f2800, cur 1576183195 expire 1576183045 last 1576182968 [Thu Dec 12 12:39:55 2019][17612.441188] Lustre: Skipped 1 previous similar message [Thu Dec 12 12:39:59 2019][17616.402155] Lustre: fir-OST0058: haven't heard from client fir-MDT0001-mdtlov_UUID (at 10.0.10.52@o2ib7) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd723c07000, cur 1576183199 expire 1576183049 last 1576182972 [Thu Dec 12 12:39:59 2019][17616.422755] Lustre: Skipped 2 previous similar messages [Thu Dec 12 12:40:23 2019][17640.637379] Lustre: fir-OST0054: Connection restored to 565dd543-352a-9672-d117-0e804d453e00 (at 10.9.103.28@o2ib4) [Thu Dec 12 12:40:23 2019][17640.647816] Lustre: Skipped 5 previous similar messages [Thu Dec 12 12:40:47 2019][17664.975938] LNetError: 63453:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Timed out tx: tx_queue, 1 seconds [Thu Dec 12 12:40:47 2019][17664.985938] LNetError: 63453:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Timed out RDMA with 10.0.10.52@o2ib7 (7): c: 0, oc: 0, rc: 8 [Thu Dec 12 12:42:11 2019][17748.976323] LNetError: 63453:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Timed out tx: tx_queue, 0 seconds [Thu Dec 12 12:42:11 2019][17748.986322] LNetError: 63453:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Timed out RDMA with 10.0.10.52@o2ib7 (16): c: 0, oc: 0, rc: 8 [Thu Dec 12 12:42:11 2019][17748.998566] LNetError: 63453:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. 
Health = 900 [Thu Dec 12 12:42:11 2019][17749.010578] LNetError: 63453:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 9 previous similar messages [Thu Dec 12 12:42:56 2019][17793.976513] LNetError: 63453:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Timed out tx: tx_queue, 0 seconds [Thu Dec 12 12:42:56 2019][17793.986509] LNetError: 63453:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Timed out RDMA with 10.0.10.52@o2ib7 (21): c: 0, oc: 0, rc: 8 [Thu Dec 12 12:45:32 2019][17949.949683] Lustre: fir-OST0054: Connection restored to fir-MDT0001-mdtlov_UUID (at 10.0.10.52@o2ib7) [Thu Dec 12 12:45:32 2019][17949.958909] Lustre: Skipped 1 previous similar message [Thu Dec 12 12:46:07 2019][17984.438562] LustreError: 167-0: fir-MDT0001-lwp-OST005c: This client was evicted by fir-MDT0001; in progress operations using this service will fail. [Thu Dec 12 12:46:07 2019][17984.451950] LustreError: Skipped 5 previous similar messages [Thu Dec 12 12:46:07 2019][17984.459521] Lustre: fir-MDT0001-lwp-OST005c: Connection restored to 10.0.10.52@o2ib7 (at 10.0.10.52@o2ib7) [Thu Dec 12 12:46:07 2019][17984.469177] Lustre: Skipped 7 previous similar messages [Thu Dec 12 12:47:16 2019][18053.685627] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1118269 to 0x1800000402:1118305 [Thu Dec 12 12:47:16 2019][18053.685628] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1125232 to 0x1900000400:1125249 [Thu Dec 12 12:47:16 2019][18053.685652] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1126760 to 0x1980000400:1126785 [Thu Dec 12 12:47:16 2019][18053.685672] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122969 to 0x1a80000401:1123009 [Thu Dec 12 12:47:16 2019][18053.719044] Lustre: fir-OST0056: deleting orphan objects from 0x1880000401:1118894 to 0x1880000401:1118913 [Thu Dec 12 12:47:16 2019][18053.729797] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1106341 to 0x1a00000402:1106401 [Thu Dec 12 12:50:25 2019][18242.407331] LustreError: 68656:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 12:50:25 2019][18242.419431] LustreError: 68656:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 52 previous similar messages [Thu Dec 12 13:01:14 2019][18892.090499] LustreError: 68430:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 13:01:14 2019][18892.102589] LustreError: 68430:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 34 previous similar messages [Thu Dec 12 13:11:29 2019][19507.116614] LustreError: 68747:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli b3fcac94-6a72-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 13:11:29 2019][19507.128702] LustreError: 68747:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 22 previous similar messages [Thu Dec 12 13:12:37 2019][19574.400572] Lustre: fir-OST0056: haven't heard from client f97f048d-b027-4 (at 10.8.9.1@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff8dd73d54ac00, cur 1576185157 expire 1576185007 last 1576184930 [Thu Dec 12 13:21:34 2019][20112.252629] LustreError: 68842:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 26dc458c-0aaf-4 claims 225280 GRANT, real grant 0 [Thu Dec 12 13:21:34 2019][20112.264797] LustreError: 68842:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 18 previous similar messages [Thu Dec 12 13:32:01 2019][20738.658772] LustreError: 68656:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 13:32:01 2019][20738.670874] LustreError: 68656:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 16 previous similar messages [Thu Dec 12 13:41:03 2019][21280.416431] Lustre: fir-OST0056: haven't heard from client 4359a6d6-39f4-3744-7f0f-dc517a2bb4c6 (at 10.8.28.3@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd73d06a400, cur 1576186863 expire 1576186713 last 1576186636 [Thu Dec 12 13:41:03 2019][21280.438060] Lustre: Skipped 5 previous similar messages [Thu Dec 12 13:42:17 2019][21355.045437] LustreError: 68471:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 13:42:17 2019][21355.057519] LustreError: 68471:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 14 previous similar messages [Thu Dec 12 13:53:07 2019][22004.712366] LustreError: 68556:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 13:53:07 2019][22004.724448] LustreError: 68556:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 15 previous similar messages [Thu Dec 12 14:03:01 2019][22598.955344] Lustre: fir-OST0054: Connection restored to (at 10.9.108.12@o2ib4) [Thu Dec 12 14:03:01 2019][22598.962666] Lustre: Skipped 4 previous similar messages [Thu Dec 12 14:03:41 2019][22639.017871] LustreError: 68583:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005a: cli 189dcd2e-de74-4 claims 16752640 GRANT, real grant 0 [Thu Dec 12 14:03:41 2019][22639.030212] LustreError: 68583:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 16 previous similar messages [Thu Dec 12 14:07:36 2019][22873.644338] Lustre: fir-OST0054: Connection restored to (at 10.8.28.3@o2ib6) [Thu Dec 12 14:07:36 2019][22873.651483] Lustre: Skipped 5 previous similar messages [Thu Dec 12 14:14:21 2019][23278.829430] LustreError: 68717:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli b3fcac94-6a72-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 14:14:21 2019][23278.841517] LustreError: 68717:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 16 previous similar messages [Thu Dec 12 14:24:32 2019][23890.384895] LustreError: 68717:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 14:24:32 2019][23890.396987] LustreError: 68717:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 12 previous similar messages [Thu Dec 12 14:34:42 2019][24499.587755] LustreError: 68556:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 14:34:42 2019][24499.599842] LustreError: 68556:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 14 previous similar messages [Thu Dec 12 14:45:13 2019][25130.502535] LustreError: 68627:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 14:45:13 2019][25130.514624] LustreError: 68627:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 22 previous similar messages [Thu Dec 12 14:55:50 2019][25768.076423] 
LustreError: 66433:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 14:55:50 2019][25768.088512] LustreError: 66433:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 15 previous similar messages [Thu Dec 12 15:05:57 2019][26375.131499] LustreError: 68487:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli b3fcac94-6a72-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 15:05:57 2019][26375.143583] LustreError: 68487:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 12 previous similar messages [Thu Dec 12 15:09:47 2019][26604.428999] Lustre: fir-OST0056: haven't heard from client 596201d1-c6e4-ccb1-282c-e46d8e32a779 (at 10.9.101.71@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd73d10fc00, cur 1576192187 expire 1576192037 last 1576191960 [Thu Dec 12 15:09:47 2019][26604.450788] Lustre: Skipped 5 previous similar messages [Thu Dec 12 15:16:19 2019][26997.183037] LustreError: 68515:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 15:16:19 2019][26997.195122] LustreError: 68515:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 15 previous similar messages [Thu Dec 12 15:26:41 2019][27619.201886] LustreError: 68735:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 15:26:41 2019][27619.214015] LustreError: 68735:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 13 previous similar messages [Thu Dec 12 15:36:59 2019][28237.188691] LustreError: 66434:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 15:36:59 2019][28237.200775] LustreError: 66434:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 14 previous similar messages [Thu Dec 12 15:47:14 2019][28852.135632] LustreError: 68476:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 15:47:14 2019][28852.147720] LustreError: 68476:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 14 previous similar messages [Thu Dec 12 15:48:06 2019][28903.768618] Lustre: fir-OST0054: Connection restored to (at 10.9.101.71@o2ib4) [Thu Dec 12 15:48:06 2019][28903.775944] Lustre: Skipped 5 previous similar messages [Thu Dec 12 15:57:24 2019][29462.138554] LustreError: 68620:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 15:57:24 2019][29462.150641] LustreError: 68620:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 15 previous similar messages [Thu Dec 12 16:08:11 2019][30108.876946] LustreError: 68705:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli b3fcac94-6a72-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 16:08:11 2019][30108.889030] LustreError: 68705:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 17 previous similar messages [Thu Dec 12 16:14:02 2019][30459.464280] Lustre: fir-OST005a: haven't heard from client af8d5000-0c68-4 (at 10.8.0.67@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd728ba7800, cur 1576196042 expire 1576195892 last 1576195815 [Thu Dec 12 16:14:02 2019][30459.484080] Lustre: Skipped 5 previous similar messages [Thu Dec 12 16:14:03 2019][30460.448190] Lustre: fir-OST0056: haven't heard from client af8d5000-0c68-4 (at 10.8.0.67@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff8dd73d06b400, cur 1576196043 expire 1576195893 last 1576195816 [Thu Dec 12 16:14:03 2019][30460.467992] Lustre: Skipped 1 previous similar message [Thu Dec 12 16:14:04 2019][30461.457761] Lustre: fir-OST0058: haven't heard from client af8d5000-0c68-4 (at 10.8.0.67@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd725150c00, cur 1576196044 expire 1576195894 last 1576195817 [Thu Dec 12 16:14:04 2019][30461.477576] Lustre: Skipped 1 previous similar message [Thu Dec 12 16:14:08 2019][30465.678354] Lustre: fir-OST0054: Connection restored to af8d5000-0c68-4 (at 10.8.0.67@o2ib6) [Thu Dec 12 16:14:08 2019][30465.686798] Lustre: Skipped 5 previous similar messages [Thu Dec 12 16:18:20 2019][30717.680473] LustreError: 68556:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 16:18:20 2019][30717.692555] LustreError: 68556:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 15 previous similar messages [Thu Dec 12 16:28:43 2019][31340.755359] LustreError: 68743:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 16:28:43 2019][31340.767441] LustreError: 68743:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 17 previous similar messages [Thu Dec 12 16:39:17 2019][31974.613759] LustreError: 68430:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli b3fcac94-6a72-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 16:39:17 2019][31974.625847] LustreError: 68430:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 14 previous similar messages [Thu Dec 12 16:50:18 2019][32635.737708] LustreError: 68467:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 16:50:18 2019][32635.749815] LustreError: 68467:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 14 previous similar messages [Thu Dec 12 17:00:25 2019][33242.460391] LustreError: 68394:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 17:00:25 2019][33242.472475] LustreError: 68394:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 18 previous similar messages [Thu Dec 12 17:10:33 2019][33851.278709] LustreError: 68513:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli b3fcac94-6a72-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 17:10:33 2019][33851.290794] LustreError: 68513:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 14 previous similar messages [Thu Dec 12 17:19:39 2019][34396.468951] Lustre: fir-OST005e: haven't heard from client 62f117dd-237d-c074-d679-5244422357ce (at 10.9.103.27@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff8dd725a53400, cur 1576199979 expire 1576199829 last 1576199752 [Thu Dec 12 17:19:39 2019][34396.490764] Lustre: Skipped 1 previous similar message [Thu Dec 12 17:21:20 2019][34498.162215] LustreError: 68764:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 17:21:20 2019][34498.174303] LustreError: 68764:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 12 previous similar messages [Thu Dec 12 17:26:41 2019][34818.563379] Lustre: fir-MDT0000-lwp-OST0056: Connection to fir-MDT0000 (at 10.0.10.51@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Thu Dec 12 17:26:41 2019][34818.579368] Lustre: Skipped 5 previous similar messages [Thu Dec 12 17:29:42 2019][34999.477710] Lustre: fir-OST0054: Connection restored to fir-MDT0000-mdtlov_UUID (at 10.0.10.51@o2ib7) [Thu Dec 12 17:29:42 2019][34999.486933] Lustre: Skipped 3 previous similar messages [Thu Dec 12 17:29:55 2019][35013.054872] Lustre: fir-OST005c: deleting orphan objects from 0x0:27193149 to 0x0:27193185 [Thu Dec 12 17:29:55 2019][35013.054916] Lustre: fir-OST005a: deleting orphan objects from 0x0:27562860 to 0x0:27562881 [Thu Dec 12 17:29:55 2019][35013.055030] Lustre: fir-OST005e: deleting orphan objects from 0x0:27468123 to 0x0:27468161 [Thu Dec 12 17:29:55 2019][35013.055264] Lustre: fir-OST0054: deleting orphan objects from 0x0:27458495 to 0x0:27458529 [Thu Dec 12 17:29:55 2019][35013.055265] Lustre: fir-OST0058: deleting orphan objects from 0x0:27507508 to 0x0:27507553 [Thu Dec 12 17:29:55 2019][35013.131307] Lustre: fir-OST0056: deleting orphan objects from 0x0:27479877 to 0x0:27479905 [Thu Dec 12 17:30:01 2019][35019.268335] LustreError: 167-0: fir-MDT0000-lwp-OST005a: This client was evicted by fir-MDT0000; in progress operations using this service will fail. [Thu Dec 12 17:30:01 2019][35019.281739] LustreError: Skipped 5 previous similar messages [Thu Dec 12 17:30:01 2019][35019.291554] Lustre: fir-MDT0000-lwp-OST0058: Connection restored to 10.0.10.51@o2ib7 (at 10.0.10.51@o2ib7) [Thu Dec 12 17:30:01 2019][35019.301208] Lustre: Skipped 5 previous similar messages [Thu Dec 12 17:31:30 2019][35107.589229] LustreError: 68490:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 17:31:30 2019][35107.601313] LustreError: 68490:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 13 previous similar messages [Thu Dec 12 17:36:54 2019][35431.470608] Lustre: fir-OST0054: haven't heard from client 38a705fd-9485-904c-b274-225796b849f2 (at 10.9.102.53@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff8dd727bee000, cur 1576201014 expire 1576200864 last 1576200787 [Thu Dec 12 17:36:54 2019][35431.492406] Lustre: Skipped 5 previous similar messages [Thu Dec 12 17:42:19 2019][35757.032752] LustreError: 68712:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 17:42:19 2019][35757.044837] LustreError: 68712:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 14 previous similar messages [Thu Dec 12 17:51:38 2019][36316.299116] Lustre: fir-OST0054: Connection restored to (at 10.9.103.27@o2ib4) [Thu Dec 12 17:51:38 2019][36316.306432] Lustre: Skipped 5 previous similar messages [Thu Dec 12 17:52:33 2019][36370.698737] LustreError: 68496:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli b3fcac94-6a72-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 17:52:33 2019][36370.710831] LustreError: 68496:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 16 previous similar messages [Thu Dec 12 18:03:26 2019][37023.854553] LustreError: 68406:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 18:03:26 2019][37023.866639] LustreError: 68406:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 13 previous similar messages [Thu Dec 12 18:13:44 2019][37641.473659] LustreError: 68777:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 32768 GRANT, real grant 0 [Thu Dec 12 18:13:44 2019][37641.485739] LustreError: 68777:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 12 previous similar messages [Thu Dec 12 18:15:51 2019][37768.956233] Lustre: fir-OST0054: Connection restored to (at 10.9.102.53@o2ib4) [Thu Dec 12 18:15:51 2019][37768.963556] Lustre: Skipped 5 previous similar messages [Thu Dec 12 18:24:03 2019][38260.995965] LustreError: 68525:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli b3fcac94-6a72-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 18:24:03 2019][38261.008047] LustreError: 68525:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 14 previous similar messages [Thu Dec 12 18:34:43 2019][38900.599781] LustreError: 68556:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 32768 GRANT, real grant 0 [Thu Dec 12 18:34:43 2019][38900.611908] LustreError: 68556:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 13 previous similar messages [Thu Dec 12 18:36:02 2019][38979.502707] Lustre: fir-OST005a: haven't heard from client 7de2709b-434b-c2b2-ee11-fe99c3a9d16f (at 10.9.101.51@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff8dd728473000, cur 1576204562 expire 1576204412 last 1576204335 [Thu Dec 12 18:45:26 2019][39543.626607] LustreError: 68531:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 18:45:26 2019][39543.638691] LustreError: 68531:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 15 previous similar messages [Thu Dec 12 18:56:17 2019][40195.069701] LustreError: 68556:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 18:56:17 2019][40195.081791] LustreError: 68556:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 15 previous similar messages [Thu Dec 12 19:00:58 2019][40476.292568] Lustre: fir-OST0054: Connection restored to c463879e-71d6-cfb3-b583-923d4925c479 (at 10.9.104.28@o2ib4) [Thu Dec 12 19:00:58 2019][40476.303005] Lustre: Skipped 5 previous similar messages [Thu Dec 12 19:06:34 2019][40812.160538] LustreError: 68650:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 19:06:34 2019][40812.172664] LustreError: 68650:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 10 previous similar messages [Thu Dec 12 19:12:36 2019][41173.860961] Lustre: fir-OST0054: Connection restored to 7de2709b-434b-c2b2-ee11-fe99c3a9d16f (at 10.9.101.51@o2ib4) [Thu Dec 12 19:12:36 2019][41173.871395] Lustre: Skipped 5 previous similar messages [Thu Dec 12 19:17:17 2019][41454.831756] LustreError: 68456:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005e: cli 34b263e7-c235-6737-be01-1bc0ec67d622 claims 28672 GRANT, real grant 0 [Thu Dec 12 19:17:17 2019][41454.845659] LustreError: 68456:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 7 previous similar messages [Thu Dec 12 19:27:32 2019][42070.054551] LustreError: 68741:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 19:27:32 2019][42070.066630] LustreError: 68741:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 51 previous similar messages [Thu Dec 12 19:38:31 2019][42729.369880] LustreError: 68475:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 19:38:31 2019][42729.381990] LustreError: 68475:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 11 previous similar messages [Thu Dec 12 19:39:03 2019][42761.358219] perf: interrupt took too long (2503 > 2500), lowering kernel.perf_event_max_sample_rate to 79000 [Thu Dec 12 19:48:41 2019][43338.940854] LustreError: 68741:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 19:48:41 2019][43338.952962] LustreError: 68741:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 10 previous similar messages [Thu Dec 12 19:59:23 2019][43980.720056] LustreError: 68632:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 19:59:23 2019][43980.732187] LustreError: 68632:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 14 previous similar messages [Thu Dec 12 20:09:28 2019][44585.778829] LustreError: 68712:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 20:09:28 2019][44585.790912] LustreError: 68712:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 13 previous similar messages [Thu Dec 12 20:19:32 2019][45190.321020] LustreError: 68870:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0056: cli 03dd52b8-a4fc-4 claims 36864 GRANT, real grant 0 [Thu Dec 
12 20:19:32 2019][45190.333128] LustreError: 68870:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 19 previous similar messages [Thu Dec 12 20:30:04 2019][45822.280833] LustreError: 68650:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 20:30:04 2019][45822.292962] LustreError: 68650:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 33 previous similar messages [Thu Dec 12 20:40:16 2019][46433.659729] LustreError: 68777:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 20:40:16 2019][46433.671808] LustreError: 68777:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 11 previous similar messages [Thu Dec 12 20:44:17 2019][46674.535307] Lustre: fir-OST0058: haven't heard from client 935b75df-613a-c7ad-95b7-8cbfb8326a67 (at 10.9.101.28@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8dd7244f2800, cur 1576212257 expire 1576212107 last 1576212030 [Thu Dec 12 20:44:17 2019][46674.557140] Lustre: Skipped 5 previous similar messages [Thu Dec 12 20:50:36 2019][47054.350294] LustreError: 68402:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 20:50:36 2019][47054.362467] LustreError: 68402:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 12 previous similar messages [Thu Dec 12 21:01:23 2019][47700.801476] LustreError: 68696:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 21:01:23 2019][47700.813609] LustreError: 68696:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 11 previous similar messages [Thu Dec 12 21:12:11 2019][48348.548461] LustreError: 68545:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 21:12:11 2019][48348.560555] LustreError: 68545:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 11 previous similar messages [Thu Dec 12 21:17:43 2019][48680.835819] Lustre: fir-OST0056: Connection restored to (at 10.9.109.8@o2ib4) [Thu Dec 12 21:17:43 2019][48680.843072] Lustre: Skipped 4 previous similar messages [Thu Dec 12 21:21:25 2019][48903.130743] Lustre: fir-OST0054: Connection restored to 935b75df-613a-c7ad-95b7-8cbfb8326a67 (at 10.9.101.28@o2ib4) [Thu Dec 12 21:21:25 2019][48903.141184] Lustre: Skipped 6 previous similar messages [Thu Dec 12 21:22:40 2019][48977.543508] LustreError: 68402:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 21:22:40 2019][48977.555587] LustreError: 68402:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 12 previous similar messages [Thu Dec 12 21:32:52 2019][49589.594292] LustreError: 68598:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 21:32:52 2019][49589.606396] LustreError: 68598:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 10 previous similar messages [Thu Dec 12 21:43:14 2019][50212.397155] LustreError: 68510:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 21:43:14 2019][50212.409260] LustreError: 68510:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 10 previous similar messages [Thu Dec 12 21:53:57 2019][50855.264423] LustreError: 68686:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 21:53:57 2019][50855.276510] LustreError: 
68686:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 10 previous similar messages [Thu Dec 12 22:04:14 2019][51472.051261] LustreError: 68519:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 22:04:14 2019][51472.063346] LustreError: 68519:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 10 previous similar messages [Thu Dec 12 22:14:44 2019][52101.990362] LustreError: 68484:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 22:14:44 2019][52102.002510] LustreError: 68484:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 9 previous similar messages [Thu Dec 12 22:25:03 2019][52721.353244] LustreError: 68442:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 22:25:03 2019][52721.365358] LustreError: 68442:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 9 previous similar messages [Thu Dec 12 22:35:55 2019][53372.876306] LustreError: 68656:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 30125e88-c9a3-4 claims 28672 GRANT, real grant 0 [Thu Dec 12 22:35:55 2019][53372.888394] LustreError: 68656:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 12 previous similar messages