[LU-2932] slow metadata performance at NOAA again Created: 08/Mar/13  Updated: 09/May/14  Resolved: 09/May/14

Status: Resolved
Project: Lustre
Component/s: None
Affects Version/s: Lustre 1.8.7
Fix Version/s: None

Type: Bug Priority: Major
Reporter: Oz Rentas Assignee: Cliff White (Inactive)
Resolution: Won't Fix Votes: 0
Labels: None

Attachments: Microsoft Word scr1mdtest.xlsx     Microsoft Word scr2mdtest.xlsx     File version_patches_b1_8_patches_increase-ldlm-hashbuckets.patch    
Severity: 3
Rank (Obsolete): 7049

 Description   

During the last downtime at NOAA, we discovered that the slow metadata performance reported in LU-2649 had returned. The strange thing is that when the clients were shut down, the performance returned to normal. The current theory is that the jobs that run on one filesystem are taking more/different locks than the jobs on the other. We are taking a downtime next Thursday to investigate this further, but would like some pointers as to what we should look for. Does this theory make sense?



 Comments   
Comment by Cliff White (Inactive) [ 08/Mar/13 ]

If you are taking a downtime, it would be good to try to isolate the problem as much as possible and eliminate any possible background load. If you can dial down to the smallest reproducer, that will help. You might run some baseline tests, such as lnet_selftest, to verify that other factors such as the network are still equal between the systems.
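
A minimal lnet_selftest sketch along these lines, with placeholder o2ib NIDs for one client and one server (adjust the NIDs and groups to the actual fabric before use):

# load the selftest module on the nodes involved, then drive the test from any one of them
modprobe lnet_selftest
export LST_SESSION=$$
lst new_session baseline
lst add_group cli 10.174.79.10@o2ib          # placeholder client NID
lst add_group srv 10.174.79.100@o2ib         # placeholder MDS/OSS NID
lst add_batch bulk
lst add_test --batch bulk --from cli --to srv brw read size=1M
lst run bulk
lst stat cli srv                             # watch rates for a while, then Ctrl-C
lst stop bulk
lst end_session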

From the last bug, I don't think you tried increasing MDT threads; that may be something to look at.
If the jobs are accessing data of the same size, in the same manner, and inside the same sort of file/directory structure, the locking shouldn't be much different. You could trace ldlm traffic, but the hit from enabling debug would likely obscure the result.

It might be good to strace some jobs and see if you can find where the time is going (-T, or with some versions -ttt), or if there are other differences between the jobs. Can you run the exact same task on both filesystems?
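
For example, something like the following on a login node (the mdtest arguments and paths are placeholders, not the exact benchmark used here):

# per-syscall wall time, timestamped, written to a file for later comparison between filesystems
strace -f -T -ttt -o /tmp/scratch2.strace mdtest -n 1000 -d /lustre/scratch2/strace_test
# or an aggregate per-syscall time summary, which is easier to diff between scratch1 and scratch2
strace -c -f mdtest -n 1000 -d /lustre/scratch1/strace_test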

Comment by Cliff White (Inactive) [ 08/Mar/13 ]

Again, from the previous bug, looking at disk hardware performance on the mdt would be good. sestets, iostat, or even sar would perhaps show something.
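
For instance (intervals are just a suggestion; run these on the MDS while the workload is active):

iostat -xk 5 12            # per-device utilization, await and queue depth, 5s samples
sar -d -p 5 12             # same view from sysstat with resolved device names
sar -r 5 12                # memory pressure on the MDS over the same window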

Comment by Kit Westneat (Inactive) [ 08/Mar/13 ]

Hi Cliff,

We've pretty much exhausted the hardware tests. It's really bizarre that simply shutting down the clients causes performance to go up significantly. I should mention that before we shut them down, we looked at the export stats to verify that there was absolutely no load on the system. Even with no load at all (all the stats files remaining constant), performance is degraded until the clients are shut down, and therefore evicted from the system. When the clients are put back into production, performance slowly starts going down again. After a day or two, it's back to the "bad" performance numbers. This, coupled with the dlmtraces from the last bug, makes me think it's some kind of issue in the lock manager that gets cleared when the clients and their locks are evicted.

As mentioned in the previous bug, we have been testing exclusively with mdtest. We hope to use this downtime to discover what applications or operations cause the filesystem to go into degraded performance mode.

Thanks,
Kit

Comment by Nate Pearlstein (Inactive) [ 08/Mar/13 ]

I think running perf with the same tests to show where the kernel is spending its time would be helpful as well. It should be less expensive than dlmtraces and will give a more global picture of what is going on.
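
A sketch of that, assuming perf is available for the kernel running on the MDS: sample all CPUs with call graphs for 30 seconds while mdtest runs on a client, then look at the hottest kernel symbols.

perf record -a -g -o /tmp/mds.perf.data -- sleep 30
perf report -i /tmp/mds.perf.data --sort symbol --stdio | head -40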

Comment by Dennis Nelson [ 19/Mar/13 ]

Can we get an update on this? The customer filesystems are mounted with -o flock. There are 2318 client systems mounting the two Lustre filesystems. We ran an mdtest benchmark on a login node with all jobs stopped in the cluster, and the results still showed degraded performance. Once the compute nodes were shut down, the same mdtest was run on a login node and the performance jumped dramatically. We are seeing this behavior on both filesystems, but the impact is much greater on the scratch2 filesystem. We are still looking for pointers from Intel on how to track down what causes this degradation and how to control it.

Comment by Cliff White (Inactive) [ 19/Mar/13 ]

Have you run the tests with strace as we requested earlier? Can you give us the data from your tests?

Comment by Dennis Nelson [ 19/Mar/13 ]

Sorry, I was on vacation last week and Kit is on vacation this week, so I did not get a chance to discuss it with him. I did not see anything in the ticket talking about an strace. We have a spreadsheet with the results of various mdtest runs made over the last month or so; I can attach it. You will notice that on both filesystems, after a restart of the clients, mdtest performance is good. After the nodes are put into production, mdtest performance drops. A drop is expected since there are jobs running on the filesystem, but you will see from the results that the drop is more significant on scratch2. Again, that can probably be attributed to the jobs running on the system, but we found that once the performance drops, stopping all jobs does not make it come back up. The performance only went back up after all of the clients were shut down. You can see that took place in the March 5th - March 6th timeframe.

Comment by Cliff White (Inactive) [ 19/Mar/13 ]

Well, the question seems to be: what is the difference between these two clusters at the moment you experience this performance delta?
If you can get into a state where you can run mdtest without other jobs on the system, and the performance delta is still visible, we'd like you to take a detailed look at client memory consumption, server memory consumption, and any errors or other indications in the system logs; basically you need to give us some idea of where the delta is occurring.
One system is in some way more loaded (memory, CPU, network), or being less responsive.
Are you running the same version of Lustre on both clusters?
Can you compare rpc_stats on the clients when running the same workloads?
When there is a mix of work running on the cluster, isolating the issue can be difficult. If you have a customer workload that sees different performance on the two clusters, strace (-T or -ttt) would provide some indication of where the delta in performance is occurring.
The last activity on this bug (08 March) indicated you were taking downtime to examine this in more detail - did that downtime happen?
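
One way to gather comparable client-side numbers, assuming the usual 1.8 proc layout (these paths are a sketch; adjust them to what actually exists on these clients):

for fs in scratch1 scratch2; do
    echo "== $fs =="
    cat /proc/fs/lustre/llite/${fs}-*/stats                            # VFS-level op counts/latencies
    cat /proc/fs/lustre/osc/${fs}-OST0000-osc-*/rpc_stats | head -25   # one OSC as a sample
done
# clearing the stats files (writing anything to them) just before an identical mdtest run on each
# filesystem makes the two snapshots directly comparable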

Comment by Dennis Nelson [ 19/Mar/13 ]

Cliff,

I am wondering if this is an indication of an issue.

[3/19/13 1:30:53 PM] Dennis Nelson: Snapshot from scratch1:
[3/19/13 1:30:54 PM] Dennis Nelson: ldlm.namespaces.scratch1-MDT0000-mdc-ffff880c133fa800.lock_count=127
ldlm.namespaces.scratch1-OST0000-osc-ffff880c133fa800.lock_count=9
ldlm.namespaces.scratch1-OST0001-osc-ffff880c133fa800.lock_count=9
ldlm.namespaces.scratch1-OST0002-osc-ffff880c133fa800.lock_count=7
ldlm.namespaces.scratch1-OST0003-osc-ffff880c133fa800.lock_count=6
ldlm.namespaces.scratch1-OST0004-osc-ffff880c133fa800.lock_count=12
ldlm.namespaces.scratch1-OST0005-osc-ffff880c133fa800.lock_count=5
ldlm.namespaces.scratch1-OST0006-osc-ffff880c133fa800.lock_count=4
ldlm.namespaces.scratch1-OST0007-osc-ffff880c133fa800.lock_count=4
ldlm.namespaces.scratch1-OST0008-osc-ffff880c133fa800.lock_count=2
ldlm.namespaces.scratch1-OST0009-osc-ffff880c133fa800.lock_count=6
ldlm.namespaces.scratch1-OST000a-osc-ffff880c133fa800.lock_count=5
ldlm.namespaces.scratch1-OST000b-osc-ffff880c133fa800.lock_count=7
ldlm.namespaces.scratch1-OST000c-osc-ffff880c133fa800.lock_count=8
ldlm.namespaces.scratch1-OST000d-osc-ffff880c133fa800.lock_count=3
ldlm.namespaces.scratch1-OST000e-osc-ffff880c133fa800.lock_count=4
ldlm.namespaces.scratch1-OST000f-osc-ffff880c133fa800.lock_count=9
ldlm.namespaces.scratch1-OST0010-osc-ffff880c133fa800.lock_count=5
[3/19/13 1:31:02 PM] Dennis Nelson: snapshot from scratch2:
[3/19/13 1:31:17 PM] Dennis Nelson: ldlm.namespaces.scratch2-MDT0000-mdc-ffff881809580000.lock_count=2400
ldlm.namespaces.scratch2-OST0000-osc-ffff881809580000.lock_count=2400
ldlm.namespaces.scratch2-OST0001-osc-ffff881809580000.lock_count=2400
ldlm.namespaces.scratch2-OST0002-osc-ffff881809580000.lock_count=2400
ldlm.namespaces.scratch2-OST0003-osc-ffff881809580000.lock_count=2400
ldlm.namespaces.scratch2-OST0004-osc-ffff881809580000.lock_count=2400
ldlm.namespaces.scratch2-OST0005-osc-ffff881809580000.lock_count=2400
ldlm.namespaces.scratch2-OST0006-osc-ffff881809580000.lock_count=2400
ldlm.namespaces.scratch2-OST0007-osc-ffff881809580000.lock_count=2400
ldlm.namespaces.scratch2-OST0008-osc-ffff881809580000.lock_count=2400
ldlm.namespaces.scratch2-OST0009-osc-ffff881809580000.lock_count=2400
ldlm.namespaces.scratch2-OST000a-osc-ffff881809580000.lock_count=2400
ldlm.namespaces.scratch2-OST000b-osc-ffff881809580000.lock_count=2400
ldlm.namespaces.scratch2-OST000c-osc-ffff881809580000.lock_count=2400
ldlm.namespaces.scratch2-OST000d-osc-ffff881809580000.lock_count=2400
ldlm.namespaces.scratch2-OST000e-osc-ffff881809580000.lock_count=2400
ldlm.namespaces.scratch2-OST000f-osc-ffff881809580000.lock_count=2400

Comment by Dennis Nelson [ 19/Mar/13 ]

There are a total of 176 OSTs on scratch1 and 220 OSTs on scratch2; I did not show all of them. But scratch2 shows all of them pegged at 2400. I'd like some guidance on whether this might be the cause of the slowness, and also on how to track down what is causing this difference between the two filesystems.

Comment by Oleg Drokin [ 19/Mar/13 ]

It's not so much an indication of an issue, but it does show that scratch2 is much more actively used by your clients, and in such a way that it actually accesses all OSTs too, which is not typical. Did you forget to shut down some sort of locatedb-like job for this filesystem, I wonder?

Comment by Dennis Nelson [ 19/Mar/13 ]

Cliff, the customer is willing to take some downtime, but before that takes place, we need a test plan. They want to have a plan in place before we schedule the downtime. I'd like some guidance on what path we should go down to determine what resources are being used up to cause this issue. We know that rebooting the clients releases whatever resources are being used; the problem is figuring out what those resources are.

Comment by Dennis Nelson [ 19/Mar/13 ]

Is it possible for an engineer to join us tomorrow morning on a conference call to help build an action plan of what to look for in resolving this issue? We can start a WebEx session and look at anything regarding the configuration.

Comment by Oleg Drokin [ 20/Mar/13 ]

Even before the downtime, can you grab the contents of the /proc/fs/lustre/ldlm/namespaces/mdt*/pool/* files on the MDS? (The name might be a bit different - I don't have a 1.8 system at hand to see the exact name for the MDT service; it could potentially be called MDT or MDS or scratch2-MDT/MDS.)

Also on the MDS, grab the contents of /proc/meminfo and /proc/slabinfo, and run top to see what's using CPU. I wonder if ldlm_poold shows up anywhere near the top, and how much wall clock time it has used.
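
Something like the following would capture all of the above in one pass (the pool path is the guess described above and may not exist on this 1.8 MDS):

d=/tmp/mds-snapshot.$(date +%s); mkdir -p $d
cat /proc/fs/lustre/ldlm/namespaces/*MDT0000*/pool/* > $d/ldlm_pool.txt 2>/dev/null
cp /proc/meminfo /proc/slabinfo $d/
top -b -n 1 > $d/top.txt
grep ldlm_poold $d/top.txt                   # cumulative CPU time of the pool recalc daemon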

Do all clients' lock stats look like this: ldlm.namespaces.scratch2-MDT0000-mdc-ffff881809580000.lock_count=2400? Or just some of them? (2318*2400 = 5563200, which is way too much, I would suspect, so there must be some nodes that are using fewer locks - if so, is there any pattern in the nodes that use a lot of locks?) If you drop the locks on one such client (echo clear >/proc/fs/lustre/ldlm/namespaces/scratch2-MDT0000-mdc-*/lru_size), how quickly does the count go back up? If quickly, what are the most active processes in top?
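
A hypothetical way to rank the clients, assuming pdsh (or similar) reaches the compute nodes - the hostlist below is a placeholder:

pdsh -w compute[0001-2318] \
    'cat /proc/fs/lustre/ldlm/namespaces/scratch2-MDT0000-mdc-*/lock_count' 2>/dev/null \
    | sort -t: -k2 -rn | head -20            # the 20 heaviest lock holders

# then, on one heavy client, drop its LRU and watch how fast the count climbs back
echo clear > /proc/fs/lustre/ldlm/namespaces/scratch2-MDT0000-mdc-*/lru_size
watch -n 5 'cat /proc/fs/lustre/ldlm/namespaces/scratch2-MDT0000-mdc-*/lock_count'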

What does the metadata activity breakdown look like on the MDS (a file like /proc/fs/lustre/mdt/lustre-MDT0000/md_stats, adapted to your config)?
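
For example (on 1.8 the file may be /proc/fs/lustre/mds/scratch2-MDT0000/stats rather than .../mdt/.../md_stats; this is a sketch either way):

f=$(ls /proc/fs/lustre/mds/scratch2-MDT0000/stats /proc/fs/lustre/mdt/scratch2-MDT0000/md_stats 2>/dev/null | head -1)
cat $f > /tmp/md.1; sleep 60; cat $f > /tmp/md.2
diff /tmp/md.1 /tmp/md.2                     # shows which ops (open, getattr, unlink, ...) are growing fastest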

Have you already ruled out clients (at least some, like the login nodes) doing wide filesystem scans on a regular basis, like updatedb, find /lustre ..., and the like?
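
A quick check on a login node, using distro-default paths as an assumption:

grep -iE 'lustre|scratch' /etc/updatedb.conf     # the Lustre mounts should appear in PRUNEPATHS (or lustre in PRUNEFS)
ls /etc/cron.daily/ /etc/cron.d/ 2>/dev/null
crontab -l 2>/dev/null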

Do you regain a bit of performance with every client being unmounted, or are there some key clients that bump performance quite a bit once unmounted? (If so, how are those clients special?) I imagine that if you just bump the high lock-using clients identified in one of the previous steps, you might regain quite a bit of your performance even without shutting the rest of the clients down - worth trying.

Comment by Dennis Nelson [ 20/Mar/13 ]

There are no pool subdirectories under /proc/fs/lustre/ldlm on the MDS. Perhaps this is what you want:

[root@lfs-mds-2-2 mds-scratch2-MDT0000_UUID]# pwd
/proc/fs/lustre/ldlm/namespaces/mds-scratch2-MDT0000_UUID

[root@lfs-mds-2-2 mds-scratch2-MDT0000_UUID]# ls
contended_locks lock_count max_nolock_bytes
contention_seconds lock_timeouts resource_count
[root@lfs-mds-2-2 mds-scratch2-MDT0000_UUID]# cat contended_locks
32
[root@lfs-mds-2-2 mds-scratch2-MDT0000_UUID]# cat contention_seconds
2
[root@lfs-mds-2-2 mds-scratch2-MDT0000_UUID]# cat lock_count
1289976
[root@lfs-mds-2-2 mds-scratch2-MDT0000_UUID]# cat lock_timeouts
0
[root@lfs-mds-2-2 mds-scratch2-MDT0000_UUID]# cat max_nolock_bytes
0
[root@lfs-mds-2-2 mds-scratch2-MDT0000_UUID]# cat resource_count
830982

[root@lfs-mds-2-2 mds-scratch2-MDT0000_UUID]# cat /proc/meminfo
MemTotal: 74040648 kB
MemFree: 295656 kB
Buffers: 67193528 kB
Cached: 123020 kB
SwapCached: 256 kB
Active: 18290328 kB
Inactive: 49170020 kB
HighTotal: 0 kB
HighFree: 0 kB
LowTotal: 74040648 kB
LowFree: 295656 kB
SwapTotal: 2097144 kB
SwapFree: 2096604 kB
Dirty: 8809560 kB
Writeback: 0 kB
AnonPages: 144036 kB
Mapped: 70544 kB
Slab: 4757588 kB
PageTables: 13056 kB
NFS_Unstable: 0 kB
Bounce: 0 kB
CommitLimit: 39117468 kB
Committed_AS: 216928 kB
VmallocTotal: 34359738367 kB
VmallocUsed: 1255852 kB
VmallocChunk: 34358454703 kB
HugePages_Total: 0
HugePages_Free: 0
HugePages_Rsvd: 0
Hugepagesize: 2048 kB

[root@lfs-mds-2-2 mds-scratch2-MDT0000_UUID]# top -b -n 1
top - 11:27:59 up 23 days, 14:24, 1 user, load average: 4.65, 3.45, 4.48
Tasks: 1054 total, 3 running, 1050 sleeping, 0 stopped, 1 zombie
Cpu(s): 0.0%us, 4.4%sy, 0.0%ni, 95.0%id, 0.4%wa, 0.1%hi, 0.0%si, 0.0%st
Mem: 74040648k total, 73767384k used, 273264k free, 67197684k buffers
Swap: 2097144k total, 540k used, 2096604k free, 125792k cached

PID PPID RUSER USER CODE DATA SHR TIME+ %CPU %MEM PR NI S VIRT SWAP RES UID COMMAND
6752 1 root root 24 65m 57m 0:31.46 0.0 0.1 0 -18 S 148m 79m 68m 0 [dmeventd]
9069 1 root root 12 16m 1252 107:18.18 0.0 0.0 18 0 S 112m 94m 17m 0 /usr/bin/perl -w /usr/bin/collectl -D
15155 15098 hacluste hacluste 92 14m 2504 1:12.04 0.0 0.0 15 0 S 78028 59m 17m 100 /usr/lib64/heartbeat/cib
15157 15098 root root 72 352 6216 0:09.28 0.0 0.0 15 0 S 66128 57m 6824 0 /usr/lib64/heartbeat/stonithd
15098 1 root root 180 1828 4708 64:49.78 1.9 0.0 -2 0 S 49652 41m 6700 0 heartbeat: master control process
15148 15098 root root 180 1560 4708 0:22.32 0.0 0.0 -2 0 S 49384 41m 6432 0 heartbeat: write: ucast ib3
15149 15098 root root 180 1560 4708 4:02.33 0.0 0.0 -2 0 S 49384 41m 6432 0 heartbeat: read: ucast ib3
15146 15098 root root 180 1556 4708 0:28.73 0.0 0.0 -2 0 S 49380 41m 6428 0 heartbeat: write: ucast ib3
15147 15098 root root 180 1556 4708 0:00.00 0.0 0.0 -2 0 S 49380 41m 6428 0 heartbeat: read: ucast ib3
15142 15098 root root 180 1552 4708 0:22.04 0.0 0.0 -2 0 S 49376 41m 6424 0 heartbeat: write: ucast ib3
15143 15098 root root 180 1552 4708 0:00.00 0.0 0.0 -2 0 S 49376 41m 6424 0 heartbeat: read: ucast ib3
15144 15098 root root 180 1552 4708 0:22.47 0.0 0.0 -2 0 S 49376 41m 6424 0 heartbeat: write: ucast ib3
15145 15098 root root 180 1552 4708 0:00.00 0.0 0.0 -2 0 S 49376 41m 6424 0 heartbeat: read: ucast ib3
15136 15098 root root 180 1548 4708 0:26.42 0.0 0.0 -2 0 S 49372 41m 6420 0 heartbeat: write: ucast ib2
15137 15098 root root 180 1548 4708 6:27.08 0.0 0.0 -2 0 S 49372 41m 6420 0 heartbeat: read: ucast ib2
15138 15098 root root 180 1548 4708 0:23.93 0.0 0.0 -2 0 S 49372 41m 6420 0 heartbeat: write: ucast ib3
15139 15098 root root 180 1548 4708 0:00.00 0.0 0.0 -2 0 S 49372 41m 6420 0 heartbeat: read: ucast ib3
15140 15098 root root 180 1548 4708 0:29.64 0.0 0.0 -2 0 S 49372 41m 6420 0 heartbeat: write: ucast ib3
15141 15098 root root 180 1548 4708 0:00.00 0.0 0.0 -2 0 S 49372 41m 6420 0 heartbeat: read: ucast ib3
15132 15098 root root 180 1544 4708 0:20.97 0.0 0.0 -2 0 S 49368 41m 6416 0 heartbeat: write: ucast ib2
15133 15098 root root 180 1544 4708 0:00.00 0.0 0.0 -2 0 S 49368 41m 6416 0 heartbeat: read: ucast ib2
15134 15098 root root 180 1544 4708 0:33.46 0.0 0.0 -2 0 S 49368 41m 6416 0 heartbeat: write: ucast ib2
15135 15098 root root 180 1544 4708 0:00.00 0.0 0.0 -2 0 S 49368 41m 6416 0 heartbeat: read: ucast ib2
15126 15098 root root 180 1540 4708 0:24.60 0.0 0.0 -2 0 S 49364 41m 6412 0 heartbeat: write: ucast ib2
15127 15098 root root 180 1540 4708 0:00.00 0.0 0.0 -2 0 S 49364 41m 6412 0 heartbeat: read: ucast ib2
15128 15098 root root 180 1540 4708 0:26.57 0.0 0.0 -2 0 S 49364 41m 6412 0 heartbeat: write: ucast ib2
15129 15098 root root 180 1540 4708 0:00.00 0.0 0.0 -2 0 S 49364 41m 6412 0 heartbeat: read: ucast ib2
15130 15098 root root 180 1540 4708 0:20.68 0.0 0.0 -2 0 S 49364 41m 6412 0 heartbeat: write: ucast ib2
15131 15098 root root 180 1540 4708 0:00.00 0.0 0.0 -2 0 S 49364 41m 6412 0 heartbeat: read: ucast ib2
15124 15098 root root 180 1536 4708 0:20.14 0.0 0.0 -2 0 S 49360 41m 6408 0 heartbeat: write: ucast ib1
15125 15098 root root 180 1536 4708 7:04.06 0.0 0.0 -2 0 S 49360 41m 6408 0 heartbeat: read: ucast ib1
15118 15098 root root 180 1532 4708 0:21.33 0.0 0.0 -2 0 S 49356 41m 6404 0 heartbeat: write: ucast ib1
15119 15098 root root 180 1532 4708 0:00.00 0.0 0.0 -2 0 S 49356 41m 6404 0 heartbeat: read: ucast ib1
15120 15098 root root 180 1532 4708 0:30.19 0.0 0.0 -2 0 S 49356 41m 6404 0 heartbeat: write: ucast ib1
15121 15098 root root 180 1532 4708 0:00.00 0.0 0.0 -2 0 S 49356 41m 6404 0 heartbeat: read: ucast ib1
15122 15098 root root 180 1532 4708 0:25.99 0.0 0.0 -2 0 S 49356 41m 6404 0 heartbeat: write: ucast ib1
15123 15098 root root 180 1532 4708 0:00.00 0.0 0.0 -2 0 S 49356 41m 6404 0 heartbeat: read: ucast ib1
15114 15098 root root 180 1528 4708 0:21.75 0.0 0.0 -2 0 S 49352 41m 6400 0 heartbeat: write: ucast ib1
15115 15098 root root 180 1528 4708 0:00.00 0.0 0.0 -2 0 S 49352 41m 6400 0 heartbeat: read: ucast ib1
15116 15098 root root 180 1528 4708 0:28.30 0.0 0.0 -2 0 S 49352 41m 6400 0 heartbeat: write: ucast ib1
15117 15098 root root 180 1528 4708 0:00.00 0.0 0.0 -2 0 S 49352 41m 6400 0 heartbeat: read: ucast ib1
15108 15098 root root 180 1524 4708 0:27.08 0.0 0.0 -2 0 S 49348 41m 6396 0 heartbeat: write: ucast eth0
15109 15098 root root 180 1524 4708 0:00.00 0.0 0.0 -2 0 S 49348 41m 6396 0 heartbeat: read: ucast eth0
15110 15098 root root 180 1524 4708 0:20.75 0.0 0.0 -2 0 S 49348 41m 6396 0 heartbeat: write: ucast eth0
15111 15098 root root 180 1524 4708 0:00.00 0.0 0.0 -2 0 S 49348 41m 6396 0 heartbeat: read: ucast eth0
15112 15098 root root 180 1524 4708 0:26.87 0.0 0.0 -2 0 S 49348 41m 6396 0 heartbeat: write: ucast eth0
15113 15098 root root 180 1524 4708 7:13.08 0.0 0.0 -2 0 S 49348 41m 6396 0 heartbeat: read: ucast eth0
15106 15098 root root 180 1520 4708 0:22.39 0.0 0.0 -2 0 S 49344 41m 6392 0 heartbeat: write: ucast eth0
15107 15098 root root 180 1520 4708 0:00.00 0.0 0.0 -2 0 S 49344 41m 6392 0 heartbeat: read: ucast eth0
15101 15098 root root 180 1460 4708 0:00.39 0.0 0.0 -2 0 S 49284 41m 6332 0 heartbeat: FIFO reader
15102 15098 root root 180 1456 4708 0:28.36 0.0 0.0 -2 0 S 49280 41m 6328 0 heartbeat: write: ucast eth0
15103 15098 root root 180 1456 4708 0:00.00 0.0 0.0 -2 0 S 49280 41m 6328 0 heartbeat: read: ucast eth0
15104 15098 root root 180 1456 4708 0:31.50 0.0 0.0 -2 0 S 49280 41m 6328 0 heartbeat: write: ucast eth0
15105 15098 root root 180 1456 4708 0:00.00 0.0 0.0 -2 0 S 49280 41m 6328 0 heartbeat: read: ucast eth0
8948 1 ntp ntp 460 888 3804 1:04.16 0.0 0.0 15 0 S 19212 13m 4908 38 ntpd -u ntp:ntp -p /var/run/ntpd.pid -g
7514 1 root root 256 65m 2824 11:40.43 0.0 0.0 RT 0 S 88588 81m 4676 0 /sbin/multipathd
15159 15098 hacluste hacluste 248 1504 2512 0:03.91 0.0 0.0 15 0 S 73008 67m 4088 100 /usr/lib64/heartbeat/crmd
24548 8915 root root 384 816 2596 0:00.10 0.0 0.0 16 0 S 88080 82m 3324 0 sshd: root@pts/0
15156 15098 root root 76 1148 1520 2:26.55 0.0 0.0 15 0 S 63684 59m 2800 0 /usr/lib64/heartbeat/lrmd -r
15158 15098 hacluste hacluste 24 352 2064 0:05.57 0.0 0.0 16 0 S 65724 61m 2468 100 /usr/lib64/heartbeat/attrd
14704 14699 postfix postfix 252 704 1884 0:01.44 0.0 0.0 15 0 S 54412 50m 2432 89 qmgr -l -t fifo -u
14699 1 root root 132 584 1796 0:01.33 0.0 0.0 15 0 S 54172 50m 2324 0 /usr/libexec/postfix/master
11395 14699 postfix postfix 192 588 1804 0:00.00 0.0 0.0 16 0 S 54236 50m 2308 89 pickup -l -t fifo -u
15154 15098 hacluste hacluste 72 756 1284 0:04.70 0.0 0.0 15 0 S 36128 33m 1984 100 /usr/lib64/heartbeat/ccm
27920 24558 root root 56 1168 740 0:00.05 5.7 0.0 15 0 R 13424 11m 1756 0 top -b -n 1
24558 24548 root root 712 544 1216 0:00.06 0.0 0.0 15 0 S 66204 63m 1664 0 -bash
8610 1 root root 344 1108 584 0:10.02 0.0 0.0 15 0 S 13404 11m 1548 0 syslog-ng -p /var/run/syslog-ng.pid
8891 1 root root 208 153m 1124 0:12.90 0.0 0.0 20 0 S 183m 181m 1544 0 automount
27819 15156 root root 712 424 1060 0:00.00 0.0 0.0 18 0 S 66084 63m 1420 0 /bin/sh /usr/lib/ocf/resource.d//pacemaker/ping monitor
27876 15156 root root 712 424 1060 0:00.00 0.0 0.0 18 0 S 66084 63m 1416 0 /bin/sh /usr/lib/ocf/resource.d//pacemaker/ping monitor
8915 1 root root 384 636 648 0:06.34 0.0 0.0 15 0 S 62648 59m 1212 0 /usr/sbin/sshd
8974 1 root root 48 1012 596 0:00.95 0.0 0.0 15 0 S 74832 71m 1160 0 crond
8734 1 root root 20 320 828 0:13.10 0.0 0.0 15 0 S 35616 33m 1128 0 ha_logd: read process
8932 1 root root 152 520 680 0:00.00 0.0 0.0 22 0 S 21664 20m 892 0 xinetd -stayalive -pidfile /var/run/xinetd.pid
8364 1 root root 156 74m 588 0:02.95 0.0 0.0 11 -4 S 92880 89m 864 0 auditd
2158 1 root root 60 524 392 0:00.41 0.0 0.0 12 -4 S 12692 11m 856 0 /sbin/udevd -d
8366 8364 root root 84 74m 628 0:01.29 0.0 0.0 8 -8 S 81820 79m 828 0 /sbin/audispd
27918 27819 root root 712 424 388 0:00.00 0.0 0.0 18 0 S 66084 63m 752 0 /bin/sh /usr/lib/ocf/resource.d//pacemaker/ping monitor
27923 27876 root root 712 424 388 0:00.00 0.0 0.0 18 0 S 66084 63m 744 0 /bin/sh /usr/lib/ocf/resource.d//pacemaker/ping monitor
8503 1 dbus dbus 292 340 488 0:00.00 0.0 0.0 22 0 S 21276 20m 708 81 dbus-daemon --system
1 0 root root 36 316 584 0:04.24 0.0 0.0 15 0 S 10368 9672 696 0 init [3]
8761 8734 root root 20 320 420 0:10.21 0.0 0.0 15 0 S 35616 34m 680 0 ha_logd: write process
27919 27918 root root 36 360 508 0:00.00 0.0 0.0 16 0 S 6044 5428 616 0 ping -n -q -W 2 -c 2 10.174.79.230
27924 27923 root root 36 360 508 0:00.00 0.0 0.0 21 0 S 6044 5428 616 0 ping -n -q -W 2 -c 2 10.174.80.20
8765 1 root root 152 10m 480 277:55.57 0.0 0.0 18 0 D 18476 17m 608 0 /usr/sbin/memlogd -b -l 10 -f /var/log/memlog
9098 1 root root 264 464 336 0:00.94 0.0 0.0 18 0 S 18436 17m 604 0 /usr/sbin/smartd -q never
8417 1 rpc rpc 36 284 472 0:00.00 0.0 0.0 23 0 S 8072 7484 588 32 portmap
8593 1 root root 20 272 464 0:00.00 0.0 0.0 23 0 S 3820 3264 556 0 /usr/sbin/acpid
9108 1 root root 16 280 476 0:00.00 0.0 0.0 17 0 S 3820 3272 548 0 /sbin/agetty -L ttyS1 115200 vt100
9101 1 root root 12 272 424 0:00.00 0.0 0.0 17 0 S 3812 3316 496 0 /sbin/mingetty tty1
9102 1 root root 12 272 424 0:00.00 0.0 0.0 17 0 S 3812 3316 496 0 /sbin/mingetty tty2
9105 1 root root 12 272 424 0:00.00 0.0 0.0 18 0 S 3812 3316 496 0 /sbin/mingetty tty5
9103 1 root root 12 272 424 0:00.00 0.0 0.0 18 0 S 3812 3320 492 0 /sbin/mingetty tty3
9104 1 root root 12 272 424 0:00.00 0.0 0.0 18 0 S 3812 3320 492 0 /sbin/mingetty tty4
9107 1 root root 12 272 424 0:00.00 0.0 0.0 19 0 S 3812 3320 492 0 /sbin/mingetty tty6
8999 1 root root 20 328 304 0:00.00 0.0 0.0 18 0 S 18752 17m 452 0 /usr/sbin/atd
8397 1 root root 24 324 252 0:28.87 0.0 0.0 18 0 S 10780 10m 392 0 irqbalance
8961 1 root root 88 288 284 0:00.00 0.0 0.0 18 0 S 6476 6104 372 0 gpm -m /dev/input/mice -t exps2
2 1 root root 0 0 0 0:01.41 0.0 0.0 RT -5 S 0 0 0 0 [migration/0]
3 1 root root 0 0 0 0:03.23 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/0]
4 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/0]
5 1 root root 0 0 0 0:02.25 0.0 0.0 RT -5 S 0 0 0 0 [migration/1]
6 1 root root 0 0 0 0:42.60 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/1]
7 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/1]
8 1 root root 0 0 0 0:00.79 0.0 0.0 RT -5 S 0 0 0 0 [migration/2]
9 1 root root 0 0 0 0:33.22 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/2]
10 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/2]
11 1 root root 0 0 0 0:00.44 0.0 0.0 RT -5 S 0 0 0 0 [migration/3]
12 1 root root 0 0 0 0:08.54 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/3]
13 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/3]
14 1 root root 0 0 0 0:00.43 0.0 0.0 RT -5 S 0 0 0 0 [migration/4]
15 1 root root 0 0 0 0:02.90 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/4]
16 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/4]
17 1 root root 0 0 0 0:00.36 0.0 0.0 RT -5 S 0 0 0 0 [migration/5]
18 1 root root 0 0 0 0:02.21 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/5]
19 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/5]
20 1 root root 0 0 0 0:03.01 0.0 0.0 RT -5 S 0 0 0 0 [migration/6]
21 1 root root 0 0 0 0:03.01 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/6]
22 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/6]
23 1 root root 0 0 0 0:00.46 0.0 0.0 RT -5 S 0 0 0 0 [migration/7]
24 1 root root 0 0 0 0:02.00 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/7]
25 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/7]
26 1 root root 0 0 0 0:00.59 0.0 0.0 RT -5 S 0 0 0 0 [migration/8]
27 1 root root 0 0 0 0:01.34 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/8]
28 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/8]
29 1 root root 0 0 0 0:00.55 0.0 0.0 RT -5 S 0 0 0 0 [migration/9]
30 1 root root 0 0 0 0:01.58 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/9]
31 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/9]
32 1 root root 0 0 0 0:00.56 0.0 0.0 RT -5 S 0 0 0 0 [migration/10]
33 1 root root 0 0 0 0:02.28 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/10]
34 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/10]
35 1 root root 0 0 0 0:00.42 0.0 0.0 RT -5 S 0 0 0 0 [migration/11]
36 1 root root 0 0 0 0:01.03 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/11]
37 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/11]
38 1 root root 0 0 0 0:00.17 0.0 0.0 RT -5 S 0 0 0 0 [migration/12]
39 1 root root 0 0 0 0:04.75 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/12]
40 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/12]
41 1 root root 0 0 0 0:00.67 0.0 0.0 RT -5 S 0 0 0 0 [migration/13]
42 1 root root 0 0 0 0:09.88 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/13]
43 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/13]
44 1 root root 0 0 0 0:00.28 0.0 0.0 RT -5 S 0 0 0 0 [migration/14]
45 1 root root 0 0 0 0:07.69 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/14]
46 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/14]
47 1 root root 0 0 0 0:00.18 0.0 0.0 RT -5 S 0 0 0 0 [migration/15]
48 1 root root 0 0 0 0:05.68 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/15]
49 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/15]
50 1 root root 0 0 0 0:00.15 0.0 0.0 RT -5 S 0 0 0 0 [migration/16]
51 1 root root 0 0 0 0:05.58 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/16]
52 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/16]
53 1 root root 0 0 0 0:00.21 0.0 0.0 RT -5 S 0 0 0 0 [migration/17]
54 1 root root 0 0 0 0:02.38 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/17]
55 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/17]
56 1 root root 0 0 0 0:00.10 0.0 0.0 RT -5 S 0 0 0 0 [migration/18]
57 1 root root 0 0 0 0:02.40 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/18]
58 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/18]
59 1 root root 0 0 0 0:00.28 0.0 0.0 RT -5 S 0 0 0 0 [migration/19]
60 1 root root 0 0 0 0:02.88 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/19]
61 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/19]
62 1 root root 0 0 0 0:00.13 0.0 0.0 RT -5 S 0 0 0 0 [migration/20]
63 1 root root 0 0 0 0:02.16 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/20]
64 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/20]
65 1 root root 0 0 0 0:00.12 0.0 0.0 RT -5 S 0 0 0 0 [migration/21]
66 1 root root 0 0 0 0:02.45 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/21]
67 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/21]
68 1 root root 0 0 0 0:00.13 0.0 0.0 RT -5 S 0 0 0 0 [migration/22]
69 1 root root 0 0 0 0:02.87 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/22]
70 1 root root 0 0 0 0:00.00 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/22]
71 1 root root 0 0 0 0:00.25 0.0 0.0 RT -5 S 0 0 0 0 [migration/23]
72 1 root root 0 0 0 0:00.42 0.0 0.0 34 19 S 0 0 0 0 [ksoftirqd/23]
73 1 root root 0 0 0 0:00.12 0.0 0.0 RT -5 S 0 0 0 0 [watchdog/23]
74 1 root root 0 0 0 0:00.09 0.0 0.0 10 -5 S 0 0 0 0 [events/0]
75 1 root root 0 0 0 0:00.19 0.0 0.0 10 -5 S 0 0 0 0 [events/1]
76 1 root root 0 0 0 0:00.20 0.0 0.0 10 -5 S 0 0 0 0 [events/2]
77 1 root root 0 0 0 0:00.08 0.0 0.0 10 -5 S 0 0 0 0 [events/3]
78 1 root root 0 0 0 0:00.03 0.0 0.0 10 -5 S 0 0 0 0 [events/4]
79 1 root root 0 0 0 0:00.04 0.0 0.0 10 -5 S 0 0 0 0 [events/5]
80 1 root root 0 0 0 0:00.04 0.0 0.0 10 -5 S 0 0 0 0 [events/6]
81 1 root root 0 0 0 0:00.05 0.0 0.0 10 -5 S 0 0 0 0 [events/7]
82 1 root root 0 0 0 0:00.04 0.0 0.0 10 -5 S 0 0 0 0 [events/8]
83 1 root root 0 0 0 0:00.03 0.0 0.0 10 -5 S 0 0 0 0 [events/9]
84 1 root root 0 0 0 0:00.04 0.0 0.0 10 -5 S 0 0 0 0 [events/10]
85 1 root root 0 0 0 0:00.02 0.0 0.0 10 -5 S 0 0 0 0 [events/11]
86 1 root root 0 0 0 0:00.13 0.0 0.0 10 -5 S 0 0 0 0 [events/12]
87 1 root root 0 0 0 0:00.20 0.0 0.0 10 -5 S 0 0 0 0 [events/13]
88 1 root root 0 0 0 0:00.12 0.0 0.0 10 -5 S 0 0 0 0 [events/14]
89 1 root root 0 0 0 0:00.09 0.0 0.0 10 -5 S 0 0 0 0 [events/15]
90 1 root root 0 0 0 0:00.04 0.0 0.0 10 -5 S 0 0 0 0 [events/16]
91 1 root root 0 0 0 0:00.27 0.0 0.0 10 -5 S 0 0 0 0 [events/17]
92 1 root root 0 0 0 0:00.05 0.0 0.0 10 -5 S 0 0 0 0 [events/18]
93 1 root root 0 0 0 0:00.22 0.0 0.0 10 -5 S 0 0 0 0 [events/19]
94 1 root root 0 0 0 0:00.04 0.0 0.0 10 -5 S 0 0 0 0 [events/20]
95 1 root root 0 0 0 0:00.05 0.0 0.0 10 -5 S 0 0 0 0 [events/21]
96 1 root root 0 0 0 0:00.02 0.0 0.0 10 -5 S 0 0 0 0 [events/22]
97 1 root root 0 0 0 0:09.71 0.0 0.0 10 -5 S 0 0 0 0 [events/23]
98 1 root root 0 0 0 0:15.37 0.0 0.0 10 -5 S 0 0 0 0 [khelper]
683 1 root root 0 0 0 0:00.01 0.0 0.0 16 -5 S 0 0 0 0 [kthread]
711 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/0]
712 683 root root 0 0 0 0:01.07 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/1]
713 683 root root 0 0 0 0:00.72 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/2]
714 683 root root 0 0 0 0:00.16 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/3]
715 683 root root 0 0 0 0:00.02 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/4]
716 683 root root 0 0 0 0:00.02 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/5]
717 683 root root 0 0 0 0:00.02 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/6]
718 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/7]
719 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/8]
720 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/9]
721 683 root root 0 0 0 0:00.01 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/10]
722 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/11]
723 683 root root 0 0 0 0:02.59 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/12]
724 683 root root 0 0 0 0:35.94 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/13]
725 683 root root 0 0 0 0:29.31 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/14]
726 683 root root 0 0 0 0:08.34 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/15]
727 683 root root 0 0 0 0:03.25 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/16]
728 683 root root 0 0 0 0:02.57 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/17]
729 683 root root 0 0 0 0:01.17 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/18]
730 683 root root 0 0 0 0:00.83 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/19]
731 683 root root 0 0 0 0:00.61 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/20]
732 683 root root 0 0 0 0:00.57 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/21]
733 683 root root 0 0 0 0:00.86 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/22]
734 683 root root 0 0 0 0:00.35 0.0 0.0 10 -5 S 0 0 0 0 [kblockd/23]
735 683 root root 0 0 0 0:00.00 0.0 0.0 20 -5 S 0 0 0 0 [kacpid]
946 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [cqueue/0]
947 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [cqueue/1]
948 683 root root 0 0 0 0:00.00 0.0 0.0 13 -5 S 0 0 0 0 [cqueue/2]
949 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [cqueue/3]
950 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [cqueue/4]
951 683 root root 0 0 0 0:00.00 0.0 0.0 16 -5 S 0 0 0 0 [cqueue/5]
952 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [cqueue/6]
953 683 root root 0 0 0 0:00.00 0.0 0.0 18 -5 S 0 0 0 0 [cqueue/7]
954 683 root root 0 0 0 0:00.00 0.0 0.0 18 -5 S 0 0 0 0 [cqueue/8]
955 683 root root 0 0 0 0:00.00 0.0 0.0 18 -5 S 0 0 0 0 [cqueue/9]
956 683 root root 0 0 0 0:00.00 0.0 0.0 18 -5 S 0 0 0 0 [cqueue/10]
957 683 root root 0 0 0 0:00.00 0.0 0.0 18 -5 S 0 0 0 0 [cqueue/11]
958 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [cqueue/12]
959 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [cqueue/13]
960 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [cqueue/14]
961 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [cqueue/15]
962 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [cqueue/16]
963 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [cqueue/17]
964 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [cqueue/18]
965 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [cqueue/19]
966 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [cqueue/20]
967 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [cqueue/21]
968 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [cqueue/22]
969 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [cqueue/23]
972 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [khubd]
974 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [kseriod]
1226 683 root root 0 0 0 0:00.17 0.0 0.0 15 0 S 0 0 0 0 [khungtaskd]
1227 683 root root 0 0 0 0:36.70 0.0 0.0 15 0 S 0 0 0 0 [pdflush]
1228 683 root root 0 0 0 17:05.51 0.0 0.0 15 0 S 0 0 0 0 [pdflush]
1229 683 root root 0 0 0 0:47.46 0.0 0.0 10 -5 S 0 0 0 0 [kswapd0]
1230 683 root root 0 0 0 0:29.39 0.0 0.0 11 -5 S 0 0 0 0 [kswapd1]
1231 683 root root 0 0 0 0:00.00 0.0 0.0 20 -5 S 0 0 0 0 [aio/0]
1232 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [aio/1]
1233 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [aio/2]
1234 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [aio/3]
1235 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [aio/4]
1236 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [aio/5]
1237 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [aio/6]
1238 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [aio/7]
1239 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [aio/8]
1240 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [aio/9]
1241 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [aio/10]
1242 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [aio/11]
1243 683 root root 0 0 0 0:00.00 0.0 0.0 11 -5 S 0 0 0 0 [aio/12]
1244 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [aio/13]
1245 683 root root 0 0 0 0:00.00 0.0 0.0 13 -5 S 0 0 0 0 [aio/14]
1246 683 root root 0 0 0 0:00.00 0.0 0.0 13 -5 S 0 0 0 0 [aio/15]
1247 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [aio/16]
1248 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [aio/17]
1249 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [aio/18]
1250 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [aio/19]
1251 683 root root 0 0 0 0:00.00 0.0 0.0 16 -5 S 0 0 0 0 [aio/20]
1252 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [aio/21]
1253 683 root root 0 0 0 0:00.00 0.0 0.0 18 -5 S 0 0 0 0 [aio/22]
1254 683 root root 0 0 0 0:00.00 0.0 0.0 18 -5 S 0 0 0 0 [aio/23]
1421 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [kpsmoused]
1702 683 root root 0 0 0 0:00.01 0.0 0.0 10 -5 S 0 0 0 0 [mpt_poll_0]
1703 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [mpt/0]
1704 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [scsi_eh_0]
1755 683 root root 0 0 0 0:00.00 0.0 0.0 13 -5 S 0 0 0 0 [ata/0]
1756 683 root root 0 0 0 0:00.00 0.0 0.0 13 -5 S 0 0 0 0 [ata/1]
1757 683 root root 0 0 0 0:00.00 0.0 0.0 13 -5 S 0 0 0 0 [ata/2]
1758 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [ata/3]
1759 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [ata/4]
1760 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [ata/5]
1761 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [ata/6]
1762 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [ata/7]
1763 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [ata/8]
1764 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [ata/9]
1765 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [ata/10]
1766 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [ata/11]
1767 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [ata/12]
1768 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [ata/13]
1769 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [ata/14]
1770 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ata/15]
1771 683 root root 0 0 0 0:00.00 0.0 0.0 16 -5 S 0 0 0 0 [ata/16]
1772 683 root root 0 0 0 0:00.00 0.0 0.0 16 -5 S 0 0 0 0 [ata/17]
1773 683 root root 0 0 0 0:00.00 0.0 0.0 16 -5 S 0 0 0 0 [ata/18]
1774 683 root root 0 0 0 0:00.00 0.0 0.0 16 -5 S 0 0 0 0 [ata/19]
1775 683 root root 0 0 0 0:00.00 0.0 0.0 16 -5 S 0 0 0 0 [ata/20]
1776 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ata/21]
1777 683 root root 0 0 0 0:00.00 0.0 0.0 16 -5 S 0 0 0 0 [ata/22]
1778 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ata/23]
1779 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ata_aux]
1805 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [scsi_eh_1]
1806 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [scsi_eh_2]
1807 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [scsi_eh_3]
1808 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [scsi_eh_4]
1809 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [scsi_eh_5]
1810 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [scsi_eh_6]
1861 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [scsi_eh_7]
1862 683 root root 0 0 0 0:00.00 0.0 0.0 0 -20 S 0 0 0 0 [qla2xxx_7_dpc]
1863 683 root root 0 0 0 0:00.28 0.0 0.0 11 -5 S 0 0 0 0 [scsi_wq_7]
1864 683 root root 0 0 0 0:00.00 0.0 0.0 13 -5 S 0 0 0 0 [fc_wq_7]
1865 683 root root 0 0 0 0:00.00 0.0 0.0 13 -5 S 0 0 0 0 [fc_dl_7]
1866 683 root root 0 0 0 0:00.00 0.0 0.0 13 -5 S 0 0 0 0 [scsi_eh_8]
1867 683 root root 0 0 0 0:00.00 0.0 0.0 0 -20 S 0 0 0 0 [qla2xxx_8_dpc]
1868 683 root root 0 0 0 0:00.29 0.0 0.0 10 -5 S 0 0 0 0 [scsi_wq_8]
1869 683 root root 0 0 0 0:00.00 0.0 0.0 13 -5 S 0 0 0 0 [fc_wq_8]
1870 683 root root 0 0 0 0:00.00 0.0 0.0 13 -5 S 0 0 0 0 [fc_dl_8]
1896 683 root root 0 0 0 0:00.00 0.0 0.0 13 -5 S 0 0 0 0 [kstriped]
1997 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [ksnapd]
2100 683 root root 0 0 0 0:18.01 0.0 0.0 10 -5 S 0 0 0 0 [kjournald]
2125 683 root root 0 0 0 0:00.34 0.0 0.0 10 -5 S 0 0 0 0 [kauditd]
3073 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [mlx4_err]
4444 1 root root 0 0 0 1:38.13 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_32]
4445 1 root root 0 0 0 1:38.04 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_33]
4446 1 root root 0 0 0 1:37.43 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_34]
4447 1 root root 0 0 0 1:37.48 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_35]
4448 1 root root 0 0 0 1:37.49 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_36]
4449 1 root root 0 0 0 1:38.19 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_37]
4450 1 root root 0 0 0 1:38.67 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_38]
4451 1 root root 0 0 0 1:37.32 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_39]
5063 1 root root 0 0 0 8:29.66 0.0 0.0 15 0 S 0 0 0 0 [ll_mgs_06]
5064 1 root root 0 0 0 8:28.58 0.0 0.0 15 0 S 0 0 0 0 [ll_mgs_07]
5120 683 root root 0 0 0 0:00.01 0.0 0.0 10 -5 S 0 0 0 0 [mlx4_sense]
5125 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [mlx4_sense]
5154 683 root root 0 0 0 0:00.00 0.0 0.0 11 -5 S 0 0 0 0 [mlx4_en]
5155 683 root root 0 0 0 0:00.00 0.0 0.0 11 -5 S 0 0 0 0 [mlx4_en]
5232 683 root root 0 0 0 0:20.96 0.0 0.0 10 -5 S 0 0 0 0 [ib_mad1]
5233 683 root root 0 0 0 0:10.45 0.0 0.0 10 -5 S 0 0 0 0 [ib_mad2]
5234 683 root root 0 0 0 0:00.85 0.0 0.0 10 -5 S 0 0 0 0 [ib_mad1]
5235 683 root root 0 0 0 0:14.10 0.0 0.0 10 -5 S 0 0 0 0 [ib_mad2]
5919 683 root root 0 0 0 0:00.00 0.0 0.0 11 -5 S 0 0 0 0 [kmpathd/0]
5920 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [kmpathd/1]
5921 683 root root 0 0 0 0:00.00 0.0 0.0 13 -5 S 0 0 0 0 [kmpathd/2]
5922 683 root root 0 0 0 0:00.00 0.0 0.0 13 -5 S 0 0 0 0 [kmpathd/3]
5923 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [kmpathd/4]
5924 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [kmpathd/5]
5925 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [kmpathd/6]
5926 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [kmpathd/7]
5927 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [kmpathd/8]
5928 683 root root 0 0 0 0:00.00 0.0 0.0 16 -5 S 0 0 0 0 [kmpathd/9]
5929 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [kmpathd/10]
5930 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [kmpathd/11]
5931 683 root root 0 0 0 0:00.00 0.0 0.0 18 -5 S 0 0 0 0 [kmpathd/12]
5932 683 root root 0 0 0 0:00.00 0.0 0.0 19 -5 S 0 0 0 0 [kmpathd/13]
5933 683 root root 0 0 0 0:00.00 0.0 0.0 19 -5 S 0 0 0 0 [kmpathd/14]
5934 683 root root 0 0 0 0:00.00 0.0 0.0 19 -5 S 0 0 0 0 [kmpathd/15]
5935 683 root root 0 0 0 0:00.00 0.0 0.0 19 -5 S 0 0 0 0 [kmpathd/16]
5936 683 root root 0 0 0 0:00.00 0.0 0.0 20 -5 S 0 0 0 0 [kmpathd/17]
5937 683 root root 0 0 0 0:00.00 0.0 0.0 20 -5 S 0 0 0 0 [kmpathd/18]
5938 683 root root 0 0 0 0:00.00 0.0 0.0 20 -5 S 0 0 0 0 [kmpathd/19]
5939 683 root root 0 0 0 0:00.00 0.0 0.0 20 -5 S 0 0 0 0 [kmpathd/20]
5940 683 root root 0 0 0 0:00.00 0.0 0.0 20 -5 S 0 0 0 0 [kmpathd/21]
5941 683 root root 0 0 0 0:00.00 0.0 0.0 20 -5 S 0 0 0 0 [kmpathd/22]
5942 683 root root 0 0 0 0:00.00 0.0 0.0 20 -5 S 0 0 0 0 [kmpathd/23]
5943 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [kmpath_handlerd]
6477 683 root root 0 0 0 0:00.00 0.0 0.0 11 -5 S 0 0 0 0 [kjournald]
6484 683 root root 0 0 0 0:00.20 0.0 0.0 10 -5 S 0 0 0 0 [kjournald]
6496 683 root root 0 0 0 0:13.69 0.0 0.0 10 -5 S 0 0 0 0 [kjournald]
6831 683 root root 0 0 0 0:00.00 0.0 0.0 11 -5 S 0 0 0 0 [ib_mcast]
6832 683 root root 0 0 0 0:00.00 0.0 0.0 11 -5 S 0 0 0 0 [ib_inform]
6833 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [local_sa]
6862 683 root root 0 0 0 0:01.33 0.0 0.0 10 -5 S 0 0 0 0 [ib_addr]
6891 683 root root 0 0 0 0:00.00 0.0 0.0 13 -5 S 0 0 0 0 [iw_cm_wq]
6920 683 root root 0 0 0 0:02.79 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/0]
6921 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/1]
6922 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ib_cm/2]
6923 683 root root 0 0 0 0:00.04 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/3]
6924 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/4]
6925 683 root root 0 0 0 0:00.05 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/5]
6926 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/6]
6927 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ib_cm/7]
6928 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ib_cm/8]
6929 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/9]
6930 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ib_cm/10]
6931 683 root root 0 0 0 0:00.59 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/11]
6932 683 root root 0 0 0 0:00.02 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/12]
6933 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/13]
6934 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/14]
6935 683 root root 0 0 0 0:01.21 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/15]
6936 683 root root 0 0 0 0:00.72 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/16]
6937 683 root root 0 0 0 0:18.41 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/17]
6938 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/18]
6939 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ib_cm/19]
6940 683 root root 0 0 0 0:00.00 0.0 0.0 11 -5 S 0 0 0 0 [ib_cm/20]
6941 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/21]
6942 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/22]
6943 683 root root 0 0 0 0:01.14 0.0 0.0 10 -5 S 0 0 0 0 [ib_cm/23]
7060 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [rdma_cm]
7224 683 root root 0 0 0 0:00.02 0.0 0.0 10 -5 S 0 0 0 0 [ipoib]
7253 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [sdp]
7283 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [krdsd]
7312 683 root root 0 0 0 0:00.00 0.0 0.0 16 -5 S 0 0 0 0 [ib_fmr(mlx4_0)]
7349 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ib_fmr(mlx4_1)]
7412 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [iscsi_eh]
8674 683 root root 0 0 0 73:39.14 0.0 0.0 34 19 S 0 0 0 0 [kipmi0]
12544 683 root root 0 0 0 0:31.67 0.0 0.0 10 -5 S 0 0 0 0 [kcopyd]
12819 683 root root 0 0 0 0:00.13 0.0 0.0 10 -5 S 0 0 0 0 [kmmpd-dm-20]
12820 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [jbd2/dm-20-8]
12821 683 root root 0 0 0 0:00.00 0.0 0.0 16 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12822 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12823 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12824 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12825 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12826 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12827 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12828 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12829 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12830 683 root root 0 0 0 0:00.00 0.0 0.0 16 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12831 683 root root 0 0 0 0:00.00 0.0 0.0 16 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12832 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12833 683 root root 0 0 0 0:00.00 0.0 0.0 18 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12834 683 root root 0 0 0 0:00.00 0.0 0.0 19 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12835 683 root root 0 0 0 0:00.00 0.0 0.0 20 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12836 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12837 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12838 683 root root 0 0 0 0:00.00 0.0 0.0 20 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12839 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12840 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12841 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12842 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12843 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12844 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [ldiskfs-dio-unw]
12845 1 root root 0 0 0 1:08.85 0.0 0.0 15 0 S 0 0 0 0 [ldlm_bl_00]
12846 1 root root 0 0 0 1:08.67 0.0 0.0 15 0 S 0 0 0 0 [ldlm_bl_01]
12847 1 root root 0 0 0 2:37.07 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_00]
12849 1 root root 0 0 0 2:36.99 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_01]
12850 1 root root 0 0 0 0:38.90 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_00]
12851 1 root root 0 0 0 0:39.18 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_01]
12852 1 root root 0 0 0 0:00.14 0.0 0.0 15 0 S 0 0 0 0 [ldlm_elt]
12854 1 root root 0 0 0 8:40.61 0.0 0.0 15 0 S 0 0 0 0 [ll_mgs_00]
12855 1 root root 0 0 0 8:42.17 0.0 0.0 15 0 S 0 0 0 0 [ll_mgs_01]
12856 1 root root 0 0 0 0:00.93 0.0 0.0 15 0 S 0 0 0 0 [ll_evictor]
12857 1 root root 0 0 0 8:41.36 0.0 0.0 15 0 R 0 0 0 0 [ll_mgs_02]
12858 1 root root 0 0 0 8:42.64 0.0 0.0 15 0 S 0 0 0 0 [ll_mgs_03]
12989 1 root root 0 0 0 8:43.07 0.0 0.0 15 0 S 0 0 0 0 [ll_mgs_04]
13002 1 root root 0 0 0 8:44.06 0.0 0.0 15 0 S 0 0 0 0 [ll_mgs_05]
13344 683 root root 0 0 0 0:00.34 0.0 0.0 10 -5 S 0 0 0 0 [kmmpd-dm-22]
13345 683 root root 0 0 0 198:05.15 0.0 0.0 10 -5 S 0 0 0 0 [jbd2/dm-22-8]
13346 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13347 683 root root 0 0 0 0:00.00 0.0 0.0 11 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13348 683 root root 0 0 0 0:00.00 0.0 0.0 11 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13349 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13350 683 root root 0 0 0 0:00.00 0.0 0.0 11 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13351 683 root root 0 0 0 0:00.00 0.0 0.0 11 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13352 683 root root 0 0 0 0:00.00 0.0 0.0 11 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13353 683 root root 0 0 0 0:00.00 0.0 0.0 11 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13354 683 root root 0 0 0 0:00.00 0.0 0.0 11 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13355 683 root root 0 0 0 0:00.00 0.0 0.0 11 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13356 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13357 683 root root 0 0 0 0:00.00 0.0 0.0 11 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13358 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13359 683 root root 0 0 0 0:00.00 0.0 0.0 12 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13360 683 root root 0 0 0 0:00.00 0.0 0.0 13 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13361 683 root root 0 0 0 0:00.00 0.0 0.0 13 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13362 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13363 683 root root 0 0 0 0:00.00 0.0 0.0 14 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13364 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13365 683 root root 0 0 0 0:00.00 0.0 0.0 15 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13366 683 root root 0 0 0 0:00.00 0.0 0.0 16 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13367 683 root root 0 0 0 0:00.00 0.0 0.0 16 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13368 683 root root 0 0 0 0:00.00 0.0 0.0 17 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13369 683 root root 0 0 0 0:00.00 0.0 0.0 18 -5 S 0 0 0 0 [ldiskfs-dio-unw]
13378 1 root root 0 0 0 203:06.57 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_00]
13379 1 root root 0 0 0 203:10.70 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_01]
13380 1 root root 0 0 0 203:17.27 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_02]
13381 1 root root 0 0 0 203:29.64 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_03]
13382 1 root root 0 0 0 203:06.25 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_04]
13383 1 root root 0 0 0 203:27.58 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_05]
13384 1 root root 0 0 0 203:44.13 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_06]
13385 1 root root 0 0 0 203:19.51 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_07]
13386 1 root root 0 0 0 203:06.79 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_08]
13387 1 root root 0 0 0 203:19.36 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_09]
13389 1 root root 0 0 0 203:15.20 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_10]
13390 1 root root 0 0 0 203:36.70 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_11]
13391 1 root root 0 0 0 203:01.10 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_12]
13392 1 root root 0 0 0 203:24.40 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_13]
13393 1 root root 0 0 0 203:19.63 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_14]
13394 1 root root 0 0 0 203:27.79 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_15]
13395 1 root root 0 0 0 203:01.07 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_16]
13396 1 root root 0 0 0 203:17.84 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_17]
13397 1 root root 0 0 0 203:30.05 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_18]
13401 1 root root 0 0 0 203:02.19 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_19]
13402 1 root root 0 0 0 203:08.25 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_20]
13403 1 root root 0 0 0 203:12.70 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_21]
13404 1 root root 0 0 0 203:13.89 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_22]
13405 1 root root 0 0 0 203:05.79 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_23]
13406 1 root root 0 0 0 203:07.14 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_24]
13407 1 root root 0 0 0 203:08.31 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_25]
13408 1 root root 0 0 0 203:06.53 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_26]
13409 1 root root 0 0 0 202:58.92 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_27]
13410 1 root root 0 0 0 203:06.74 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_28]
13411 1 root root 0 0 0 203:19.21 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_29]
13412 1 root root 0 0 0 203:13.59 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_30]
13413 1 root root 0 0 0 203:17.47 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_31]
13414 1 root root 0 0 0 2:20.67 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_00]
13415 1 root root 0 0 0 2:21.23 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_01]
13416 1 root root 0 0 0 2:21.51 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_02]
13417 1 root root 0 0 0 2:21.17 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_03]
13418 1 root root 0 0 0 2:21.70 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_04]
13419 1 root root 0 0 0 2:21.01 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_05]
13420 1 root root 0 0 0 2:20.63 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_06]
13421 1 root root 0 0 0 2:20.09 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_07]
13422 1 root root 0 0 0 2:20.61 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_08]
13423 1 root root 0 0 0 2:21.51 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_09]
13424 1 root root 0 0 0 2:20.94 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_10]
13425 1 root root 0 0 0 2:21.03 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_11]
13426 1 root root 0 0 0 2:20.91 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_12]
13427 1 root root 0 0 0 2:20.39 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_13]
13428 1 root root 0 0 0 2:19.82 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_14]
13429 1 root root 0 0 0 2:20.89 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_15]
13430 1 root root 0 0 0 2:20.48 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_16]
13431 1 root root 0 0 0 2:21.05 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_17]
13432 1 root root 0 0 0 2:20.95 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_18]
13433 1 root root 0 0 0 2:21.00 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_attr_19]
13434 1 root root 0 0 0 2:20.72 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_20]
13435 1 root root 0 0 0 2:20.90 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_21]
13436 1 root root 0 0 0 2:20.52 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_22]
13437 1 root root 0 0 0 2:21.26 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_23]
13438 1 root root 0 0 0 2:20.15 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_24]
13439 1 root root 0 0 0 2:20.59 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_25]
13440 1 root root 0 0 0 2:20.88 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_26]
13441 1 root root 0 0 0 2:21.11 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_27]
13442 1 root root 0 0 0 2:20.78 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_28]
13443 1 root root 0 0 0 2:20.79 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_29]
13444 1 root root 0 0 0 2:20.99 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_30]
13445 1 root root 0 0 0 2:20.43 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_attr_31]
13448 1 root root 0 0 0 12:00.92 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_00]
13449 1 root root 0 0 0 11:56.41 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_01]
13802 1 root root 0 0 0 12:13.70 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_02]
13850 1 root root 0 0 0 203:18.98 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_32]
13851 1 root root 0 0 0 203:08.20 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_33]
13852 1 root root 0 0 0 203:23.87 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_34]
13853 1 root root 0 0 0 202:58.41 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_35]
13854 1 root root 0 0 0 203:35.22 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_36]
13855 1 root root 0 0 0 203:09.28 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_37]
13856 1 root root 0 0 0 203:15.01 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_38]
13857 1 root root 0 0 0 203:08.45 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_39]
13858 1 root root 0 0 0 203:05.21 1.9 0.0 15 0 S 0 0 0 0 [ll_mdt_40]
13859 1 root root 0 0 0 203:21.78 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_41]
13860 1 root root 0 0 0 203:02.73 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_42]
13861 1 root root 0 0 0 203:04.67 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_43]
13862 1 root root 0 0 0 203:22.59 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_44]
13865 1 root root 0 0 0 202:56.44 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_45]
13866 1 root root 0 0 0 203:08.42 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_46]
13894 1 root root 0 0 0 202:57.45 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_47]
13895 1 root root 0 0 0 203:09.10 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_48]
13896 1 root root 0 0 0 203:18.10 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_49]
13897 1 root root 0 0 0 202:59.63 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_50]
13898 1 root root 0 0 0 203:15.63 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_51]
13899 1 root root 0 0 0 203:35.58 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_52]
13900 1 root root 0 0 0 203:01.87 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_53]
13903 1 root root 0 0 0 203:04.70 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_54]
13904 1 root root 0 0 0 203:17.16 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_55]
13905 1 root root 0 0 0 203:20.68 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_56]
13906 1 root root 0 0 0 202:58.52 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_57]
13907 1 root root 0 0 0 203:06.59 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_58]
13908 1 root root 0 0 0 203:03.35 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_59]
13909 1 root root 0 0 0 203:09.64 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_60]
13910 1 root root 0 0 0 202:58.88 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_61]
13911 1 root root 0 0 0 203:12.85 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_62]
13912 1 root root 0 0 0 203:20.78 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_63]
13913 1 root root 0 0 0 202:56.61 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_64]
13914 1 root root 0 0 0 203:08.93 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_65]
13915 1 root root 0 0 0 203:23.10 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_66]
13916 1 root root 0 0 0 203:08.36 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_67]
13917 1 root root 0 0 0 203:16.66 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_68]
13918 1 root root 0 0 0 203:06.35 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_69]
13919 1 root root 0 0 0 203:16.26 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_70]
13920 1 root root 0 0 0 202:52.21 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_71]
13921 1 root root 0 0 0 203:13.37 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_72]
13922 1 root root 0 0 0 203:25.76 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_73]
13923 1 root root 0 0 0 202:59.21 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_74]
13924 1 root root 0 0 0 203:29.27 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_75]
13925 1 root root 0 0 0 203:17.78 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_76]
13926 1 root root 0 0 0 202:59.37 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_77]
13927 1 root root 0 0 0 203:07.31 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_78]
13928 1 root root 0 0 0 202:59.65 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_79]
13929 1 root root 0 0 0 202:53.87 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_80]
13930 1 root root 0 0 0 203:07.72 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_81]
13931 1 root root 0 0 0 202:58.91 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_82]
13932 1 root root 0 0 0 203:04.61 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_83]
13933 1 root root 0 0 0 203:22.91 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_84]
13934 1 root root 0 0 0 203:04.01 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_85]
13935 1 root root 0 0 0 203:15.58 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_86]
13936 1 root root 0 0 0 203:29.93 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_87]
13937 1 root root 0 0 0 203:33.28 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_88]
13938 1 root root 0 0 0 203:12.85 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_89]
13939 1 root root 0 0 0 203:13.83 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_90]
13940 1 root root 0 0 0 202:52.79 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_91]
13941 1 root root 0 0 0 203:12.98 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_92]
13942 1 root root 0 0 0 203:11.37 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_93]
13943 1 root root 0 0 0 203:19.84 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_94]
13944 1 root root 0 0 0 203:09.27 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_95]
13945 1 root root 0 0 0 203:09.92 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_96]
13946 1 root root 0 0 0 203:10.62 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_97]
13947 1 root root 0 0 0 203:03.10 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_98]
13948 1 root root 0 0 0 203:12.18 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_99]
13949 1 root root 0 0 0 203:19.86 1.9 0.0 15 0 S 0 0 0 0 [ll_mdt_100]
13950 1 root root 0 0 0 203:13.88 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_101]
13951 1 root root 0 0 0 203:18.22 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_102]
13952 1 root root 0 0 0 202:55.70 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_103]
13953 1 root root 0 0 0 203:10.49 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_104]
13954 1 root root 0 0 0 202:56.66 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_105]
13955 1 root root 0 0 0 203:11.19 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_106]
13956 1 root root 0 0 0 203:00.90 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_107]
13957 1 root root 0 0 0 203:09.72 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_108]
13958 1 root root 0 0 0 203:00.63 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_109]
13959 1 root root 0 0 0 203:03.17 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_110]
13960 1 root root 0 0 0 203:11.16 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_111]
13961 1 root root 0 0 0 203:20.73 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_112]
13962 1 root root 0 0 0 203:04.70 1.9 0.0 15 0 S 0 0 0 0 [ll_mdt_113]
13963 1 root root 0 0 0 203:11.39 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_114]
13964 1 root root 0 0 0 202:56.59 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_115]
13965 1 root root 0 0 0 203:03.18 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_116]
13966 1 root root 0 0 0 203:03.72 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_117]
13967 1 root root 0 0 0 203:27.15 1.9 0.0 15 0 S 0 0 0 0 [ll_mdt_118]
13968 1 root root 0 0 0 203:19.15 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_119]
13969 1 root root 0 0 0 203:12.41 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_120]
13970 1 root root 0 0 0 203:22.83 0.0 0.0 16 0 R 0 0 0 0 [ll_mdt_121]
13971 1 root root 0 0 0 203:07.49 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_122]
13972 1 root root 0 0 0 202:58.56 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_123]
13973 1 root root 0 0 0 203:15.04 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_124]
13974 1 root root 0 0 0 203:19.18 1.9 0.0 15 0 S 0 0 0 0 [ll_mdt_125]
13975 1 root root 0 0 0 203:19.28 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_126]
13976 1 root root 0 0 0 203:04.32 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_127]
13992 1 root root 0 0 0 11:50.86 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_112]
13993 1 root root 0 0 0 12:01.77 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_113]
13994 1 root root 0 0 0 11:55.57 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_114]
13995 1 root root 0 0 0 12:06.45 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_115]
13996 1 root root 0 0 0 12:01.49 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_116]
13997 1 root root 0 0 0 11:48.07 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_117]
13998 1 root root 0 0 0 11:34.37 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_118]
13999 1 root root 0 0 0 11:46.65 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_119]
14000 1 root root 0 0 0 11:36.75 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_120]
14001 1 root root 0 0 0 11:40.56 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_121]
14002 1 root root 0 0 0 11:57.31 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_122]
14003 1 root root 0 0 0 11:51.63 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_123]
14004 1 root root 0 0 0 11:52.47 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_124]
14005 1 root root 0 0 0 11:39.01 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_125]
14006 1 root root 0 0 0 11:52.31 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_126]
14007 1 root root 0 0 0 11:51.54 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_127]
14054 1 root root 0 0 0 12:09.35 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_03]
14057 1 root root 0 0 0 12:21.68 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_04]
15065 1 root root 0 0 0 2:37.42 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_02]
15066 1 root root 0 0 0 2:32.64 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_52]
15067 1 root root 0 0 0 2:33.00 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_53]
15068 1 root root 0 0 0 2:31.54 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_54]
15069 1 root root 0 0 0 0:38.98 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_02]
15070 1 root root 0 0 0 2:33.50 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_55]
15071 1 root root 0 0 0 2:37.55 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_03]
15072 1 root root 0 0 0 2:32.37 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_56]
15073 1 root root 0 0 0 2:32.40 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_57]
15074 1 root root 0 0 0 2:32.07 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_58]
15075 1 root root 0 0 0 2:31.58 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_59]
15076 1 root root 0 0 0 2:32.82 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_60]
15077 1 root root 0 0 0 2:32.37 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_61]
15078 1 root root 0 0 0 2:31.93 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_62]
15079 1 root root 0 0 0 0:39.16 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_03]
15080 1 root root 0 0 0 2:32.24 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_63]
15081 1 root root 0 0 0 2:33.50 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_64]
15082 1 root root 0 0 0 2:32.71 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_65]
15083 1 root root 0 0 0 2:30.98 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_66]
15084 1 root root 0 0 0 2:33.64 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_67]
15085 1 root root 0 0 0 2:32.74 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_68]
15086 1 root root 0 0 0 2:31.97 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_69]
15087 1 root root 0 0 0 2:37.35 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_04]
15088 1 root root 0 0 0 2:33.33 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_70]
15089 1 root root 0 0 0 2:32.89 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_71]
15090 1 root root 0 0 0 2:32.82 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_72]
15091 1 root root 0 0 0 2:32.19 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_73]
15092 1 root root 0 0 0 2:31.64 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_74]
15093 1 root root 0 0 0 2:32.52 0.0 0.0 16 0 S 0 0 0 0 [ldlm_cn_75]
15094 1 root root 0 0 0 2:33.08 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_76]
15095 1 root root 0 0 0 2:32.03 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_77]
15096 1 root root 0 0 0 2:32.22 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_78]
15097 1 root root 0 0 0 2:32.80 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_79]
15170 1 root root 0 0 0 0:39.35 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_04]
15173 1 root root 0 0 0 0:39.31 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_05]
15216 1 root root 0 0 0 0:39.05 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_06]
15217 1 root root 0 0 0 0:39.37 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_07]
15220 1 root root 0 0 0 0:39.05 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_08]
15233 1 root root 0 0 0 0:38.79 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_09]
15255 1 root root 0 0 0 0:38.95 0.0 0.0 16 0 S 0 0 0 0 [ldlm_cb_10]
15256 1 root root 0 0 0 0:39.22 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_11]
15264 1 root root 0 0 0 0:38.56 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_12]
15265 1 root root 0 0 0 0:38.99 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_13]
15273 1 root root 0 0 0 0:39.16 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_14]
15274 1 root root 0 0 0 0:39.03 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_15]
15282 1 root root 0 0 0 0:39.02 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_16]
15290 1 root root 0 0 0 0:39.28 0.0 0.0 16 0 S 0 0 0 0 [ldlm_cb_17]
15298 1 root root 0 0 0 0:38.99 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_18]
15299 1 root root 0 0 0 0:38.55 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_19]
15300 1 root root 0 0 0 0:39.28 0.0 0.0 16 0 S 0 0 0 0 [ldlm_cb_20]
15309 1 root root 0 0 0 0:39.43 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_21]
15316 1 root root 0 0 0 0:38.82 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_22]
15317 1 root root 0 0 0 0:39.48 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_23]
15318 1 root root 0 0 0 0:39.41 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_24]
15319 1 root root 0 0 0 0:39.58 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_25]
15327 1 root root 0 0 0 0:39.47 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_26]
15328 1 root root 0 0 0 0:39.31 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_27]
15336 1 root root 0 0 0 0:38.77 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_28]
15337 1 root root 0 0 0 0:39.44 0.0 0.0 16 0 S 0 0 0 0 [ldlm_cb_29]
15338 1 root root 0 0 0 0:38.28 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_30]
15339 1 root root 0 0 0 0:39.39 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_31]
15340 1 root root 0 0 0 0:39.22 0.0 0.0 16 0 S 0 0 0 0 [ldlm_cb_32]
15344 1 root root 0 0 0 0:39.17 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_33]
15351 1 root root 0 0 0 0:39.42 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_34]
15352 1 root root 0 0 0 0:38.93 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_35]
15353 1 root root 0 0 0 0:39.46 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_36]
15354 1 root root 0 0 0 0:38.91 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_37]
15355 1 root root 0 0 0 0:38.91 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_38]
15356 1 root root 0 0 0 0:38.96 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_39]
15357 1 root root 0 0 0 0:39.26 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_40]
15358 1 root root 0 0 0 0:38.86 0.0 0.0 16 0 S 0 0 0 0 [ldlm_cb_41]
15359 1 root root 0 0 0 0:38.98 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_42]
15360 1 root root 0 0 0 0:38.63 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_43]
15361 1 root root 0 0 0 0:38.92 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_44]
15362 1 root root 0 0 0 0:39.10 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_45]
15363 1 root root 0 0 0 0:39.15 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_46]
15364 1 root root 0 0 0 0:39.05 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_47]
15365 1 root root 0 0 0 0:39.47 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_48]
15366 1 root root 0 0 0 0:38.77 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_49]
15367 1 root root 0 0 0 12:04.86 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_05]
15368 1 root root 0 0 0 0:38.76 0.0 0.0 16 0 S 0 0 0 0 [ldlm_cb_50]
15369 1 root root 0 0 0 0:38.69 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_51]
15370 1 root root 0 0 0 0:39.28 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_52]
15371 1 root root 0 0 0 0:38.54 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_53]
15372 1 root root 0 0 0 0:38.98 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_54]
15373 1 root root 0 0 0 0:38.81 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_55]
15374 1 root root 0 0 0 0:38.78 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_56]
15375 1 root root 0 0 0 0:39.27 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_57]
15376 1 root root 0 0 0 0:39.03 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_58]
15377 1 root root 0 0 0 0:39.30 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_59]
15378 1 root root 0 0 0 0:39.34 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_60]
15379 1 root root 0 0 0 0:39.21 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_61]
15380 1 root root 0 0 0 0:38.91 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_62]
15381 1 root root 0 0 0 0:39.03 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_63]
15382 1 root root 0 0 0 0:38.94 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_64]
15383 1 root root 0 0 0 0:38.95 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_65]
15384 1 root root 0 0 0 0:39.34 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_66]
15385 1 root root 0 0 0 0:39.24 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_67]
15386 1 root root 0 0 0 0:39.39 0.0 0.0 16 0 S 0 0 0 0 [ldlm_cb_68]
15387 1 root root 0 0 0 0:39.33 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_69]
15388 1 root root 0 0 0 0:39.10 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_70]
15391 1 root root 0 0 0 0:38.77 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_71]
15392 1 root root 0 0 0 0:39.08 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_72]
15393 1 root root 0 0 0 0:39.20 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_73]
15394 1 root root 0 0 0 0:38.98 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_74]
15395 1 root root 0 0 0 0:39.12 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_75]
15396 1 root root 0 0 0 0:38.79 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_76]
15397 1 root root 0 0 0 0:38.88 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_77]
15398 1 root root 0 0 0 0:38.95 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_78]
15399 1 root root 0 0 0 0:39.24 0.0 0.0 16 0 S 0 0 0 0 [ldlm_cb_79]
15400 1 root root 0 0 0 0:39.61 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_80]
15401 1 root root 0 0 0 0:38.65 0.0 0.0 16 0 S 0 0 0 0 [ldlm_cb_81]
15402 1 root root 0 0 0 0:39.19 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_82]
15403 1 root root 0 0 0 0:39.04 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_83]
15404 1 root root 0 0 0 0:39.24 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_84]
15405 1 root root 0 0 0 0:38.91 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_85]
15406 1 root root 0 0 0 0:39.11 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_86]
15407 1 root root 0 0 0 0:38.94 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_87]
15408 1 root root 0 0 0 0:38.92 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_88]
15409 1 root root 0 0 0 0:39.10 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_89]
15410 1 root root 0 0 0 0:39.08 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_90]
15411 1 root root 0 0 0 0:38.68 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_91]
15412 1 root root 0 0 0 0:38.86 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_92]
15413 1 root root 0 0 0 0:39.75 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_93]
15414 1 root root 0 0 0 0:38.91 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_94]
15415 1 root root 0 0 0 0:39.11 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_95]
15416 1 root root 0 0 0 0:39.52 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_96]
15417 1 root root 0 0 0 0:38.84 0.0 0.0 16 0 S 0 0 0 0 [ldlm_cb_97]
15418 1 root root 0 0 0 0:39.17 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_98]
15419 1 root root 0 0 0 0:38.67 0.0 0.0 16 0 S 0 0 0 0 [ldlm_cb_99]
15420 1 root root 0 0 0 0:38.82 0.0 0.0 16 0 S 0 0 0 0 [ldlm_cb_100]
15421 1 root root 0 0 0 0:39.04 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_101]
15422 1 root root 0 0 0 0:38.92 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_102]
15423 1 root root 0 0 0 0:39.13 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_103]
15424 1 root root 0 0 0 0:38.60 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_104]
15425 1 root root 0 0 0 0:39.69 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_105]
15426 1 root root 0 0 0 0:38.93 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_106]
15427 1 root root 0 0 0 0:39.10 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_107]
15428 1 root root 0 0 0 0:39.57 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_108]
15429 1 root root 0 0 0 0:39.24 0.0 0.0 16 0 S 0 0 0 0 [ldlm_cb_109]
15430 1 root root 0 0 0 0:38.60 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_110]
15431 1 root root 0 0 0 0:39.25 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_111]
15432 1 root root 0 0 0 0:39.02 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_112]
15433 1 root root 0 0 0 0:39.00 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_113]
15434 1 root root 0 0 0 0:38.85 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_114]
15435 1 root root 0 0 0 0:39.68 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_115]
15436 1 root root 0 0 0 0:38.74 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_116]
15437 1 root root 0 0 0 0:39.09 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_117]
15438 1 root root 0 0 0 0:39.37 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_118]
15439 1 root root 0 0 0 0:38.72 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_119]
15440 1 root root 0 0 0 0:39.28 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_120]
15441 1 root root 0 0 0 0:38.88 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_121]
15442 1 root root 0 0 0 0:38.63 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_122]
15443 1 root root 0 0 0 0:39.11 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_123]
15444 1 root root 0 0 0 0:38.83 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_124]
15445 1 root root 0 0 0 0:39.54 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_125]
15446 1 root root 0 0 0 0:38.99 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_126]
15447 1 root root 0 0 0 0:39.31 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cb_127]
15448 1 root root 0 0 0 12:18.16 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_06]
15449 1 root root 0 0 0 12:03.03 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_07]
15450 1 root root 0 0 0 12:19.51 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_08]
15451 1 root root 0 0 0 11:59.04 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_09]
15452 1 root root 0 0 0 12:24.88 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_10]
15453 1 root root 0 0 0 12:24.21 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_11]
15454 1 root root 0 0 0 12:12.30 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_12]
15455 1 root root 0 0 0 12:05.84 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_13]
15456 1 root root 0 0 0 12:07.42 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_14]
15457 1 root root 0 0 0 12:20.64 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_15]
15458 1 root root 0 0 0 11:59.61 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_16]
15459 1 root root 0 0 0 12:24.56 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_17]
15460 1 root root 0 0 0 12:11.99 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_18]
15461 1 root root 0 0 0 12:03.11 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_19]
15462 1 root root 0 0 0 12:02.75 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_20]
15463 1 root root 0 0 0 12:14.23 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_21]
15464 1 root root 0 0 0 12:11.96 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_22]
15465 1 root root 0 0 0 12:05.89 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_23]
15466 1 root root 0 0 0 11:57.26 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_24]
15467 1 root root 0 0 0 12:01.41 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_25]
15468 1 root root 0 0 0 12:39.43 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_26]
15469 1 root root 0 0 0 12:11.11 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_27]
15470 1 root root 0 0 0 12:05.67 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_28]
15471 1 root root 0 0 0 12:08.50 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_29]
15472 1 root root 0 0 0 12:03.27 1.9 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_30]
15473 1 root root 0 0 0 12:02.18 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_31]
15474 1 root root 0 0 0 12:17.59 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_32]
15475 1 root root 0 0 0 12:00.25 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_33]
15476 1 root root 0 0 0 12:01.19 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_34]
15477 1 root root 0 0 0 12:17.48 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_35]
15478 1 root root 0 0 0 12:04.58 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_36]
15479 1 root root 0 0 0 12:18.67 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_37]
15480 1 root root 0 0 0 12:04.30 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_38]
15481 1 root root 0 0 0 12:17.32 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_39]
15482 1 root root 0 0 0 11:59.85 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_40]
15483 1 root root 0 0 0 12:13.75 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_41]
15484 1 root root 0 0 0 12:04.56 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_42]
15485 1 root root 0 0 0 12:20.90 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_43]
15486 1 root root 0 0 0 12:23.31 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_44]
15487 1 root root 0 0 0 12:09.20 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_45]
15488 1 root root 0 0 0 12:01.12 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_46]
15489 1 root root 0 0 0 12:11.32 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_47]
15490 1 root root 0 0 0 12:26.00 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_48]
15491 1 root root 0 0 0 12:12.24 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_49]
15492 1 root root 0 0 0 11:51.79 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_50]
15493 1 root root 0 0 0 12:01.42 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_51]
15494 1 root root 0 0 0 12:26.71 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_52]
15495 1 root root 0 0 0 12:10.02 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_53]
15496 1 root root 0 0 0 12:15.90 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_54]
15497 1 root root 0 0 0 12:13.89 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_55]
15498 1 root root 0 0 0 12:08.74 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_56]
15499 1 root root 0 0 0 12:14.16 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_57]
15500 1 root root 0 0 0 12:09.13 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_58]
15501 1 root root 0 0 0 12:21.27 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_59]
15502 1 root root 0 0 0 12:01.64 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_60]
15503 1 root root 0 0 0 12:01.87 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_61]
15504 1 root root 0 0 0 12:16.72 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_62]
15505 1 root root 0 0 0 12:08.68 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_63]
15506 1 root root 0 0 0 12:16.97 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_64]
15507 1 root root 0 0 0 12:14.01 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_65]
15508 1 root root 0 0 0 11:58.59 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_66]
15509 1 root root 0 0 0 11:58.93 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_67]
15510 1 root root 0 0 0 12:08.14 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_68]
15511 1 root root 0 0 0 12:14.02 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_69]
15512 1 root root 0 0 0 12:06.63 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_70]
15513 1 root root 0 0 0 12:07.03 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_71]
15514 1 root root 0 0 0 11:55.98 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_72]
15515 1 root root 0 0 0 12:08.63 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_73]
15516 1 root root 0 0 0 12:31.62 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_74]
15517 1 root root 0 0 0 12:12.43 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_75]
15518 1 root root 0 0 0 12:09.26 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_76]
15519 1 root root 0 0 0 12:13.90 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_77]
15520 1 root root 0 0 0 12:31.94 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_78]
15521 1 root root 0 0 0 12:27.58 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_79]
15522 1 root root 0 0 0 12:07.83 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_80]
15523 1 root root 0 0 0 12:02.84 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_81]
15524 1 root root 0 0 0 12:12.57 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_82]
15525 1 root root 0 0 0 12:04.76 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_83]
15526 1 root root 0 0 0 12:00.87 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_84]
15527 1 root root 0 0 0 12:05.99 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_85]
15528 1 root root 0 0 0 12:25.12 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_86]
15529 1 root root 0 0 0 12:32.04 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_87]
15530 1 root root 0 0 0 12:16.08 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_88]
15531 1 root root 0 0 0 12:16.76 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_89]
15532 1 root root 0 0 0 11:54.66 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_90]
15533 1 root root 0 0 0 12:11.21 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_91]
15546 1 root root 0 0 0 2:36.12 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_05]
15547 1 root root 0 0 0 2:36.60 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_06]
15548 1 root root 0 0 0 2:37.43 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_07]
15555 1 root root 0 0 0 2:37.55 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_08]
15556 1 root root 0 0 0 2:36.43 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_09]
15557 1 root root 0 0 0 2:38.00 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_10]
15560 1 root root 0 0 0 1:09.34 0.0 0.0 15 0 S 0 0 0 0 [ldlm_bl_02]
15561 1 root root 0 0 0 2:38.40 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_11]
15562 1 root root 0 0 0 2:37.75 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_12]
15563 1 root root 0 0 0 2:38.05 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_13]
15564 1 root root 0 0 0 2:38.06 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_14]
15680 1 root root 0 0 0 2:37.31 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_15]
15681 1 root root 0 0 0 2:37.19 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_16]
15682 1 root root 0 0 0 2:36.40 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_17]
15683 1 root root 0 0 0 2:38.04 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_18]
15684 1 root root 0 0 0 2:36.90 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_19]
15685 1 root root 0 0 0 2:37.58 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_20]
15686 1 root root 0 0 0 2:37.24 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_21]
15687 1 root root 0 0 0 2:36.50 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_22]
15688 1 root root 0 0 0 2:37.10 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_23]
15689 1 root root 0 0 0 2:37.05 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_24]
15690 1 root root 0 0 0 2:37.85 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_25]
15691 1 root root 0 0 0 2:36.72 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_26]
15692 1 root root 0 0 0 2:37.71 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_27]
15693 1 root root 0 0 0 2:37.40 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_28]
15694 1 root root 0 0 0 2:37.31 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_29]
15695 1 root root 0 0 0 2:36.55 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_30]
16947 1 root root 0 0 0 1:04.50 0.0 0.0 15 0 S 0 0 0 0 [ldlm_bl_04]
17738 1 root root 0 0 0 1:09.11 0.0 0.0 15 0 S 0 0 0 0 [ldlm_bl_03]
19545 1 root root 0 0 0 3:27.27 0.0 0.0 15 0 S 0 0 0 0 [ll_mgs_08]
19546 1 root root 0 0 0 3:27.64 0.0 0.0 15 0 S 0 0 0 0 [ll_mgs_09]
19659 1 root root 0 0 0 2:25.74 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_80]
19660 1 root root 0 0 0 2:25.85 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_81]
19661 1 root root 0 0 0 2:25.66 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_82]
19662 1 root root 0 0 0 2:26.24 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_83]
19663 1 root root 0 0 0 2:26.08 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_84]
19664 1 root root 0 0 0 2:25.81 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_85]
19665 1 root root 0 0 0 2:25.46 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_86]
19666 1 root root 0 0 0 2:25.91 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_87]
19667 1 root root 0 0 0 2:25.02 0.0 0.0 16 0 S 0 0 0 0 [ldlm_cn_88]
19668 1 root root 0 0 0 2:24.77 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_89]
19669 1 root root 0 0 0 2:25.93 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_90]
19670 1 root root 0 0 0 2:25.40 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_91]
19671 1 root root 0 0 0 2:25.34 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_92]
19672 1 root root 0 0 0 2:26.28 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_93]
19673 1 root root 0 0 0 2:26.28 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_94]
19674 1 root root 0 0 0 2:25.56 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_95]
19675 1 root root 0 0 0 2:25.74 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_96]
19676 1 root root 0 0 0 2:25.36 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_97]
19677 1 root root 0 0 0 2:26.34 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_98]
19678 1 root root 0 0 0 2:25.61 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_99]
19679 1 root root 0 0 0 2:25.78 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_100]
19680 1 root root 0 0 0 2:26.13 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_101]
19681 1 root root 0 0 0 2:26.04 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_102]
19682 1 root root 0 0 0 2:26.53 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_103]
19683 1 root root 0 0 0 2:26.20 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_104]
19684 1 root root 0 0 0 2:25.71 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_105]
19685 1 root root 0 0 0 2:25.05 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_106]
19686 1 root root 0 0 0 2:25.85 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_107]
19687 1 root root 0 0 0 2:26.59 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_108]
19688 1 root root 0 0 0 2:26.02 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_109]
19689 1 root root 0 0 0 2:26.76 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_110]
19690 1 root root 0 0 0 2:26.04 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_111]
19691 1 root root 0 0 0 2:26.06 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_112]
19692 1 root root 0 0 0 2:27.17 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_113]
19693 1 root root 0 0 0 2:26.39 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_114]
19694 1 root root 0 0 0 2:25.43 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_115]
19695 1 root root 0 0 0 2:26.43 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_116]
19696 1 root root 0 0 0 2:26.03 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_117]
19697 1 root root 0 0 0 2:26.65 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_118]
19698 1 root root 0 0 0 2:25.79 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_119]
19699 1 root root 0 0 0 2:25.89 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_120]
19700 1 root root 0 0 0 2:24.90 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_121]
19701 1 root root 0 0 0 2:26.51 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_122]
19702 1 root root 0 0 0 2:26.05 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_123]
19703 1 root root 0 0 0 2:27.19 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_124]
19704 1 root root 0 0 0 2:25.43 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_125]
19705 1 root root 0 0 0 2:25.36 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_126]
19706 1 root root 0 0 0 2:25.76 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_127]
21019 1 root root 0 0 0 2:36.35 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_31]
21020 1 root root 0 0 0 2:36.54 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_32]
21021 1 root root 0 0 0 2:36.38 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_33]
21022 1 root root 0 0 0 2:36.97 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_34]
21023 1 root root 0 0 0 2:36.23 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_35]
21024 1 root root 0 0 0 2:36.53 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_36]
24880 1 root root 0 0 0 12:19.64 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_92]
24881 1 root root 0 0 0 12:13.47 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_93]
24882 1 root root 0 0 0 11:57.13 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_94]
24893 1 root root 0 0 0 12:12.29 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_95]
24894 1 root root 0 0 0 12:07.90 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_96]
24895 1 root root 0 0 0 12:08.03 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_97]
24896 1 root root 0 0 0 11:59.86 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_98]
24897 1 root root 0 0 0 11:58.93 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_99]
24898 1 root root 0 0 0 12:10.89 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_100]
24899 1 root root 0 0 0 11:59.60 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_101]
24900 1 root root 0 0 0 11:54.85 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_102]
24901 1 root root 0 0 0 12:11.66 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_103]
24902 1 root root 0 0 0 12:09.57 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_104]
24903 1 root root 0 0 0 11:51.16 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_105]
24904 1 root root 0 0 0 12:25.44 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_106]
24905 1 root root 0 0 0 11:51.83 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_107]
24906 1 root root 0 0 0 12:22.67 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_108]
25680 1 root root 0 0 0 0:00.23 0.0 0.0 15 0 S 0 0 0 0 [obd_zombid]
25735 1 root root 0 0 0 305:51.16 0.0 0.0 15 0 S 0 0 0 0 [kiblnd_sd_00]
25736 1 root root 0 0 0 305:48.02 0.0 0.0 15 0 S 0 0 0 0 [kiblnd_sd_01]
25737 1 root root 0 0 0 306:10.72 1.9 0.0 15 0 S 0 0 0 0 [kiblnd_sd_02]
25738 1 root root 0 0 0 305:47.83 1.9 0.0 15 0 S 0 0 0 0 [kiblnd_sd_03]
25739 1 root root 0 0 0 305:54.91 0.0 0.0 15 0 S 0 0 0 0 [kiblnd_sd_04]
25740 1 root root 0 0 0 306:22.09 0.0 0.0 15 0 S 0 0 0 0 [kiblnd_sd_05]
25741 1 root root 0 0 0 305:58.01 0.0 0.0 15 0 S 0 0 0 0 [kiblnd_sd_06]
25742 1 root root 0 0 0 305:59.94 1.9 0.0 15 0 S 0 0 0 0 [kiblnd_sd_07]
25743 1 root root 0 0 0 305:51.31 0.0 0.0 15 0 S 0 0 0 0 [kiblnd_sd_08]
25744 1 root root 0 0 0 306:08.65 1.9 0.0 15 0 S 0 0 0 0 [kiblnd_sd_09]
25745 1 root root 0 0 0 306:16.06 0.0 0.0 15 0 S 0 0 0 0 [kiblnd_sd_10]
25746 1 root root 0 0 0 305:53.10 0.0 0.0 15 0 S 0 0 0 0 [kiblnd_sd_11]
25747 1 root root 0 0 0 305:50.60 1.9 0.0 15 0 S 0 0 0 0 [kiblnd_sd_12]
25748 1 root root 0 0 0 306:12.94 1.9 0.0 15 0 S 0 0 0 0 [kiblnd_sd_13]
25749 1 root root 0 0 0 306:25.14 1.9 0.0 15 0 S 0 0 0 0 [kiblnd_sd_14]
25750 1 root root 0 0 0 306:13.01 1.9 0.0 15 0 S 0 0 0 0 [kiblnd_sd_15]
25751 1 root root 0 0 0 306:14.12 0.0 0.0 15 0 S 0 0 0 0 [kiblnd_sd_16]
25752 1 root root 0 0 0 306:01.96 0.0 0.0 15 0 S 0 0 0 0 [kiblnd_sd_17]
25753 1 root root 0 0 0 306:43.51 1.9 0.0 15 0 S 0 0 0 0 [kiblnd_sd_18]
25754 1 root root 0 0 0 305:54.65 0.0 0.0 15 0 S 0 0 0 0 [kiblnd_sd_19]
25755 1 root root 0 0 0 305:32.88 1.9 0.0 15 0 S 0 0 0 0 [kiblnd_sd_20]
25756 1 root root 0 0 0 306:15.45 0.0 0.0 15 0 S 0 0 0 0 [kiblnd_sd_21]
25757 1 root root 0 0 0 306:13.40 0.0 0.0 15 0 S 0 0 0 0 [kiblnd_sd_22]
25758 1 root root 0 0 0 306:28.29 0.0 0.0 15 0 S 0 0 0 0 [kiblnd_sd_23]
25759 1 root root 0 0 0 0:01.53 0.0 0.0 15 0 S 0 0 0 0 [kiblnd_connd]
25760 1 root root 0 0 0 20:18.08 0.0 0.0 15 0 S 0 0 0 0 [ptlrpcd]
25761 1 root root 0 0 0 0:00.08 0.0 0.0 15 0 S 0 0 0 0 [ptlrpcd-recov]
25762 1 root root 0 0 0 0:04.80 0.0 0.0 15 0 S 0 0 0 0 [ll_ping]
27630 683 root root 0 0 0 0:00.00 0.0 0.0 10 -5 S 0 0 0 0 [kjournald]
27860 9069 root root 0 0 0 0:00.00 0.0 0.0 20 0 Z 0 0 0 0 [sh] <defunct>
28401 1 root root 0 0 0 0:50.48 0.0 0.0 15 0 S 0 0 0 0 [ldlm_bl_05]
28402 1 root root 0 0 0 0:51.13 0.0 0.0 15 0 S 0 0 0 0 [ldlm_bl_06]
29265 1 root root 0 0 0 12:01.85 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_109]
29266 1 root root 0 0 0 11:50.98 0.0 0.0 15 0 S 0 0 0 0 [ll_mdt_rdpg_110]
29267 1 root root 0 0 0 12:00.46 0.0 0.0 16 0 S 0 0 0 0 [ll_mdt_rdpg_111]
29756 1 root root 0 0 0 2:35.29 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_37]
29757 1 root root 0 0 0 2:36.86 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_38]
29758 1 root root 0 0 0 2:36.29 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_39]
29759 1 root root 0 0 0 2:35.10 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_40]
29760 1 root root 0 0 0 2:35.06 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_41]
29761 1 root root 0 0 0 2:36.06 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_42]
29762 1 root root 0 0 0 2:37.01 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_43]
29763 1 root root 0 0 0 2:36.79 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_44]
29764 1 root root 0 0 0 2:35.82 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_45]
29765 1 root root 0 0 0 2:35.28 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_46]
29766 1 root root 0 0 0 2:35.46 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_47]
29767 1 root root 0 0 0 2:35.63 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_48]
29768 1 root root 0 0 0 2:36.50 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_49]
29769 1 root root 0 0 0 2:35.86 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_50]
29770 1 root root 0 0 0 2:36.50 0.0 0.0 15 0 S 0 0 0 0 [ldlm_cn_51]

ldlm.namespaces.scratch1-MDT0000-mdc-ffff8806202d9c00.lock_count=2772
ldlm.namespaces.scratch1-OST0000-osc-ffff8806202d9c00.lock_count=2535
ldlm.namespaces.scratch1-OST0001-osc-ffff8806202d9c00.lock_count=2773
ldlm.namespaces.scratch1-OST0002-osc-ffff8806202d9c00.lock_count=2552
ldlm.namespaces.scratch1-OST0003-osc-ffff8806202d9c00.lock_count=2562
ldlm.namespaces.scratch1-OST0004-osc-ffff8806202d9c00.lock_count=2636
ldlm.namespaces.scratch1-OST0005-osc-ffff8806202d9c00.lock_count=2552
ldlm.namespaces.scratch1-OST0006-osc-ffff8806202d9c00.lock_count=2577
ldlm.namespaces.scratch1-OST0007-osc-ffff8806202d9c00.lock_count=2548
ldlm.namespaces.scratch1-OST0008-osc-ffff8806202d9c00.lock_count=2988
ldlm.namespaces.scratch1-OST0009-osc-ffff8806202d9c00.lock_count=2710
ldlm.namespaces.scratch1-OST000a-osc-ffff8806202d9c00.lock_count=2818
ldlm.namespaces.scratch1-OST000b-osc-ffff8806202d9c00.lock_count=2539
ldlm.namespaces.scratch1-OST000c-osc-ffff8806202d9c00.lock_count=2795
ldlm.namespaces.scratch1-OST000d-osc-ffff8806202d9c00.lock_count=2562
ldlm.namespaces.scratch1-OST000e-osc-ffff8806202d9c00.lock_count=2611
ldlm.namespaces.scratch1-OST000f-osc-ffff8806202d9c00.lock_count=2765
ldlm.namespaces.scratch1-OST0010-osc-ffff8806202d9c00.lock_count=2575
ldlm.namespaces.scratch1-OST0011-osc-ffff8806202d9c00.lock_count=2558
ldlm.namespaces.scratch1-OST0012-osc-ffff8806202d9c00.lock_count=2549
ldlm.namespaces.scratch1-OST0013-osc-ffff8806202d9c00.lock_count=2990
ldlm.namespaces.scratch1-OST0014-osc-ffff8806202d9c00.lock_count=2541
ldlm.namespaces.scratch1-OST0015-osc-ffff8806202d9c00.lock_count=2822
ldlm.namespaces.scratch1-OST0016-osc-ffff8806202d9c00.lock_count=2527
ldlm.namespaces.scratch1-OST0017-osc-ffff8806202d9c00.lock_count=2799
ldlm.namespaces.scratch1-OST0018-osc-ffff8806202d9c00.lock_count=2564
ldlm.namespaces.scratch1-OST0019-osc-ffff8806202d9c00.lock_count=2617
ldlm.namespaces.scratch1-OST001a-osc-ffff8806202d9c00.lock_count=2786
ldlm.namespaces.scratch1-OST001b-osc-ffff8806202d9c00.lock_count=2578
ldlm.namespaces.scratch1-OST001c-osc-ffff8806202d9c00.lock_count=2553
ldlm.namespaces.scratch1-OST001d-osc-ffff8806202d9c00.lock_count=2552
ldlm.namespaces.scratch1-OST001e-osc-ffff8806202d9c00.lock_count=2823
ldlm.namespaces.scratch1-OST001f-osc-ffff8806202d9c00.lock_count=2702
ldlm.namespaces.scratch1-OST0020-osc-ffff8806202d9c00.lock_count=2814
ldlm.namespaces.scratch1-OST0021-osc-ffff8806202d9c00.lock_count=2541
ldlm.namespaces.scratch1-OST0022-osc-ffff8806202d9c00.lock_count=2802
ldlm.namespaces.scratch1-OST0023-osc-ffff8806202d9c00.lock_count=2556
ldlm.namespaces.scratch1-OST0024-osc-ffff8806202d9c00.lock_count=2626
ldlm.namespaces.scratch1-OST0025-osc-ffff8806202d9c00.lock_count=2796
ldlm.namespaces.scratch1-OST0026-osc-ffff8806202d9c00.lock_count=2596
ldlm.namespaces.scratch1-OST0027-osc-ffff8806202d9c00.lock_count=2548
ldlm.namespaces.scratch1-OST0028-osc-ffff8806202d9c00.lock_count=2571
ldlm.namespaces.scratch1-OST0029-osc-ffff8806202d9c00.lock_count=2818
ldlm.namespaces.scratch1-OST002a-osc-ffff8806202d9c00.lock_count=2718
ldlm.namespaces.scratch1-OST002b-osc-ffff8806202d9c00.lock_count=3055
ldlm.namespaces.scratch1-OST002c-osc-ffff8806202d9c00.lock_count=2545
ldlm.namespaces.scratch1-OST002d-osc-ffff8806202d9c00.lock_count=2557
ldlm.namespaces.scratch1-OST002e-osc-ffff8806202d9c00.lock_count=2566
ldlm.namespaces.scratch1-OST002f-osc-ffff8806202d9c00.lock_count=2613
ldlm.namespaces.scratch1-OST0030-osc-ffff8806202d9c00.lock_count=2774
ldlm.namespaces.scratch1-OST0031-osc-ffff8806202d9c00.lock_count=2582
ldlm.namespaces.scratch1-OST0032-osc-ffff8806202d9c00.lock_count=2554
ldlm.namespaces.scratch1-OST0033-osc-ffff8806202d9c00.lock_count=2605
ldlm.namespaces.scratch1-OST0034-osc-ffff8806202d9c00.lock_count=2555
ldlm.namespaces.scratch1-OST0035-osc-ffff8806202d9c00.lock_count=2736
ldlm.namespaces.scratch1-OST0036-osc-ffff8806202d9c00.lock_count=2807
ldlm.namespaces.scratch1-OST0037-osc-ffff8806202d9c00.lock_count=2542
ldlm.namespaces.scratch1-OST0038-osc-ffff8806202d9c00.lock_count=2550
ldlm.namespaces.scratch1-OST0039-osc-ffff8806202d9c00.lock_count=2570
ldlm.namespaces.scratch1-OST003a-osc-ffff8806202d9c00.lock_count=2569
ldlm.namespaces.scratch1-OST003b-osc-ffff8806202d9c00.lock_count=2797
ldlm.namespaces.scratch1-OST003c-osc-ffff8806202d9c00.lock_count=2586
ldlm.namespaces.scratch1-OST003d-osc-ffff8806202d9c00.lock_count=2562
ldlm.namespaces.scratch1-OST003e-osc-ffff8806202d9c00.lock_count=2593
ldlm.namespaces.scratch1-OST003f-osc-ffff8806202d9c00.lock_count=2554
ldlm.namespaces.scratch1-OST0040-osc-ffff8806202d9c00.lock_count=2753
ldlm.namespaces.scratch1-OST0041-osc-ffff8806202d9c00.lock_count=2799
ldlm.namespaces.scratch1-OST0042-osc-ffff8806202d9c00.lock_count=2548
ldlm.namespaces.scratch1-OST0043-osc-ffff8806202d9c00.lock_count=2546
ldlm.namespaces.scratch1-OST0044-osc-ffff8806202d9c00.lock_count=2560
ldlm.namespaces.scratch1-OST0045-osc-ffff8806202d9c00.lock_count=2569
ldlm.namespaces.scratch1-OST0046-osc-ffff8806202d9c00.lock_count=2805
ldlm.namespaces.scratch1-OST0047-osc-ffff8806202d9c00.lock_count=2591
ldlm.namespaces.scratch1-OST0048-osc-ffff8806202d9c00.lock_count=2571
ldlm.namespaces.scratch1-OST0049-osc-ffff8806202d9c00.lock_count=2607
ldlm.namespaces.scratch1-OST004a-osc-ffff8806202d9c00.lock_count=2561
ldlm.namespaces.scratch1-OST004b-osc-ffff8806202d9c00.lock_count=2871
ldlm.namespaces.scratch1-OST004c-osc-ffff8806202d9c00.lock_count=2837
ldlm.namespaces.scratch1-OST004d-osc-ffff8806202d9c00.lock_count=2566
ldlm.namespaces.scratch1-OST004e-osc-ffff8806202d9c00.lock_count=2561
ldlm.namespaces.scratch1-OST004f-osc-ffff8806202d9c00.lock_count=2561
ldlm.namespaces.scratch1-OST0050-osc-ffff8806202d9c00.lock_count=2554
ldlm.namespaces.scratch1-OST0051-osc-ffff8806202d9c00.lock_count=2805
ldlm.namespaces.scratch1-OST0052-osc-ffff8806202d9c00.lock_count=2583
ldlm.namespaces.scratch1-OST0053-osc-ffff8806202d9c00.lock_count=2573
ldlm.namespaces.scratch1-OST0054-osc-ffff8806202d9c00.lock_count=2585
ldlm.namespaces.scratch1-OST0055-osc-ffff8806202d9c00.lock_count=2565
ldlm.namespaces.scratch1-OST0056-osc-ffff8806202d9c00.lock_count=2881
ldlm.namespaces.scratch1-OST0057-osc-ffff8806202d9c00.lock_count=2577
ldlm.namespaces.scratch1-OST0058-osc-ffff8806202d9c00.lock_count=2576
ldlm.namespaces.scratch1-OST0059-osc-ffff8806202d9c00.lock_count=2585
ldlm.namespaces.scratch1-OST005a-osc-ffff8806202d9c00.lock_count=2558
ldlm.namespaces.scratch1-OST005b-osc-ffff8806202d9c00.lock_count=2559
ldlm.namespaces.scratch1-OST005c-osc-ffff8806202d9c00.lock_count=2784
ldlm.namespaces.scratch1-OST005d-osc-ffff8806202d9c00.lock_count=2586
ldlm.namespaces.scratch1-OST005e-osc-ffff8806202d9c00.lock_count=2569
ldlm.namespaces.scratch1-OST005f-osc-ffff8806202d9c00.lock_count=2569
ldlm.namespaces.scratch1-OST0060-osc-ffff8806202d9c00.lock_count=2564
ldlm.namespaces.scratch1-OST0061-osc-ffff8806202d9c00.lock_count=2902
ldlm.namespaces.scratch1-OST0062-osc-ffff8806202d9c00.lock_count=2819
ldlm.namespaces.scratch1-OST0063-osc-ffff8806202d9c00.lock_count=2571
ldlm.namespaces.scratch1-OST0064-osc-ffff8806202d9c00.lock_count=2561
ldlm.namespaces.scratch1-OST0065-osc-ffff8806202d9c00.lock_count=2575
ldlm.namespaces.scratch1-OST0066-osc-ffff8806202d9c00.lock_count=2754
ldlm.namespaces.scratch1-OST0067-osc-ffff8806202d9c00.lock_count=2527
ldlm.namespaces.scratch1-OST0068-osc-ffff8806202d9c00.lock_count=2595
ldlm.namespaces.scratch1-OST0069-osc-ffff8806202d9c00.lock_count=2577
ldlm.namespaces.scratch1-OST006a-osc-ffff8806202d9c00.lock_count=2576
ldlm.namespaces.scratch1-OST006b-osc-ffff8806202d9c00.lock_count=2575
ldlm.namespaces.scratch1-OST006c-osc-ffff8806202d9c00.lock_count=2897
ldlm.namespaces.scratch1-OST006d-osc-ffff8806202d9c00.lock_count=2827
ldlm.namespaces.scratch1-OST006e-osc-ffff8806202d9c00.lock_count=2567
ldlm.namespaces.scratch1-OST006f-osc-ffff8806202d9c00.lock_count=2619
ldlm.namespaces.scratch1-OST0070-osc-ffff8806202d9c00.lock_count=2570
ldlm.namespaces.scratch1-OST0071-osc-ffff8806202d9c00.lock_count=2908
ldlm.namespaces.scratch1-OST0072-osc-ffff8806202d9c00.lock_count=2520
ldlm.namespaces.scratch1-OST0073-osc-ffff8806202d9c00.lock_count=2592
ldlm.namespaces.scratch1-OST0074-osc-ffff8806202d9c00.lock_count=2561
ldlm.namespaces.scratch1-OST0075-osc-ffff8806202d9c00.lock_count=2572
ldlm.namespaces.scratch1-OST0076-osc-ffff8806202d9c00.lock_count=2559
ldlm.namespaces.scratch1-OST0077-osc-ffff8806202d9c00.lock_count=2599
ldlm.namespaces.scratch1-OST0078-osc-ffff8806202d9c00.lock_count=2795
ldlm.namespaces.scratch1-OST0079-osc-ffff8806202d9c00.lock_count=2576
ldlm.namespaces.scratch1-OST007a-osc-ffff8806202d9c00.lock_count=2601
ldlm.namespaces.scratch1-OST007b-osc-ffff8806202d9c00.lock_count=2563
ldlm.namespaces.scratch1-OST007c-osc-ffff8806202d9c00.lock_count=2909
ldlm.namespaces.scratch1-OST007d-osc-ffff8806202d9c00.lock_count=2537
ldlm.namespaces.scratch1-OST007e-osc-ffff8806202d9c00.lock_count=2583
ldlm.namespaces.scratch1-OST007f-osc-ffff8806202d9c00.lock_count=2543
ldlm.namespaces.scratch1-OST0080-osc-ffff8806202d9c00.lock_count=2577
ldlm.namespaces.scratch1-OST0081-osc-ffff8806202d9c00.lock_count=2594
ldlm.namespaces.scratch1-OST0082-osc-ffff8806202d9c00.lock_count=2591
ldlm.namespaces.scratch1-OST0083-osc-ffff8806202d9c00.lock_count=2809
ldlm.namespaces.scratch1-OST0084-osc-ffff8806202d9c00.lock_count=2578
ldlm.namespaces.scratch1-OST0085-osc-ffff8806202d9c00.lock_count=2601
ldlm.namespaces.scratch1-OST0086-osc-ffff8806202d9c00.lock_count=2560
ldlm.namespaces.scratch1-OST0087-osc-ffff8806202d9c00.lock_count=2886
ldlm.namespaces.scratch1-OST0088-osc-ffff8806202d9c00.lock_count=2559
ldlm.namespaces.scratch1-OST0089-osc-ffff8806202d9c00.lock_count=2580
ldlm.namespaces.scratch1-OST008a-osc-ffff8806202d9c00.lock_count=2529
ldlm.namespaces.scratch1-OST008b-osc-ffff8806202d9c00.lock_count=2573
ldlm.namespaces.scratch1-OST008c-osc-ffff8806202d9c00.lock_count=2590
ldlm.namespaces.scratch1-OST008d-osc-ffff8806202d9c00.lock_count=2564
ldlm.namespaces.scratch1-OST008e-osc-ffff8806202d9c00.lock_count=2571
ldlm.namespaces.scratch1-OST008f-osc-ffff8806202d9c00.lock_count=2568
ldlm.namespaces.scratch1-OST0090-osc-ffff8806202d9c00.lock_count=2610
ldlm.namespaces.scratch1-OST0091-osc-ffff8806202d9c00.lock_count=2541
ldlm.namespaces.scratch1-OST0092-osc-ffff8806202d9c00.lock_count=2763
ldlm.namespaces.scratch1-OST0093-osc-ffff8806202d9c00.lock_count=2544
ldlm.namespaces.scratch1-OST0094-osc-ffff8806202d9c00.lock_count=2561
ldlm.namespaces.scratch1-OST0095-osc-ffff8806202d9c00.lock_count=2551
ldlm.namespaces.scratch1-OST0096-osc-ffff8806202d9c00.lock_count=2577
ldlm.namespaces.scratch1-OST0097-osc-ffff8806202d9c00.lock_count=2744
ldlm.namespaces.scratch1-OST0098-osc-ffff8806202d9c00.lock_count=2549
ldlm.namespaces.scratch1-OST0099-osc-ffff8806202d9c00.lock_count=2577
ldlm.namespaces.scratch1-OST009a-osc-ffff8806202d9c00.lock_count=2556
ldlm.namespaces.scratch1-OST009b-osc-ffff8806202d9c00.lock_count=2543
ldlm.namespaces.scratch1-OST009c-osc-ffff8806202d9c00.lock_count=2545
ldlm.namespaces.scratch1-OST009d-osc-ffff8806202d9c00.lock_count=2614
ldlm.namespaces.scratch1-OST009e-osc-ffff8806202d9c00.lock_count=2550
ldlm.namespaces.scratch1-OST009f-osc-ffff8806202d9c00.lock_count=2558
ldlm.namespaces.scratch1-OST00a0-osc-ffff8806202d9c00.lock_count=2564
ldlm.namespaces.scratch1-OST00a1-osc-ffff8806202d9c00.lock_count=2731
ldlm.namespaces.scratch1-OST00a2-osc-ffff8806202d9c00.lock_count=2762
ldlm.namespaces.scratch1-OST00a3-osc-ffff8806202d9c00.lock_count=2554
ldlm.namespaces.scratch1-OST00a4-osc-ffff8806202d9c00.lock_count=2571
ldlm.namespaces.scratch1-OST00a5-osc-ffff8806202d9c00.lock_count=2532
ldlm.namespaces.scratch1-OST00a6-osc-ffff8806202d9c00.lock_count=2556
ldlm.namespaces.scratch1-OST00a7-osc-ffff8806202d9c00.lock_count=2540
ldlm.namespaces.scratch1-OST00a8-osc-ffff8806202d9c00.lock_count=2602
ldlm.namespaces.scratch1-OST00a9-osc-ffff8806202d9c00.lock_count=2542
ldlm.namespaces.scratch1-OST00aa-osc-ffff8806202d9c00.lock_count=2563
ldlm.namespaces.scratch1-OST00ab-osc-ffff8806202d9c00.lock_count=2565
ldlm.namespaces.scratch1-OST00ac-osc-ffff8806202d9c00.lock_count=2729
ldlm.namespaces.scratch1-OST00ad-osc-ffff8806202d9c00.lock_count=2705
ldlm.namespaces.scratch1-OST00ae-osc-ffff8806202d9c00.lock_count=2555
ldlm.namespaces.scratch1-OST00af-osc-ffff8806202d9c00.lock_count=2552
ldlm.namespaces.scratch2-MDT0000-mdc-ffff880429641c00.lock_count=2396
ldlm.namespaces.scratch2-OST0000-osc-ffff880429641c00.lock_count=1561
ldlm.namespaces.scratch2-OST0001-osc-ffff880429641c00.lock_count=1640
ldlm.namespaces.scratch2-OST0002-osc-ffff880429641c00.lock_count=1579
ldlm.namespaces.scratch2-OST0003-osc-ffff880429641c00.lock_count=1593
ldlm.namespaces.scratch2-OST0004-osc-ffff880429641c00.lock_count=1608
ldlm.namespaces.scratch2-OST0005-osc-ffff880429641c00.lock_count=1636
ldlm.namespaces.scratch2-OST0006-osc-ffff880429641c00.lock_count=1614
ldlm.namespaces.scratch2-OST0007-osc-ffff880429641c00.lock_count=2116
ldlm.namespaces.scratch2-OST0008-osc-ffff880429641c00.lock_count=1562
ldlm.namespaces.scratch2-OST0009-osc-ffff880429641c00.lock_count=2121
ldlm.namespaces.scratch2-OST000a-osc-ffff880429641c00.lock_count=1624
ldlm.namespaces.scratch2-OST000b-osc-ffff880429641c00.lock_count=1625
ldlm.namespaces.scratch2-OST000c-osc-ffff880429641c00.lock_count=1657
ldlm.namespaces.scratch2-OST000d-osc-ffff880429641c00.lock_count=1861
ldlm.namespaces.scratch2-OST000e-osc-ffff880429641c00.lock_count=1587
ldlm.namespaces.scratch2-OST000f-osc-ffff880429641c00.lock_count=1617
ldlm.namespaces.scratch2-OST0010-osc-ffff880429641c00.lock_count=1618
ldlm.namespaces.scratch2-OST0011-osc-ffff880429641c00.lock_count=1606
ldlm.namespaces.scratch2-OST0012-osc-ffff880429641c00.lock_count=2368
ldlm.namespaces.scratch2-OST0013-osc-ffff880429641c00.lock_count=1627
ldlm.namespaces.scratch2-OST0014-osc-ffff880429641c00.lock_count=2110
ldlm.namespaces.scratch2-OST0015-osc-ffff880429641c00.lock_count=1605
ldlm.namespaces.scratch2-OST0016-osc-ffff880429641c00.lock_count=1595
ldlm.namespaces.scratch2-OST0017-osc-ffff880429641c00.lock_count=1601
ldlm.namespaces.scratch2-OST0018-osc-ffff880429641c00.lock_count=1862
ldlm.namespaces.scratch2-OST0019-osc-ffff880429641c00.lock_count=1613
ldlm.namespaces.scratch2-OST001a-osc-ffff880429641c00.lock_count=1637
ldlm.namespaces.scratch2-OST001b-osc-ffff880429641c00.lock_count=1599
ldlm.namespaces.scratch2-OST001c-osc-ffff880429641c00.lock_count=1593
ldlm.namespaces.scratch2-OST001d-osc-ffff880429641c00.lock_count=2349
ldlm.namespaces.scratch2-OST001e-osc-ffff880429641c00.lock_count=1595
ldlm.namespaces.scratch2-OST001f-osc-ffff880429641c00.lock_count=1883
ldlm.namespaces.scratch2-OST0020-osc-ffff880429641c00.lock_count=1603
ldlm.namespaces.scratch2-OST0021-osc-ffff880429641c00.lock_count=1611
ldlm.namespaces.scratch2-OST0022-osc-ffff880429641c00.lock_count=1627
ldlm.namespaces.scratch2-OST0023-osc-ffff880429641c00.lock_count=1918
ldlm.namespaces.scratch2-OST0024-osc-ffff880429641c00.lock_count=1652
ldlm.namespaces.scratch2-OST0025-osc-ffff880429641c00.lock_count=1570
ldlm.namespaces.scratch2-OST0026-osc-ffff880429641c00.lock_count=1579
ldlm.namespaces.scratch2-OST0027-osc-ffff880429641c00.lock_count=1840
ldlm.namespaces.scratch2-OST0028-osc-ffff880429641c00.lock_count=2161
ldlm.namespaces.scratch2-OST0029-osc-ffff880429641c00.lock_count=1601
ldlm.namespaces.scratch2-OST002a-osc-ffff880429641c00.lock_count=1890
ldlm.namespaces.scratch2-OST002b-osc-ffff880429641c00.lock_count=1617
ldlm.namespaces.scratch2-OST002c-osc-ffff880429641c00.lock_count=1583
ldlm.namespaces.scratch2-OST002d-osc-ffff880429641c00.lock_count=1631
ldlm.namespaces.scratch2-OST002e-osc-ffff880429641c00.lock_count=1853
ldlm.namespaces.scratch2-OST002f-osc-ffff880429641c00.lock_count=1617
ldlm.namespaces.scratch2-OST0030-osc-ffff880429641c00.lock_count=1564
ldlm.namespaces.scratch2-OST0031-osc-ffff880429641c00.lock_count=1585
ldlm.namespaces.scratch2-OST0032-osc-ffff880429641c00.lock_count=1883
ldlm.namespaces.scratch2-OST0033-osc-ffff880429641c00.lock_count=1859
ldlm.namespaces.scratch2-OST0034-osc-ffff880429641c00.lock_count=1891
ldlm.namespaces.scratch2-OST0035-osc-ffff880429641c00.lock_count=1635
ldlm.namespaces.scratch2-OST0036-osc-ffff880429641c00.lock_count=1611
ldlm.namespaces.scratch2-OST0037-osc-ffff880429641c00.lock_count=1599
ldlm.namespaces.scratch2-OST0038-osc-ffff880429641c00.lock_count=1692
ldlm.namespaces.scratch2-OST0039-osc-ffff880429641c00.lock_count=1613
ldlm.namespaces.scratch2-OST003a-osc-ffff880429641c00.lock_count=1587
ldlm.namespaces.scratch2-OST003b-osc-ffff880429641c00.lock_count=1598
ldlm.namespaces.scratch2-OST003c-osc-ffff880429641c00.lock_count=1570
ldlm.namespaces.scratch2-OST003d-osc-ffff880429641c00.lock_count=1818
ldlm.namespaces.scratch2-OST003e-osc-ffff880429641c00.lock_count=1635
ldlm.namespaces.scratch2-OST003f-osc-ffff880429641c00.lock_count=1867
ldlm.namespaces.scratch2-OST0040-osc-ffff880429641c00.lock_count=1606
ldlm.namespaces.scratch2-OST0041-osc-ffff880429641c00.lock_count=1607
ldlm.namespaces.scratch2-OST0042-osc-ffff880429641c00.lock_count=1655
ldlm.namespaces.scratch2-OST0043-osc-ffff880429641c00.lock_count=1633
ldlm.namespaces.scratch2-OST0044-osc-ffff880429641c00.lock_count=1603
ldlm.namespaces.scratch2-OST0045-osc-ffff880429641c00.lock_count=1613
ldlm.namespaces.scratch2-OST0046-osc-ffff880429641c00.lock_count=1626
ldlm.namespaces.scratch2-OST0047-osc-ffff880429641c00.lock_count=1587
ldlm.namespaces.scratch2-OST0048-osc-ffff880429641c00.lock_count=1881
ldlm.namespaces.scratch2-OST0049-osc-ffff880429641c00.lock_count=1600
ldlm.namespaces.scratch2-OST004a-osc-ffff880429641c00.lock_count=1829
ldlm.namespaces.scratch2-OST004b-osc-ffff880429641c00.lock_count=1612
ldlm.namespaces.scratch2-OST004c-osc-ffff880429641c00.lock_count=1617
ldlm.namespaces.scratch2-OST004d-osc-ffff880429641c00.lock_count=1567
ldlm.namespaces.scratch2-OST004e-osc-ffff880429641c00.lock_count=1602
ldlm.namespaces.scratch2-OST004f-osc-ffff880429641c00.lock_count=1631
ldlm.namespaces.scratch2-OST0050-osc-ffff880429641c00.lock_count=1640
ldlm.namespaces.scratch2-OST0051-osc-ffff880429641c00.lock_count=1635
ldlm.namespaces.scratch2-OST0052-osc-ffff880429641c00.lock_count=1598
ldlm.namespaces.scratch2-OST0053-osc-ffff880429641c00.lock_count=1639
ldlm.namespaces.scratch2-OST0054-osc-ffff880429641c00.lock_count=1672
ldlm.namespaces.scratch2-OST0055-osc-ffff880429641c00.lock_count=1886
ldlm.namespaces.scratch2-OST0056-osc-ffff880429641c00.lock_count=1580
ldlm.namespaces.scratch2-OST0057-osc-ffff880429641c00.lock_count=1821
ldlm.namespaces.scratch2-OST0058-osc-ffff880429641c00.lock_count=1631
ldlm.namespaces.scratch2-OST0059-osc-ffff880429641c00.lock_count=1617
ldlm.namespaces.scratch2-OST005a-osc-ffff880429641c00.lock_count=1600
ldlm.namespaces.scratch2-OST005b-osc-ffff880429641c00.lock_count=1604
ldlm.namespaces.scratch2-OST005c-osc-ffff880429641c00.lock_count=1589
ldlm.namespaces.scratch2-OST005d-osc-ffff880429641c00.lock_count=1634
ldlm.namespaces.scratch2-OST005e-osc-ffff880429641c00.lock_count=1865
ldlm.namespaces.scratch2-OST005f-osc-ffff880429641c00.lock_count=1667
ldlm.namespaces.scratch2-OST0060-osc-ffff880429641c00.lock_count=1559
ldlm.namespaces.scratch2-OST0061-osc-ffff880429641c00.lock_count=1618
ldlm.namespaces.scratch2-OST0062-osc-ffff880429641c00.lock_count=1812
ldlm.namespaces.scratch2-OST0063-osc-ffff880429641c00.lock_count=1577
ldlm.namespaces.scratch2-OST0064-osc-ffff880429641c00.lock_count=1840
ldlm.namespaces.scratch2-OST0065-osc-ffff880429641c00.lock_count=1597
ldlm.namespaces.scratch2-OST0066-osc-ffff880429641c00.lock_count=1614
ldlm.namespaces.scratch2-OST0067-osc-ffff880429641c00.lock_count=1614
ldlm.namespaces.scratch2-OST0068-osc-ffff880429641c00.lock_count=1618
ldlm.namespaces.scratch2-OST0069-osc-ffff880429641c00.lock_count=1847
ldlm.namespaces.scratch2-OST006a-osc-ffff880429641c00.lock_count=1642
ldlm.namespaces.scratch2-OST006b-osc-ffff880429641c00.lock_count=1604
ldlm.namespaces.scratch2-OST006c-osc-ffff880429641c00.lock_count=1645
ldlm.namespaces.scratch2-OST006d-osc-ffff880429641c00.lock_count=1857
ldlm.namespaces.scratch2-OST006e-osc-ffff880429641c00.lock_count=1601
ldlm.namespaces.scratch2-OST006f-osc-ffff880429641c00.lock_count=1897
ldlm.namespaces.scratch2-OST0070-osc-ffff880429641c00.lock_count=1604
ldlm.namespaces.scratch2-OST0071-osc-ffff880429641c00.lock_count=1635
ldlm.namespaces.scratch2-OST0072-osc-ffff880429641c00.lock_count=1576
ldlm.namespaces.scratch2-OST0073-osc-ffff880429641c00.lock_count=1635
ldlm.namespaces.scratch2-OST0074-osc-ffff880429641c00.lock_count=1791
ldlm.namespaces.scratch2-OST0075-osc-ffff880429641c00.lock_count=1637
ldlm.namespaces.scratch2-OST0076-osc-ffff880429641c00.lock_count=1633
ldlm.namespaces.scratch2-OST0077-osc-ffff880429641c00.lock_count=1646
ldlm.namespaces.scratch2-OST0078-osc-ffff880429641c00.lock_count=1849
ldlm.namespaces.scratch2-OST0079-osc-ffff880429641c00.lock_count=1642
ldlm.namespaces.scratch2-OST007a-osc-ffff880429641c00.lock_count=1874
ldlm.namespaces.scratch2-OST007b-osc-ffff880429641c00.lock_count=1608
ldlm.namespaces.scratch2-OST007c-osc-ffff880429641c00.lock_count=1586
ldlm.namespaces.scratch2-OST007d-osc-ffff880429641c00.lock_count=1632
ldlm.namespaces.scratch2-OST007e-osc-ffff880429641c00.lock_count=1603
ldlm.namespaces.scratch2-OST007f-osc-ffff880429641c00.lock_count=1842
ldlm.namespaces.scratch2-OST0080-osc-ffff880429641c00.lock_count=1613
ldlm.namespaces.scratch2-OST0081-osc-ffff880429641c00.lock_count=1578
ldlm.namespaces.scratch2-OST0082-osc-ffff880429641c00.lock_count=1663
ldlm.namespaces.scratch2-OST0083-osc-ffff880429641c00.lock_count=1579
ldlm.namespaces.scratch2-OST0084-osc-ffff880429641c00.lock_count=1636
ldlm.namespaces.scratch2-OST0085-osc-ffff880429641c00.lock_count=1842
ldlm.namespaces.scratch2-OST0086-osc-ffff880429641c00.lock_count=1615
ldlm.namespaces.scratch2-OST0087-osc-ffff880429641c00.lock_count=1646
ldlm.namespaces.scratch2-OST0088-osc-ffff880429641c00.lock_count=1579
ldlm.namespaces.scratch2-OST0089-osc-ffff880429641c00.lock_count=1621
ldlm.namespaces.scratch2-OST008a-osc-ffff880429641c00.lock_count=1632
ldlm.namespaces.scratch2-OST008b-osc-ffff880429641c00.lock_count=1575
ldlm.namespaces.scratch2-OST008c-osc-ffff880429641c00.lock_count=1618
ldlm.namespaces.scratch2-OST008d-osc-ffff880429641c00.lock_count=1682
ldlm.namespaces.scratch2-OST008e-osc-ffff880429641c00.lock_count=1600
ldlm.namespaces.scratch2-OST008f-osc-ffff880429641c00.lock_count=1669
ldlm.namespaces.scratch2-OST0090-osc-ffff880429641c00.lock_count=1599
ldlm.namespaces.scratch2-OST0091-osc-ffff880429641c00.lock_count=1608
ldlm.namespaces.scratch2-OST0092-osc-ffff880429641c00.lock_count=1852
ldlm.namespaces.scratch2-OST0093-osc-ffff880429641c00.lock_count=1633
ldlm.namespaces.scratch2-OST0094-osc-ffff880429641c00.lock_count=1645
ldlm.namespaces.scratch2-OST0095-osc-ffff880429641c00.lock_count=1572
ldlm.namespaces.scratch2-OST0096-osc-ffff880429641c00.lock_count=1590
ldlm.namespaces.scratch2-OST0097-osc-ffff880429641c00.lock_count=1613
ldlm.namespaces.scratch2-OST0098-osc-ffff880429641c00.lock_count=1598
ldlm.namespaces.scratch2-OST0099-osc-ffff880429641c00.lock_count=1646
ldlm.namespaces.scratch2-OST009a-osc-ffff880429641c00.lock_count=1889
ldlm.namespaces.scratch2-OST009b-osc-ffff880429641c00.lock_count=1629
ldlm.namespaces.scratch2-OST009c-osc-ffff880429641c00.lock_count=1606
ldlm.namespaces.scratch2-OST009d-osc-ffff880429641c00.lock_count=1841
ldlm.namespaces.scratch2-OST009e-osc-ffff880429641c00.lock_count=1585
ldlm.namespaces.scratch2-OST009f-osc-ffff880429641c00.lock_count=1588
ldlm.namespaces.scratch2-OST00a0-osc-ffff880429641c00.lock_count=1591
ldlm.namespaces.scratch2-OST00a1-osc-ffff880429641c00.lock_count=1627
ldlm.namespaces.scratch2-OST00a2-osc-ffff880429641c00.lock_count=1586
ldlm.namespaces.scratch2-OST00a3-osc-ffff880429641c00.lock_count=1610
ldlm.namespaces.scratch2-OST00a4-osc-ffff880429641c00.lock_count=1632
ldlm.namespaces.scratch2-OST00a5-osc-ffff880429641c00.lock_count=1858
ldlm.namespaces.scratch2-OST00a6-osc-ffff880429641c00.lock_count=1648
ldlm.namespaces.scratch2-OST00a7-osc-ffff880429641c00.lock_count=1607
ldlm.namespaces.scratch2-OST00a8-osc-ffff880429641c00.lock_count=1890
ldlm.namespaces.scratch2-OST00a9-osc-ffff880429641c00.lock_count=1634
ldlm.namespaces.scratch2-OST00aa-osc-ffff880429641c00.lock_count=1572
ldlm.namespaces.scratch2-OST00ab-osc-ffff880429641c00.lock_count=1622
ldlm.namespaces.scratch2-OST00ac-osc-ffff880429641c00.lock_count=1572
ldlm.namespaces.scratch2-OST00ad-osc-ffff880429641c00.lock_count=1636
ldlm.namespaces.scratch2-OST00ae-osc-ffff880429641c00.lock_count=1610
ldlm.namespaces.scratch2-OST00af-osc-ffff880429641c00.lock_count=1649
ldlm.namespaces.scratch2-OST00b0-osc-ffff880429641c00.lock_count=2117
ldlm.namespaces.scratch2-OST00b1-osc-ffff880429641c00.lock_count=1595
ldlm.namespaces.scratch2-OST00b2-osc-ffff880429641c00.lock_count=1587
ldlm.namespaces.scratch2-OST00b3-osc-ffff880429641c00.lock_count=1837
ldlm.namespaces.scratch2-OST00b4-osc-ffff880429641c00.lock_count=1633
ldlm.namespaces.scratch2-OST00b5-osc-ffff880429641c00.lock_count=1631
ldlm.namespaces.scratch2-OST00b6-osc-ffff880429641c00.lock_count=1644
ldlm.namespaces.scratch2-OST00b7-osc-ffff880429641c00.lock_count=1616
ldlm.namespaces.scratch2-OST00b8-osc-ffff880429641c00.lock_count=1593
ldlm.namespaces.scratch2-OST00b9-osc-ffff880429641c00.lock_count=1617
ldlm.namespaces.scratch2-OST00ba-osc-ffff880429641c00.lock_count=1580
ldlm.namespaces.scratch2-OST00bb-osc-ffff880429641c00.lock_count=2118
ldlm.namespaces.scratch2-OST00bc-osc-ffff880429641c00.lock_count=1572
ldlm.namespaces.scratch2-OST00bd-osc-ffff880429641c00.lock_count=1615
ldlm.namespaces.scratch2-OST00be-osc-ffff880429641c00.lock_count=1639
ldlm.namespaces.scratch2-OST00bf-osc-ffff880429641c00.lock_count=1567
ldlm.namespaces.scratch2-OST00c0-osc-ffff880429641c00.lock_count=1588
ldlm.namespaces.scratch2-OST00c1-osc-ffff880429641c00.lock_count=1609
ldlm.namespaces.scratch2-OST00c2-osc-ffff880429641c00.lock_count=1600
ldlm.namespaces.scratch2-OST00c3-osc-ffff880429641c00.lock_count=1639
ldlm.namespaces.scratch2-OST00c4-osc-ffff880429641c00.lock_count=1594
ldlm.namespaces.scratch2-OST00c5-osc-ffff880429641c00.lock_count=1650
ldlm.namespaces.scratch2-OST00c6-osc-ffff880429641c00.lock_count=1845
ldlm.namespaces.scratch2-OST00c7-osc-ffff880429641c00.lock_count=1609
ldlm.namespaces.scratch2-OST00c8-osc-ffff880429641c00.lock_count=1586
ldlm.namespaces.scratch2-OST00c9-osc-ffff880429641c00.lock_count=1601
ldlm.namespaces.scratch2-OST00ca-osc-ffff880429641c00.lock_count=1576
ldlm.namespaces.scratch2-OST00cb-osc-ffff880429641c00.lock_count=1549
ldlm.namespaces.scratch2-OST00cc-osc-ffff880429641c00.lock_count=1620
ldlm.namespaces.scratch2-OST00cd-osc-ffff880429641c00.lock_count=1631
ldlm.namespaces.scratch2-OST00ce-osc-ffff880429641c00.lock_count=1860
ldlm.namespaces.scratch2-OST00cf-osc-ffff880429641c00.lock_count=1610
ldlm.namespaces.scratch2-OST00d0-osc-ffff880429641c00.lock_count=1665
ldlm.namespaces.scratch2-OST00d1-osc-ffff880429641c00.lock_count=1828
ldlm.namespaces.scratch2-OST00d2-osc-ffff880429641c00.lock_count=1641
ldlm.namespaces.scratch2-OST00d3-osc-ffff880429641c00.lock_count=1628
ldlm.namespaces.scratch2-OST00d4-osc-ffff880429641c00.lock_count=1640
ldlm.namespaces.scratch2-OST00d5-osc-ffff880429641c00.lock_count=1607
ldlm.namespaces.scratch2-OST00d6-osc-ffff880429641c00.lock_count=1593
ldlm.namespaces.scratch2-OST00d7-osc-ffff880429641c00.lock_count=1881
ldlm.namespaces.scratch2-OST00d8-osc-ffff880429641c00.lock_count=1583
ldlm.namespaces.scratch2-OST00d9-osc-ffff880429641c00.lock_count=1802
ldlm.namespaces.scratch2-OST00da-osc-ffff880429641c00.lock_count=1587
ldlm.namespaces.scratch2-OST00db-osc-ffff880429641c00.lock_count=1632

I don't have time this morning to grab the rest as I have to head into the customer site. I'll update the rest after I get there.

Comment by Oleg Drokin [ 20/Mar/13 ]

Ok, so looking at the systems myself, both scratch1 and scratch2 are remarkably similar: both MDTs have a bit over 1M locks handed out to clients, and most of the RAM is used by the buffer cache (scratch2 only has 22k or so ldiskfs inodes, which I found surprisingly low; scratch1 has 42k, which is also pretty low considering a million locks held on 800k resources).

Both systems were pretty active at the time, and from a load-average perspective scratch1 even looked a bit busier. Saved request statistics also pointed at scratch1 as having the more significant peak load, with maximum request waiting time over 6 seconds, whereas scratch2 was at 1.9 seconds. scratch2 has about 2x the inodes used on the MDS compared to scratch1.
We logged into a random client and the lock usage there was only ~400 locks on the mdc and fairly random numbers for the oscs, so it all looks pretty typical, I believe; it's just the login node that has all locks pegged to the allowed max (lru_resize is disabled on the system and max lru_size is fixed at 2400).

So I think we are going in with the dedicated test time plan, here's what I think would be useful:

  • stop all jobs AND
  • bump all users and their workloads off the login nodes (this could be easier said than done, as I imagine people might have scripts running in the background and such - I am just trying to control for this so that we don't end up in a situation where unmounting the login nodes "fixes" the problem, only to find out it was because some background script got cut off and the load eased).
  • reset mdt rpc counter (echo >/proc/fs/lustre/mdt/MDS/.../stats)
  • wait 10 seconds
  • check the rpc stats counters to make sure there really is no RPC traffic coming in (other than pings)
  • if the counter is registering more traffic, a bit of investigation is needed to see where that's coming from; it could be determined from export stats or debug logs.
  • run mdtest to see what the number is (might make sense to run this under oprofile (on mds) right away so we have baseline cpu utilization)
  • assuming it's not as good as desired, grab /proc/meminfo data on MDS
  • increase debugging: echo "+rpctrace +dlmtrace +dentry" > /proc/sys/lnet/debug
  • increase debugging buffers: echo 1024 >/proc/sys/lnet/debug_mb (this will reduce available MDS ram by about 1G)
  • start "debug daemon" to log debug data: lctl debug_daemon start /somewhere_stable/run1.bin 10240 it's important that this file is not to be stored on NFS, but some local disk storage)
  • run mdtest again (it's likely going to be slower than previous time, but that's expected).
  • stop "debug daemon": lctl debug_daemon stop (also save the file somewhere so it does not disappear, it would need later decoding with lctl df)
  • reduce logging back to previous levels: echo "-rpctrace -dlmtrace -dentry" > /proc/sys/lnet/debug
  • Now for groups of clients: 1st 1/4 of clients ; 2nd 1/4 of clients ; 3rd 1/4 of clients ; remaining clients:
  • unmount lustre on the node subset (if the unmount cannot complete because the mountpoint is busy, that's grounds for investigating what might still be running there using the filesystem; this would most likely only ever be a problem for login nodes. Do not use umount -f, just regular umount, so it actually sends a message to the server about its departure)
  • grab /proc/meminfo and /proc/slabinfo from mds
  • repeat mdtest (possibly under oprofile)
  • if mdtest performed better than before, make another mdtest run with increased logging and the debug daemon in effect as before (logging to a different file, please), then drop the debug level back to where it was before continuing with further tests
  • proceed to do the next bunch of nodes.

All mdtests ideally should be run from the same set of nodes (with known NIDs so we can clearly separate them from the logs). Also it might make sense to only run mdtest in metadata (directory create/stat/unlink) mode for extra fast turnaround, since that's enough to demonstrate the presence or absence of the problem.
The nodes could be unmounted on a more fine-grained basis if time would allow.
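
For reference, the MDS-side steps above could be strung together roughly as below. This is only a sketch: it assumes a local (non-NFS) scratch directory such as /local/debug, and the mdt stats path is left as a placeholder since the exact service name varies per system.

# Hedged sketch of the MDS-side capture around one mdtest run (run as root on the MDS).
MDT_STATS="/proc/fs/lustre/mdt/MDS/<service>/stats"   # placeholder - substitute the real stats file

echo > "$MDT_STATS"                  # reset the mdt RPC counters
sleep 10
cat "$MDT_STATS"                     # verify nothing but pings is arriving

cat /proc/meminfo > /local/debug/meminfo.before

echo "+rpctrace +dlmtrace +dentry" > /proc/sys/lnet/debug
echo 1024 > /proc/sys/lnet/debug_mb            # uses roughly 1G of MDS RAM
lctl debug_daemon start /local/debug/run1.bin 10240

# ... run mdtest from the chosen client nodes here ...

lctl debug_daemon stop
echo "-rpctrace -dlmtrace -dentry" > /proc/sys/lnet/debug
lctl df /local/debug/run1.bin /local/debug/run1.txt    # decode the binary log for later analysis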

It might be interesting to perform a similar experiment where the nodes are just powered down instead of unmounted, but that would be impossible to do in the same window. The difference being: the server does not really notice the client going away for some time after an abrupt power-off, so the resources are not actually freed, but the client cannot send anything.
Similarly, it might be interesting to just drop locks from the clients instead of unmounting. I do not think locks at the current level of usage would be too disruptive to anything, and this would confirm the theory, but it is less important than the unmount test case above.
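
A rough sketch of that lock-drop variant, using the same /proc files shown elsewhere in this ticket (the exact namespace names will differ per system); the idea is simply to compare the MDT lock_count on the server before and after clearing the client LRUs:

# On the MDS: note the current lock count for the MDT namespace.
cat /proc/fs/lustre/ldlm/namespaces/*MDT0000_UUID/lock_count

# On each client in the test group: drop cached locks by clearing every LRU.
for ns in /proc/fs/lustre/ldlm/namespaces/*; do echo clear > ${ns}/lru_size; done

# Back on the MDS: the MDT lock_count should have dropped noticeably.
cat /proc/fs/lustre/ldlm/namespaces/*MDT0000_UUID/lock_count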

After rereading some of the info I realized that just bouncing the compute clients, without restarting the login nodes, improved the performance, so it's definitely something on the compute side, not any stale state on the login nodes.

Comment by Oleg Drokin [ 20/Mar/13 ]

Also, as discussed, incorporating (sync; umount/remount) or an LRU clear (depending on which proves more effective) into the job epilogue might be a practical fast workaround, but if implemented right away, we likely won't know what it was.

Actually, as I was just thinking about all of this, I remembered an old report that was addressed long ago in 2.x, but never in the 1.8 releases.
It's about the ldlm lock hash size, which defaults to 12 bits (only 4096 buckets), so with 1M+ locks the lists are going to be quite long to traverse. We have it set at 20 bits in 2.x, and I have seen people increase it to 18 bits (256k buckets, already much better even at 1M+ locks).
This totally explains why dropping clients fixes the problem too.
This does not explain why scratch1 is not affected as much at the same lock count, though.

Short of a patch to increase the RES_HASH_BITS define on all servers, I expect dropping LRUs in the job epilogue should help somewhat.
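
For a rough sense of scale (an illustration only, using the ~800k resources / 1M+ locks figures mentioned above):

# Approximate average hash-chain length for ~800k entries at various RES_HASH_BITS values.
entries=800000
for bits in 12 18 20; do
    buckets=$((1 << bits))
    echo "RES_HASH_BITS=$bits: $buckets buckets, ~$(( (entries + buckets - 1) / buckets )) entries per bucket"
done

At 12 bits that is roughly a couple of hundred entries to walk per lookup; at 18-20 bits the chains shrink to essentially single entries.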

Comment by Dennis Nelson [ 21/Mar/13 ]

The customer performed a test this morning: stopping all compute jobs, running mdtest, clearing locks, then running mdtest again. mdtest performance increased dramatically after the locks were cleared. The customer has now set up the batch queueing system so that it releases the locks on the allocated compute nodes before and after each job. The total lock count on each filesystem now seems to be running at about 120K instead of 1.2 - 2.4M. The system has been operational for about 2 hours now, and they are happy with the mdtest performance numbers they are seeing.

I'd like to leave this ticket open for just a few more days to confirm, but I believe the problem has been resolved.

Comment by Dennis Nelson [ 21/Mar/13 ]

The application specialists at NOAA are requesting information about lock creation and the process Lustre 1.8.x uses to cancel locks. Is there any documentation, or are there whitepapers, that discuss what events cause locks to be created and under what circumstances they get deleted?

Comment by Oleg Drokin [ 21/Mar/13 ]

There's the Lustre Internals document by ORNL, produced in collaboration with Oracle, that touches on this topic as well (starting from page 24): http://users.nccs.gov/~fwang2/papers/lustre_report.pdf

Locks should not be viewed as a burden on the system; they are accumulated because they are useful to have when needed again, allowing the system to keep data cached.

The problem at hand is just that there were too many locks in the system overall for the default size of the lock hash table, which led to slowdowns in lock operations (stemming from the slowness of locating locks in huge lists by simply iterating over them).

Locks are in general never deleted unless there is some sort of memory pressure or a conflict with some other lock. However, the Lustre RPMs are built such that every client node is limited to 2400 locks per service (really 2 * number of CPUs), and once this number is reached, the least recently used locks are voluntarily released (otherwise they sit in the lock LRU list hoping to be useful further down the road, meanwhile protecting cached resources).
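
A small sketch for watching this on a client, using the same /proc files shown elsewhere in this ticket (note that what lru_size reports depends on whether lru_resize is in effect):

# Per-namespace cached lock counts on this client (the MDC plus every OSC connection).
for ns in /proc/fs/lustre/ldlm/namespaces/*; do
    echo "$(basename $ns): lock_count=$(cat ${ns}/lock_count) lru_size=$(cat ${ns}/lru_size)"
done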

Comment by Dennis Nelson [ 03/Apr/13 ]

Update: On Tuesday, NOAA upgraded to Lustre 1.8.9 on clients and servers. This is the Intel version:
cat /proc/fs/lustre/version
lustre: 1.8.9
kernel: patchless_client
build: jenkins-wc1--PRISTINE-2.6.32-279.19.1.el6.x86_64

Previously, the customer was running 1.8.6 on the servers with 1.8.8 clients. lru_resize was disabled in the previous version; we elected to leave it at the default in the upgrade.

Now, the customer's prologue and epilogue scripts no longer appear to be clearing the locks. They are above 3M right now. The prologue and epilogue scripts run the following:

cat /var/spool/torque/mom_priv/clear_lru.sh
#!/bin/bash
# Drop this client's cached LDLM locks by clearing the lock LRU in every namespace.
for i in /proc/fs/lustre/ldlm/namespaces/*; do
    echo clear > ${i}/lru_size
done

The customer is reporting seeing this error when the prologue and epilogue scripts run:
r7i2n13: /var/spool/torque/mom_priv/clear_lru.sh: line 3: echo: write error: Invalid argument
r7i2n13: /var/spool/torque/mom_priv/clear_lru.sh: line 3: echo: write error: Invalid argument
r7i2n13: /var/spool/torque/mom_priv/clear_lru.sh: line 3: echo: write error: Invalid argument

I tried it on a login node; the first time I ran it I got the error once, but when I ran it again there was no error.

[root@lfs-mds-1-1 ~]# date;cat /proc/fs/lustre/ldlm/namespaces/mds-scratch1-MDT0000_UUID/lock_count
Wed Apr 3 15:23:13 UTC 2013
1525269

[root@lfs-mds-2-1 ~]# date;cat /proc/fs/lustre/ldlm/namespaces/mds-scratch2-MDT0000_UUID/lock_count
Wed Apr 3 15:23:22 UTC 2013
3383870

[root@fe1 namespaces]# date;for i in `find /proc/fs/lustre/ldlm/namespaces/ -name lru_size -print`;do echo clear > $i;done
Wed Apr 3 15:22:33 UTC 2013
[root@fe1 namespaces]# ssh fe2 'date;for i in `find /proc/fs/lustre/ldlm/namespaces/ -name lru_size -print`;do echo clear > $i;done'
Wed Apr 3 15:23:35 UTC 2013
[root@fe1 namespaces]# ssh fe3 'date;for i in `find /proc/fs/lustre/ldlm/namespaces/ -name lru_size -print`;do echo clear > $i;done'
Wed Apr 3 15:23:53 UTC 2013
[root@fe1 namespaces]# ssh fe4 'date;for i in `find /proc/fs/lustre/ldlm/namespaces/ -name lru_size -print`;do echo clear > $i;done'
Wed Apr 3 15:24:20 UTC 2013
bash: line 0: echo: write error: Invalid argument
bash: line 0: echo: write error: Invalid argument
[root@fe1 namespaces]# ssh fe4 'date;for i in `find /proc/fs/lustre/ldlm/namespaces/ -name lru_size -print`;do echo clear > $i;done'
Wed Apr 3 15:24:35 UTC 2013
bash: line 0: echo: write error: Invalid argument
bash: line 0: echo: write error: Invalid argument
bash: line 0: echo: write error: Invalid argument
[root@fe1 namespaces]# ssh fe4 'date;for i in `find /proc/fs/lustre/ldlm/namespaces/ -name lru_size -print`;do echo clear > $i;done'
Wed Apr 3 15:24:41 UTC 2013
bash: line 0: echo: write error: Invalid argument
bash: line 0: echo: write error: Invalid argument
bash: line 0: echo: write error: Invalid argument
bash: line 0: echo: write error: Invalid argument
[root@fe1 namespaces]# ssh fe5 'date;for i in `find /proc/fs/lustre/ldlm/namespaces/ -name lru_size -print`;do echo clear > $i;done'
Wed Apr 3 15:24:56 UTC 2013
[root@fe1 namespaces]# ssh fe6 'date;for i in `find /proc/fs/lustre/ldlm/namespaces/ -name lru_size -print`;do echo clear > $i;done'
Wed Apr 3 15:25:22 UTC 2013
bash: line 0: echo: write error: Invalid argument
bash: line 0: echo: write error: Invalid argument
[root@fe1 namespaces]# ssh fe7 'date;for i in `find /proc/fs/lustre/ldlm/namespaces/ -name lru_size -print`;do echo clear > $i;done'
Wed Apr 3 15:26:00 UTC 2013
bash: line 0: echo: write error: Invalid argument
bash: line 0: echo: write error: Invalid argument
[root@fe1 namespaces]# ssh fe8 'date;for i in `find /proc/fs/lustre/ldlm/namespaces/ -name lru_size -print`;do echo clear > $i;done'
Wed Apr 3 15:26:26 UTC 2013
[root@fe1 namespaces]#

Any suggestions on what causes the write errors? Also, in the previous version, the lru_size parameter seemed to be limited to 2400; now it is growing much higher. With lru_resize enabled in the build, is there any way to implement the same 2400 limit without installing a patched version of Lustre?

Comment by Dennis Nelson [ 03/Apr/13 ]

It appears that the locks are being released; the error just means that some were not. We have set the lru_size parameter back to 2400 on all of the login nodes and will be working to set it on all nodes. I think that should resolve the issue.
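
For reference, a sketch of applying that cap from userspace; this assumes the usual 1.8 lru_size semantics, where writing a non-zero value pins the LRU at that size (disabling dynamic resizing for that namespace) and writing 0 turns lru_resize back on:

# Pin every namespace's lock LRU on this client to a fixed size of 2400.
for ns in /proc/fs/lustre/ldlm/namespaces/*; do
    echo 2400 > ${ns}/lru_size
done
# Writing 0 instead would re-enable dynamic (lru_resize) behaviour.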

Comment by Oleg Drokin [ 03/Apr/13 ]

I think the best all-around solution would be for you to increase the RES_HASH_BITS value on your servers and deploy it that way.
By limiting the number of locks in the LRU on your login nodes, you are missing out on the benefits those extra locks bring you there.
It's ok to drop the locks on the compute nodes between jobs, because different jobs are unlikely to access the same set of files anyway, and the extra benefit is that you clear the cache proactively too, reducing the strain on the VM subsystem.
The write errors to the lru_size proc file can be ignored as long as you don't get any scary messages in dmesg (if you do, please post them here).
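
If the transient EINVAL really is harmless, the epilogue script could be made more tolerant of it; a purely illustrative sketch that retries once and only then warns:

#!/bin/bash
# Clear lock LRUs, tolerating an occasional transient write error on a namespace.
for ns in /proc/fs/lustre/ldlm/namespaces/*; do
    echo clear > ${ns}/lru_size 2>/dev/null \
        || echo clear > ${ns}/lru_size 2>/dev/null \
        || echo "warning: could not clear ${ns}/lru_size" >&2
done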

Comment by Kit Westneat (Inactive) [ 18/Apr/13 ]

Here's the RES_HASH_BITS patch we are going to use. Does this look correct? Thanks.

Comment by Oleg Drokin [ 18/Apr/13 ]

Yes, this looks good.
Note you really only need this on the servers; on the clients it's unlikely there will ever be as many locks in a single namespace.
The only penalty is somewhat higher memory usage per namespace.

Comment by Kit Westneat (Inactive) [ 24/Apr/13 ]

I pushed it to gerrit in case it would be useful to land:
http://review.whamcloud.com/6148

Comment by Kit Westneat (Inactive) [ 25/Jul/13 ]

Just FYI, on the client, applying this patch increased Lustre memory usage by 7GB.

1.8.9 without patch:
--------------------------------------------------------------------------
[root@r1i3n15 ~]# free
             total       used       free     shared    buffers     cached
Mem:      24631956    1145668   23486288          0          0     229708
-/+ buffers/cache:      915960   23715996
Swap:            0          0          0

1.8.9 with patch:
--------------------------------------------------------------------------
[root@r1i3n14 ~]# free
             total       used       free     shared    buffers     cached
Mem:      24631956    8276320   16355636          0          0      58372
-/+ buffers/cache:     8217948   16414008
Swap:            0          0          0

Comment by Cliff White (Inactive) [ 09/May/14 ]

Patch was abandoned, closing this issue.
