Calculating directory size faster

Hi everyone,

I've been working on this for a while but can't get it to run any faster. I'm calculating directory sizes on multiple servers using du, but it scans all the subdirectories and that takes forever. Does anyone have an idea to speed it up, or maybe some sort of precalculation scheme?

thank you
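If the directory sizes don't need to be up-to-the-second, one cheap "precalculation" trick is to let cron run the expensive du scan in the background and cache the result; reading it back later is instant. A minimal sketch (the paths and cache file are examples, adjust for your setup):

```shell
#!/bin/sh
# Example paths -- change these for your environment.
TARGET=/var/www
CACHE=/var/tmp/dusize.cache

# Put this line in cron (e.g. hourly) so the slow scan happens
# in the background instead of while you wait:
du -sk "$TARGET" > "$CACHE"

# Whenever you need the size, just read the cached line:
cat "$CACHE"
```

The cached value can be stale by up to one cron interval, so this only fits cases where an approximate size is good enough.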
 
Well, you can run du in parallel on each server via ssh.

I wrote this script for it once.

RUN:
Code:
[lbl@atom0 ~]$ ./xserv-admin cmd "du -ch /tmp | tail -n 1" 2> /dev/null 
Server Admin tool
==> Running "du -ch /tmp | tail -n 1" on atom0 <==
==> Running "du -ch /tmp | tail -n 1" on atom1 <==
==> Running "du -ch /tmp | tail -n 1" on sip <==
sip:  48K       total
atom1: 658K     total
atom0:  60K     total
All commands are done.
[lbl@atom0 ~]$

Script:
Code:
#!/usr/local/bin/bash

echo "Server Admin tool"

#Systems in the tool
systems="atom0 atom1 sip"

sshcommand()
{
# Run the command on every system in the background, prefixing
# each output line with the hostname. "ssh -n" stops ssh from
# swallowing stdin when it runs in the background.
for i in $systems
        do
                echo "==> Running \"$1\" on $i <=="
                ssh -n "$i" "$1" | sed "s/^/$i: /" &
        done
}

case "$1" in
        'loggedin')
                sshcommand "w"
                wait ; echo "All commands are done."
        ;;
        'cmd')
                sshcommand "$2"
                wait ; echo "All commands are done."
        ;;
        *)
                echo "Usage: $0 loggedin"
                echo "Usage: $0 cmd <command to run on remote systems>"
        ;;
esac

A nice tool for checking stuff out if you have a shitload of servers.

This won't help du's speed though ... hehe
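That said, if the target directory has several large subdirectories, you can sometimes speed up a single server's scan by running du on the subdirectories in parallel with xargs -P. A rough sketch, assuming /tmp as the target and 4 parallel jobs; on a single spinning disk the extra seeking may actually make it slower, so it mostly pays off on RAID or SSD:

```shell
# Scan each top-level subdirectory of /tmp in parallel (4 jobs at a time).
# -print0 / -0 keeps directory names with spaces intact.
find /tmp -mindepth 1 -maxdepth 1 -type d -print0 \
    | xargs -0 -P 4 -n 1 du -sk
```

You still pay the full I/O cost once, but the walks overlap instead of running back to back.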

/lbl
 