Recently, I was
wget-ting a large amount of hurricane data from multiple sources, to the tune of several hundred gigs. These were
gzip-compressed files, so my Perl script had to download, unzip, and then run some cleaning and integration code on all the data, while keeping track of the source locations and any errors along the way. Since the uncompressed data volume was ~5 TB, I had these things running in parallel... for the most part. ;)
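For the curious, the core of that loop looked roughly like the shell sketch below. This is not the actual Perl script, just a rough shell equivalent: urls.txt, clean_data.pl, and errors.log are made-up names, and it naively launches every job at once instead of throttling the parallelism.

  while read -r url; do
    (
      f=$(basename "$url")                   # local file name, e.g. something.csv.gz
      wget -q "$url" &&
      gunzip -f "$f" &&
      ./clean_data.pl "${f%.gz}" "$url" ||   # clean/integrate, carrying the source URL along
      echo "FAILED: $url" >> errors.log      # any failure gets logged with its source
    ) &                                      # one background job per source
  done < urls.txt
  wait                                       # block until every job has finished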
Anyway, the simplest way to get "live" updates on your files by tracking their sizes is this:
watch -n 1 --differences du -h
or watch -n 1 --differences ls -lh ~/downloads
This basically gives you highlighted updates on the file sizes every second. Note how the watch command keeps du "alive".
Undoubtedly the best part about any *nix system is the terminal!