What is the fastest way to calculate disk usage for each client?

I hope this is easy.

I am running a Rails web application that hosts about 100 school sites. A single application serves all the sites, and I have a management interface where we can add and remove schools, etc.

I want to add a stat to this interface: the shared disk space used by each school. Each school's files are stored in a separate directory tree, so they are easy to identify. The only catch is that it needs to be fast, so the question is: what is the fastest way to get this number? If it could be computed on the fly with a Ruby call, that would be great, but I'm open to anything that works. Ideally I'd like to avoid caching and background generation (at least at the Rails level). :)

+1
3 answers

If you want to go pure Ruby, you can try this code. Although, if you are looking for speed, I'm sure `du` will be faster.

require 'find'

# Walk the tree and sum the sizes of all regular files (pure Ruby).
def dir_size(dir_path)
  size = 0
  Find.find(dir_path) do |f|
    size += File.size(f) if File.file?(f)
  end
  size
end

dir_size('/tmp/')
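A more compact pure-Ruby variant is a glob plus a sum. This is a sketch, not from the original answer; the `File::FNM_DOTMATCH` flag is added so dotfiles are counted too, and `dir_size_glob` is a hypothetical name:

```ruby
# Sum the sizes of all regular files under a directory, dotfiles included.
def dir_size_glob(dir_path)
  Dir.glob(File.join(dir_path, '**', '*'), File::FNM_DOTMATCH)
     .select { |f| File.file?(f) }
     .sum { |f| File.size(f) }
end
```

Note that globbing builds the whole file list in memory first, so `Find.find` may behave better on very large trees.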
+3
`du -sk "/your/path/here"`.split("\t").first.to_i # kilobytes (du -s alone reports blocks, not bytes)
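Wrapped in a small helper that escapes the path (so a school name with spaces or quotes can't break the shell command), the `du` approach could look like this. A sketch only; `dir_size_kb` and the nil-on-failure behavior are assumptions, not from the original answer:

```ruby
require 'shellwords'

# Disk usage in kilobytes via `du -sk` (the -k unit works on both GNU and BSD du).
# Returns nil if du fails, e.g. for a missing directory.
def dir_size_kb(dir_path)
  out = `du -sk #{Shellwords.escape(dir_path)} 2>/dev/null`
  $?.success? ? out.split("\t").first.to_i : nil
end
```

Shelling out per request is fine at this scale; with ~100 schools, even a page that shows all of them at once would only run `du` a hundred times.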
+4

Have you tried just running `du` on each directory on demand? On my older box, `du` on a 15 MB directory takes ~4 ms, and on a 250 MB one ~50 ms. Both seem reasonable for this task. How big are the directories? Before you try to really optimize this, make sure it is actually worth your time. Premature optimization and all that.

You could also track usage at upload time, when the user hands you the file. That way you only need to apply the delta as files are added or deleted, and never scan the directory at all.
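The delta-tracking idea could be sketched like this in plain Ruby. The `UsageTracker` class and its method names are assumptions for illustration, not from the original answer:

```ruby
# Keeps a running per-school byte count, updated as files come and go,
# so no directory walk is needed at read time.
class UsageTracker
  def initialize
    @usage = Hash.new(0) # school_id => total bytes
  end

  # Call from the upload handler after a file is stored.
  def file_added(school_id, bytes)
    @usage[school_id] += bytes
  end

  # Call from the delete handler, passing the size of the removed file.
  def file_removed(school_id, bytes)
    @usage[school_id] -= bytes
  end

  def usage_for(school_id)
    @usage[school_id]
  end
end
```

In a real Rails app this counter would live in a database column, updated in the same transaction as the file record, rather than in process memory.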

0