Advice on structuring large numbers of files

From: Brewer, Edward (BREWERE@OD.NIH.GOV)
Date: Thu Aug 22 2002 - 17:03:48 EDT


Admins and Gurus:

        The question was posed as to the best way to handle around
300,000-1,000,000 files in an AdvFS filesystem. Currently we have around
300,000 files in one directory. We expect growth to be steady, and in the
future we may push 1,000,000. Our current performance seems adequate; I
have collected data on the disks for that domain and by all indications it
seems fine (I can provide numbers or graphs if needed). The directory is
rarely touched by users directly; instead, an Oracle BFILE locator points
to the files in this directory when users need the information. The files
are not very large, around 10-250 MB in size, and are accessed
infrequently. Performing an ls or directory scan is a bit slow, but this
is not usually done by a user. I was thinking that if we created 20
directories, each further divided into 12 subdirectories (20 years by 12
months), this would give enough depth to allow for efficient growth.
20 x 12 = 240 directories, so with an even distribution that works out to
about 1,250 files per directory today (300,000 / 240), rising toward 4,200
if we ever reach 1,000,000. Any future growth would mean 12 more
directories a year. Any suggestions or experience would be greatly
appreciated.
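        For illustration, here is a minimal Python sketch of the
year/month bucketing idea I have in mind. It assumes each file's
modification time reflects the year and month it belongs to, and the
helper names (archive_path, file_into_bucket) are just placeholders,
not anything we have in place today:

    import os
    import shutil
    from datetime import datetime

    def archive_path(base, when):
        # Bucket directory of the form <base>/YYYY/MM.
        return os.path.join(base, f"{when.year:04d}", f"{when.month:02d}")

    def file_into_bucket(src, base):
        # Move src into its year/month bucket under base, creating
        # the bucket directory on first use.
        when = datetime.fromtimestamp(os.path.getmtime(src))
        dest_dir = archive_path(base, when)
        os.makedirs(dest_dir, exist_ok=True)
        dest = os.path.join(dest_dir, os.path.basename(src))
        shutil.move(src, dest)
        return dest

One thing to keep in mind if we go this route: the Oracle BFILE locators
would have to be updated to point at the new paths as files move.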

Lee Brewer


