Re: [SLUG] 100k files in one directory

From: Ed Centanni (ecentan1@tampabay.rr.com)
Date: Thu Dec 13 2001 - 20:05:31 EST


This is a user education issue, not a technical one.

  I suggest you move all the files to a non-shared directory, then create
a separate directory for each user in the shared area. When the users
ask where their files went, tell them that if they'll provide a list of
the files they want (in a text file so you can automate the transfer),
you'll place them IN THEIR ASSIGNED directory. Tell them that in the
future any files found OUTSIDE of their directory will be erased. If
they need to share files, make a special shared subdirectory where
they can place a COPY of their files, but inform them all that this area
is a temporary share area and that its contents will be erased on a
regular, published, and announced schedule, and then DO IT!
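A minimal shell sketch of that scheme, for the automation-minded. All the
paths and user names below are made-up examples, not anything from the
original setup:

```shell
#!/bin/sh
# Sweep every file out of the abused shared area into a non-shared
# staging area. find handles ~100k entries where "mv $share/*" would
# blow past the shell's argument-list limit.
sweep_share() {
    share="$1"; staging="$2"
    mkdir -p "$staging"
    find "$share" -maxdepth 1 -type f -exec mv {} "$staging"/ \;
}

# Move only the files a user listed (one filename per line in a plain
# text file) into their assigned directory.
restore_from_list() {
    staging="$1"; userdir="$2"; list="$3"
    mkdir -p "$userdir"
    while IFS= read -r f; do
        [ -f "$staging/$f" ] && mv "$staging/$f" "$userdir"/
    done < "$list"
}
```

The scheduled purge of the temporary share area could then be a single
cron entry, e.g. wiping files older than a week every Sunday at 3 a.m.
(again, a hypothetical path and schedule):

    0 3 * * 0  find /srv/share/temp -type f -mtime +7 -delete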

Explain to the users the problem that led to these measures and
positively ask for their cooperation, emphasizing that this will make
life easier for all.

If management will not support you in this, then look for another job,
because these undisciplined louts will make your life hell.

Ed.

Logan wrote:

> Various folks in my office dumped just shy of 100K files to a linux
> file server I made for backups. But they dumped them all into one
> directory. There are so many files there that ls times out.
> I think the cure would be to recompile a kernel, editing
> /usr/src/linux/include/linux/fs.h to change the NR_FILE variable from
> 8192 to 8192*10 or even 8192*100. I think this would allow things like
> ls and rm to work. Or am I barking up the wrong tree?
> Has anyone else had any experience like this? I know I can also cure
> it by making more subdirectories and forcing them to put their files in
> the subdirectories. But chiding from the Windoze users stirs me to try
> to let them leave their crap all in one big heap.
>
> Logan
>



This archive was generated by hypermail 2.1.3 : Fri Aug 01 2014 - 20:13:47 EDT