Re: [SLUG] 100k files in one directory

From: Bill (selinux@home.com)
Date: Tue Dec 11 2001 - 22:20:38 EST


On Tuesday 11 December 2001 20:47, you wrote:
> you might not have to recompile.. I'm not sure.. but I think you
> can do something like echo 100000 > /proc/sys/fs/file-max
>
> maybe.. =)
>
> On 11 Dec 2001, Logan wrote:
> > Various folks in my office dumped just shy of 100K files to a
> > linux file server I made for backups. But they dumped them all
> > into one directory.

Clipped

> > But chiding from the Windoze users
> > stirs me to try to let them leave their crap all in one big heap.
> >
> > Logan

Windows users? Good gravy ... they're worse than Jehovah's Witnesses!

:-)

Why not write a script to break the files up into individual
directories (sorted however you like), do your backup magic, and then
dump all the crud back into the original directory? They'll never
know what hit 'em :-)
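
Something like this might do it (an untested sketch; /srv/backup/dump
is a made-up path, and it assumes the filenames are tame -- no
newlines in them):

#!/bin/sh
# Split the flat heap into per-letter buckets so no single
# directory holds all 100k entries at once.
SRC=/srv/backup/dump        # hypothetical path -- point it at the real heap

cd "$SRC" || exit 1

for f in *; do
    [ -f "$f" ] || continue            # plain files only
    bucket=$(printf '%.1s' "$f")       # first character of the name
    mkdir -p ".buckets/$bucket"
    mv -- "$f" ".buckets/$bucket/"
done

# ... do the backup magic against .buckets/ here ...

# Flatten everything back into the original heap afterwards.
find .buckets -type f -exec mv -- {} . \;
rmdir .buckets/*/ .buckets 2>/dev/null

Moving the files back one at a time through find is slow, but it
sidesteps the arg-list limit that a plain mv .buckets/*/* would run
into with that many files.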

In fact, you should accidentally mirror the munged directory to their
hard drives, too. Maybe dir can handle more files than ls? :-)

What would life be without an occasional accident?

Bill

-- 
         total       used       free     shared    buffers     cached
Mem:   1545352    1500812      44540          0     150428    1114132
Swap:   401584       2372     399212
Total: 1946936    1503184     443752
Linux a.genesis.com 2.4.14 #3 Fri Nov 9 23:14:31 EST 2001 K7 750MHz
 10:09pm  up 2 days, 23:23,  4 users,  load average: 0.69, 0.69, 0.67
