  Re: Trouble with large directory in Linux  
From: Kevin Wampler
Date: 28 Feb 2012 15:01:12
Message: <4f4d3288$1@news.povray.org>
Alas (but thankfully) I just managed to delete all the files by 
remembering the existence of, and SSHing into, a machine with 64GB of 
RAM before I read your reply.  I'll still try to respond to your 
questions for curiosity's sake.

There was, however, some odd behavior that I'm curious about, even 
though I've successfully deleted the files now.  On the 64GB RAM 
machine I was able to run 'find . -name "*.txt" | wc', which listed 
around 350K files.  After running 'find . -name "*.txt" -delete' 
successfully, another run of 'find . -name "*.txt" | wc' gave 150K 
files (all *.txt files too!).  Repeating the process, the number of 
remaining files fluctuated up and down a number of times before 
everything finally seemed to be removed.  This strikes me as rather 
strange behavior.
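
In case it helps anyone who hits the same fluctuation: a loop that 
re-runs the delete until a fresh scan comes up empty would have 
automated the repetition. A minimal sketch, assuming GNU find and run 
from inside the problem directory:

  $ while [ "$(find . -name '*.txt' | wc -l)" -ne 0 ]; do
      find . -name '*.txt' -delete    # streams entries, no sorting
    done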

>
> What is the output of : ls -ld <dir>
> (notice the d), in particular the size attribute
> many many many files in a directory would lead to huge size.
> Check also the rights on directory.

I obviously can't run this now, but if I remember the output of 
previous df commands correctly, the directory was somewhere in the 
ballpark of 10GB.  Due to the odd behavior mentioned above I'm not 
sure how many files ended up being in the directory, except that it 
was probably between 350K and 1000K.
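
One thing worth knowing here (and, I suspect, the reason for the 
"huge size" remark above): the size that 'ls -ld' reports is the 
directory's own entry table, not its contents, and on ext3/4 that 
table grows with the file count and does not shrink when files are 
deleted. With 'bigdir' as a stand-in name:

  $ ls -ld bigdir    # size field = the directory's own entry table
  $ du -sh bigdir    # total size of the files inside, by contrast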


> Also possible: drive is full, check the output of "df <dir>"
> (rules of thumb: 5% of partition is reserved to root, unless tuned
> otherwise explicitly with ext2/3/4. )

This is one thing I did try; there was plenty of free space (about 
50% capacity).
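
A related check I'd add for next time: with hundreds of thousands of 
small files it's possible to run out of inodes while block usage 
still looks healthy. Both are one df away (path hypothetical):

  $ df -h /path/to/dir    # block usage (the "about 50%" figure)
  $ df -i /path/to/dir    # inode usage, which can hit 100% first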

>
> I wonder about :
>   $ find <dir> -type f -exec rm {} \;
>   $ rm -rf <dir>
> (notice the f)

On the 8GB RAM machine I tried a variant of the above find command 
(using the -delete option), which ate through all my memory like 
everything else.  I didn't try rm -rf, though, since I figured it 
wouldn't run any faster than rm -r... perhaps I should have.
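
For what it's worth, the usual memory hogs are commands that hold the 
entire listing at once (a shell glob like 'rm *.txt', or a sorting 
'ls'); the variants below all stream and should stay at roughly 
constant memory, though the fact that -delete still blew up for me 
suggests something lower down (perhaps the NFS layer) was buffering. 
A sketch, with <dir> standing in for the real path:

  $ find <dir> -name '*.txt' -exec rm {} \;          # one rm per file: slow, tiny memory
  $ find <dir> -name '*.txt' -print0 | xargs -0 rm   # batched: much faster, still streaming
  $ rm -rf <dir>    # also removes the bloated directory entry itself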


> How is the network load when you tried to access that directory ?

Unfortunately this is one I can't answer.

Thanks for your help!  I will remember everything you suggested in case 
something like this comes up again.

