Trouble with large directory in Linux
From: Kevin Wampler
Subject: Trouble with large directory in Linux
Date: 28 Feb 2012 13:49:22
Message: <4f4d21b2@news.povray.org>
I have a directory which (I think) has many, many small files in it.
I'd like to remove these files; however, every attempt I've made fails.
For example:

cd <dir>; rm *
rm -r <dir>
ls <dir>
find <dir>

All quickly consume my 8GB of memory and grind to an apparent halt.  I 
haven't been able to even determine the name of a single file within 
that directory, let alone delete anything.  Does anyone have any ideas?


Notes:

1) I'm assuming the problem is a large number of files, but as I can't 
get any info about the contents of the directory I don't know for sure 
if this is the case or it's some other problem.

2) This is on NFS, in case that matters.

3) I don't have root privileges.



From: Le Forgeron
Subject: Re: Trouble with large directory in Linux
Date: 28 Feb 2012 14:38:27
Message: <4f4d2d33$1@news.povray.org>
On 28/02/2012 19:49, Kevin Wampler wrote:
> I have a directory which (I think) has many, many small files in it.
> I'd like to remove these files; however, every attempt I've made fails.
> For example:
> 
> cd <dir>; rm *
> rm -r <dir>
> ls <dir>
> find <dir>
> 
> All quickly consume my 8GB of memory and grind to an apparent halt.  I
> haven't been able to even determine the name of a single file within
> that directory, let alone delete anything.  Does anyone have any ideas?
> 
> 

Don't let the shell expand the *.
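
For example, a sketch that streams names to rm instead of globbing
(assumes filenames without embedded newlines):

 $ find <dir> -type f | while read -r f; do rm -- "$f"; done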



> Notes:
> 
> 1) I'm assuming the problem is a large number of files, but as I can't
> get any info about the contents of the directory I don't know for sure
> if this is the case or it's some other problem.

What is the output of: ls -ld <dir> (notice the d)? Look in particular
at the size attribute; a directory that holds many, many files grows to
a huge size itself. Check also the permissions on the directory.
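
Something like this (made-up output) would be the telltale sign; a
directory file that is itself hundreds of MB holds an enormous number
of entries:

 $ ls -ld <dir>
 drwxr-xr-x 2 user group 268435456 Feb 28 10:12 <dir>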

Also possible: the drive is full; check the output of "df <dir>".
(Rule of thumb: 5% of the partition is reserved for root on ext2/3/4,
unless explicitly tuned otherwise.)
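
Made-up output again; for a non-root user the disk is effectively full
once Avail hits 0, because of those reserved blocks:

 $ df -h <dir>
 Filesystem      Size  Used Avail Use% Mounted on
 server:/export  100G   95G     0 100% /mnt/<dir>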
> 
> 2) This is on NFS, in case that matters.

Yes, it means we cannot use the lovely unlink.

> 
> 3) I don't have root privileges.

Of course not; root privileges do not carry over NFS anyway (unless the
serving system is reconfigured, and you do not want that).

I wonder about:
 $ find <dir> -type f -exec rm {} \;
 $ rm -rf <dir>
(notice the f)
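
Or, with GNU find, variants that avoid starting one rm process per file
(-print0 and -delete are GNU extensions; -print0 pairs with xargs -0 to
handle any filename safely):

 $ find <dir> -type f -print0 | xargs -0 rm -f
 $ find <dir> -type f -delete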

How was the network load when you tried to access that directory?



From: Warp
Subject: Re: Trouble with large directory in Linux
Date: 28 Feb 2012 14:47:33
Message: <4f4d2f55@news.povray.org>
Kevin Wampler <wam### [at] uwashingtonedu> wrote:
> I have a directory which (I think) has many, many small files in it.
> I'd like to remove these files; however, every attempt I've made fails.
> For example:

> cd <dir>; rm *
> rm -r <dir>
> ls <dir>
> find <dir>

  A webpage suggests this:

find <dir> -exec rm {} \;

  (I think that the idea is that this executes the 'rm' command individually
on each file as it traverses the directory, rather than first gathering all
the file names into RAM and then trying to do something to them.)
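
  A '+' terminator instead of '\;' batches many file names into each rm
invocation (POSIX find supports this), which ought to be faster:

find <dir> -exec rm {} +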

-- 
                                                          - Warp



From: Kevin Wampler
Subject: Re: Trouble with large directory in Linux
Date: 28 Feb 2012 15:01:12
Message: <4f4d3288$1@news.povray.org>
Alas (but thankfully) I managed to delete all the files before I read
your reply, by remembering the existence of a machine with 64GB of RAM
and SSHing into it.  I'll still try to answer your questions for
curiosity's sake.

There was, however, some odd behavior that I'm curious about, even if
I've successfully deleted the files now.  On the 64GB RAM machine I was
able to run 'find . -name "*.txt" | wc', which listed around 350K files.
After running 'find . -name "*.txt" -delete' successfully, another run
of 'find . -name "*.txt" | wc' gave 150K files (all *.txt files too!).
Repeating the process, the number of remaining files fluctuated up and
down a number of times before everything finally seemed to be removed.
This seems like rather strange behavior to me.

>
> What is the output of: ls -ld <dir> (notice the d)? Look in particular
> at the size attribute; a directory that holds many, many files grows to
> a huge size itself. Check also the permissions on the directory.

I obviously can't run this now, but from what I remember of the output
of previous df commands I think the directory was somewhere in the
ballpark of 10GB.  Due to the odd behavior mentioned above I'm not sure
how many files ended up being in the directory, except that it was
probably between 350K and 1000K.


> Also possible: the drive is full; check the output of "df <dir>".
> (Rule of thumb: 5% of the partition is reserved for root on ext2/3/4,
> unless explicitly tuned otherwise.)

This is one thing I did try; there was plenty of free space (about 50%
capacity).

>
> I wonder about:
>   $ find <dir> -type f -exec rm {} \;
>   $ rm -rf <dir>
> (notice the f)

On the 8GB RAM machine I tried a variant of the above find command
(using the -delete option), which ate through all my memory like
everything else.  I didn't try rm -rf though, since I figured it
wouldn't run any faster than rm -r... perhaps I should have.


> How was the network load when you tried to access that directory?

Unfortunately this is one I can't answer.

Thanks for your help!  I will remember everything you suggested in case 
something like this comes up again.



From: Jim Henderson
Subject: Re: Trouble with large directory in Linux
Date: 28 Feb 2012 15:01:25
Message: <4f4d3295$1@news.povray.org>
It's probably shell expansion of the wildcard, as others have indicated.

Break it down into smaller deletions.  For example:

rm a*
rm b*
rm c*

etc.
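
A bash sketch of the same idea (brace ranges are a bash-ism; each glob
now expands over a much smaller set, and -f keeps rm quiet for
characters with no matches):

for c in {a..z} {A..Z} {0..9}; do rm -f -- "$c"* ; done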

Or use one of the other suggested ideas.

Jim



From: Kevin Wampler
Subject: Re: Trouble with large directory in Linux
Date: 28 Feb 2012 15:04:29
Message: <4f4d334d$1@news.povray.org>
On 2/28/2012 11:47 AM, Warp wrote:
>    A webpage suggests this:
>
> find <dir> -exec rm {} \;
>
>    (I think that the idea is that this executes the 'rm' command individually
> on each file as it traverses the directory, rather than first gathering all
> the file names into RAM and then trying to do something to them.)

I saw that too, and tried some pretty similar things.  For whatever
reason I couldn't get find to accomplish anything, and every attempt met
the same fate as the other approaches.  Fortunately I remembered that I
had access to a machine with enough RAM to actually run find, and
something like your suggestion did work.  I appreciate you taking the
time to look for an answer though.



From: Kevin Wampler
Subject: Re: Trouble with large directory in Linux
Date: 28 Feb 2012 15:06:38
Message: <4f4d33ce$1@news.povray.org>
> Thanks for your help! I will remember everything you suggested in case
> something like this comes up again.

Also, I still don't know how I could have deleted things without SSHing 
to the bigger machine, so if you have some ideas there I'd still be very 
interested in hearing them.



From: Darren New
Subject: Re: Trouble with large directory in Linux
Date: 28 Feb 2012 21:53:22
Message: <4f4d9322@news.povray.org>
On 2/28/2012 12:04, Kevin Wampler wrote:
> I saw that too, and tried some pretty similar things.

For the future, just try "find <dir>" and see what you get. That should 
print all the file names to the screen without having to actually load them 
into memory. (The shell and ls both try to sort the list of files before 
printing/using them, so that's part of your problem.)
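
ls can also be told not to sort: with -f it lists entries in directory
order, so something like this should start printing names immediately:

ls -f <dir> | head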

Also, be aware that once the directory is empty, it may take several minutes 
to rmdir it, because (depending on the underlying file system) it has to 
read the entire directory before it will ack that the directory is 
sufficiently empty to delete.

-- 
Darren New, San Diego CA, USA (PST)
   People tell me I am the counter-example.



From: Kevin Wampler
Subject: Re: Trouble with large directory in Linux
Date: 28 Feb 2012 22:30:38
Message: <4f4d9bde@news.povray.org>
On 2/28/2012 6:53 PM, Darren New wrote:
> On 2/28/2012 12:04, Kevin Wampler wrote:
>> I saw that too, and tried some pretty similar things.
>
> For the future, just try "find <dir>" and see what you get. That should
> print all the file names to the screen without having to actually load
> them into memory. (The shell and ls both try to sort the list of files
> before printing/using them, so that's part of your problem.)

I did try exactly this, actually; it printed "." and nothing else before
it used up 8GB of RAM and 15.5GB of swap and ground to a halt.  It
worked on the 64GB RAM computer though, which is basically how I managed
to delete the files.  If I ran "find ." in the parent directory, by the
way, it successfully printed files as normal until it got to the
problematic subdirectory, at which point it ground to a halt just as
before.



From: Darren New
Subject: Re: Trouble with large directory in Linux
Date: 2 Mar 2012 00:42:10
Message: <4f505db2$1@news.povray.org>
On 2/28/2012 19:30, Kevin Wampler wrote:
> I did try exactly this, actually; it printed "." and nothing else

Hmmm. There's a flag you can pass to find to turn off the optimization of 
counting the number of links on "." and not recursing afterwards, or 
something. I wonder if that would have helped. I wasn't aware find tried 
loading things into memory, but maybe you wound up with a hard link to 
itself in the directory (other than ".") or some such. I suspect file system 
corruption.
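
If it's the flag I'm thinking of, it's GNU find's -noleaf, which
disables that link-count optimization:

find <dir> -noleaf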


-- 
Darren New, San Diego CA, USA (PST)
   People tell me I am the counter-example.


