POV-Ray : Newsgroups : povray.off-topic : Trouble with large directory in Linux
  Trouble with large directory in Linux (Message 5 to 14 of 14)
From: Jim Henderson
Subject: Re: Trouble with large directory in Linux
Date: 28 Feb 2012 15:01:25
Message: <4f4d3295$1@news.povray.org>
It's probably shell expansion of the wildcard, as others have indicated.

Break it down into smaller deletions.  For example:

rm a*
rm b*
rm c*

etc.

Or use one of the other suggested ideas.
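
[The per-prefix batching above can be sketched as a loop; this is a hedged illustration using a scratch directory, not the poster's actual path. The `getconf ARG_MAX` line shows the kernel limit that a single huge wildcard expansion runs into:]

```shell
# The limit that one big "rm *" expansion can hit ("Argument list
# too long"); shell memory pressure can bite even earlier:
getconf ARG_MAX

# Scratch directory standing in for the huge one (hypothetical):
dir=$(mktemp -d)
touch "$dir/a1" "$dir/a2" "$dir/b1" "$dir/c1"

# Delete in per-prefix batches so each expansion stays small;
# rm -f stays silent for prefixes that match nothing:
for c in a b c d e f g h i j k l m n o p q r s t u v w x y z; do
    rm -f "$dir/$c"*
done

ls -A "$dir"    # prints nothing: all files removed
rmdir "$dir"
```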

Jim



From: Kevin Wampler
Subject: Re: Trouble with large directory in Linux
Date: 28 Feb 2012 15:04:29
Message: <4f4d334d$1@news.povray.org>
On 2/28/2012 11:47 AM, Warp wrote:
>    A webpage suggests this:
>
> find <dir> -exec rm {} \;
>
>    (I think that the idea is that this executes the 'rm' command individually
> on each file as it traverses the directory, rather than first gathering all
> the file names into RAM and then trying to do something to them.)

I saw that too, and tried some pretty similar things. For whatever 
reason I wasn't able to get find to do anything successful, and all 
attempts met the same fate as other approaches. Fortunately I 
remembered that I had access to a machine with enough RAM to actually 
run find, and something like your suggestion did work. I appreciate you 
taking the time to look for an answer though.
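
[For reference, GNU find offers several ways to do this kind of deletion without the shell ever building the full name list; a sketch against a scratch directory (the directory name is a placeholder, not from the thread):]

```shell
dir=$(mktemp -d)            # stand-in for the problem directory
touch "$dir/f1" "$dir/f2"

# One rm per file, as in the quoted suggestion: slow (a process
# per file) but it never gathers all the names at once:
find "$dir" -type f -exec rm {} \;

# Batched variant: '+' packs as many names as fit into each rm call:
touch "$dir/g1" "$dir/g2"
find "$dir" -type f -exec rm {} +

# GNU find can also delete directly, spawning no child processes:
touch "$dir/h1"
find "$dir" -type f -delete

ls -A "$dir"    # prints nothing
rmdir "$dir"
```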



From: Kevin Wampler
Subject: Re: Trouble with large directory in Linux
Date: 28 Feb 2012 15:06:38
Message: <4f4d33ce$1@news.povray.org>
> Thanks for your help! I will remember everything you suggested in case
> something like this comes up again.

Also, I still don't know how I could have deleted things without SSHing 
to the bigger machine, so if you have some ideas there I'd still be very 
interested in hearing them.



From: Darren New
Subject: Re: Trouble with large directory in Linux
Date: 28 Feb 2012 21:53:22
Message: <4f4d9322@news.povray.org>
On 2/28/2012 12:04, Kevin Wampler wrote:
> I saw that too, and tried some pretty similar things.

For the future, just try "find <dir>" and see what you get. That should 
print all the file names to the screen without having to actually load them 
into memory. (The shell and ls both try to sort the list of files before 
printing/using them, so that's part of your problem.)

Also, be aware that once the directory is empty, it may take several minutes 
to rmdir it, because (depending on the underlying file system) it has to 
read the entire directory before it will ack that the directory is 
sufficiently empty to delete.
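
[The streaming behaviour described here is easy to check: since find emits each name as it reads the directory, piping through head shows the first entries immediately, with nothing buffered or sorted first. A sketch with a scratch directory:]

```shell
dir=$(mktemp -d)            # stand-in directory
touch "$dir/one" "$dir/two" "$dir/three"

# Names appear as they are read; head cuts the stream off early,
# so even an enormous directory shows its first entries at once:
find "$dir" -print | head -n 5

# Removing the directory once empty (the step that can itself be
# slow on a formerly huge directory):
rm -f "$dir"/*
rmdir "$dir"
```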

-- 
Darren New, San Diego CA, USA (PST)
   People tell me I am the counter-example.



From: Kevin Wampler
Subject: Re: Trouble with large directory in Linux
Date: 28 Feb 2012 22:30:38
Message: <4f4d9bde@news.povray.org>
On 2/28/2012 6:53 PM, Darren New wrote:
> On 2/28/2012 12:04, Kevin Wampler wrote:
>> I saw that too, and tried some pretty similar things.
>
> For the future, just try "find <dir>" and see what you get. That should
> print all the file names to the screen without having to actually load
> them into memory. (The shell and ls both try to sort the list of files
> before printing/using them, so that's part of your problem.)

I did try exactly this, actually; it printed "." and nothing else before 
it used up 8GB of RAM and 15.5GB of swap and ground to a halt. It worked 
on the 64GB RAM computer though, which is basically how I managed to 
delete the files. If I did "find ." in the superdirectory, by the way, 
it successfully printed files as normal until it got to the problematic 
subdirectory, at which point it ground to a halt just as before.



From: Darren New
Subject: Re: Trouble with large directory in Linux
Date: 2 Mar 2012 00:42:10
Message: <4f505db2$1@news.povray.org>
On 2/28/2012 19:30, Kevin Wampler wrote:
> I did try exactly this actually, it printed "." and nothing else

Hmmm. There's a flag you can pass to find to turn off the optimization of 
counting the number of links on "." and not recursing afterwards, or 
something. I wonder if that would have helped. I wasn't aware find tried 
loading things into memory, but maybe you wound up with a hard link to 
itself in the directory (other than ".") or some such. I suspect file system 
corruption.
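
[The flag being half-remembered here is presumably GNU find's -noleaf (an assumption; the post doesn't name it). It disables the shortcut of trusting a directory's hard-link count to predict how many subdirectories remain, which is precisely the count that a stray hard link or corruption would throw off. A sketch:]

```shell
dir=$(mktemp -d)            # stand-in path
mkdir "$dir/sub"

# Plain GNU find may stop scanning a directory early once the link
# count of "." suggests all subdirectories have been seen; -noleaf
# makes it read every entry regardless of the link count:
find "$dir" -noleaf -print

rmdir "$dir/sub" "$dir"
```

[Whether this would actually have helped on the problem directory is speculation, matching the uncertainty in the post.]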


-- 
Darren New, San Diego CA, USA (PST)
   People tell me I am the counter-example.



From: Le Forgeron
Subject: Re: Trouble with large directory in Linux
Date: 2 Mar 2012 13:42:34
Message: <4f51149a$1@news.povray.org>
On 02/03/2012 06:42, Darren New wrote:
> On 2/28/2012 19:30, Kevin Wampler wrote:
>> I did try exactly this actually, it printed "." and nothing else
> 
> Hmmm. There's a flag you can pass to find to turn off the optimization
> of counting the number of links on "." and not recursing afterwards, or
> something. I wonder if that would have helped. I wasn't aware find tried
> loading things into memory, but maybe you wound up with a hard link to
> itself in the directory (other than ".") or some such. I suspect file
> system corruption.
> 
> 
Thinking about it some more: by default, ls will try to sort the
filenames in alphabetical order... find the switch to disable that!
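
[The switch in question is presumably GNU ls's -U (or -f) — an assumption, since the post leaves it as an exercise. Both list entries in directory order without building and sorting the full list first. A sketch:]

```shell
dir=$(mktemp -d)
touch "$dir/c" "$dir/a" "$dir/b"

ls "$dir"       # default: buffers everything, prints sorted a, b, c
ls -U "$dir"    # -U: no sort, entries stream out in directory order
ls -f "$dir"    # -f: like -aU, also shows . and .., sorting disabled

rm "$dir/a" "$dir/b" "$dir/c"
rmdir "$dir"
```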



From: Kevin Wampler
Subject: Re: Trouble with large directory in Linux
Date: 2 Mar 2012 22:10:40
Message: <4f518bb0$1@news.povray.org>
On 3/2/2012 10:42 AM, Le_Forgeron wrote:
>>
> Thinking about it some more: by default, ls will try to sort the
> filenames in alphabetical order... find the switch to disable that!
>

This is an excellent point, although I did at least check that find 
wasn't trying to sort things when I ran it (it wasn't).  Totally forgot 
that ls would do that though.



From: Kevin Wampler
Subject: Re: Trouble with large directory in Linux
Date: 2 Mar 2012 22:12:57
Message: <4f518c39$1@news.povray.org>
On 3/1/2012 9:42 PM, Darren New wrote:
> On 2/28/2012 19:30, Kevin Wampler wrote:
>> I did try exactly this actually, it printed "." and nothing else
>
> I suspect file system corruption.

This has been a worry of mine too, although everything else seems to be 
fine, and deleting the files did eventually work. I'm also starting to 
wonder if there's something with the particular Linux distro I was 
using, since the more I think about it the more I feel that it ran a 
little *too* well on the big machine. I'll poke around a bit more when I 
get back from vacation and see if I can figure out anything else.



From: Darren New
Subject: Re: Trouble with large directory in Linux
Date: 2 Mar 2012 23:50:24
Message: <4f51a310$1@news.povray.org>
On 3/2/2012 10:42, Le_Forgeron wrote:
> I'm more thinking about it: by default, it (ls) will try to sort the
> filenames in alphabetical order... find the switch to disable that!

That's why I suggested just "find . -print". "find" doesn't try to sort the 
output. (glob sorts the output also.)
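
[The contrast is easy to see side by side; a sketch with a scratch directory:]

```shell
dir=$(mktemp -d)
touch "$dir/z" "$dir/a" "$dir/m"

# Glob: the shell collects *all* matching names and sorts them
# before the command even runs:
( cd "$dir" && printf '%s\n' * )    # a, m, z

# find: names stream out in raw directory order, unsorted:
find "$dir" -mindepth 1 -print

rm "$dir"/*
rmdir "$dir"
```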

-- 
Darren New, San Diego CA, USA (PST)
   People tell me I am the counter-example.




Copyright 2003-2023 Persistence of Vision Raytracer Pty. Ltd.