Chambers wrote:
> So what you're really saying, is that we need a modern implementation of
> traditional Unix style tools, distributed as a package. It wouldn't
> need a ton of that backwards-compatibility stuff, because all of the
> included tools are fresh implementations that we know work together.
What I'm saying is that "Unix" isn't a single coherent design. It's
50,000 random people all doing their own separate thing, and expecting
the result to actually function. Which, almost unbelievably, it does.
But *damn* is it messy...
--
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*
Orchid XP v7 <voi### [at] devnull> wrote:
> What I'm saying is that "Unix" isn't a single coherent design. It's
> 50,000 random people all doing their own separate thing, and expecting
> the result to actually function. Which, almost unbelievably, it does.
> But *damn* is it messy...
I really fail to see how that is at all different from Windows. ;)
--
- Warp
On Sat, 08 Mar 2008 10:46:57 -0800, Chambers wrote:
> Nicolas Alvarez wrote:
>> Tim Cook wrote:
>>> Heck, I use version 7...even 8 was too bloatware for my taste.
>>
>> http://stuff.povaddict.com.ar/psp5.png
>>
>> I tried 8 or 9 and I was impressed at the amount of bloat.
>
> I love the Unix philosophy of making specific, lightweight tools that
> interoperate, rather than gargantuan monolithic beasts that do
> everything rather poorly.
Ditto. When I do data manipulation now in oocalc, I often parse the data
first by piping it through a series of awk scripts and grep filters to
get the data I'm interested in, *then* import it.
Makes it easier to aggregate the stuff I look at.
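A pipeline in that spirit might look like the following sketch (the log
file, its contents, and the column layout are invented for illustration;
assumes standard grep and awk):

```shell
# Invented sample data: "date host method latency" records.
printf '%s\n' \
  '2008-03-08 web01 GET 12ms' \
  '2008-03-08 web02 GET 40ms' \
  '2008-03-09 web01 GET 15ms' > sample.log

# Keep only the web01 rows, then emit "date,latency" CSV ready to import.
grep 'web01' sample.log | awk '{ print $1 "," $4 }' > filtered.csv
cat filtered.csv
```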
Jim
On Sat, 08 Mar 2008 19:16:28 +0000, Orchid XP v7 wrote:
> ["This makes it read from a file. Unless the file is named '-', in which
> case it reads from stdin." OK, so how do I make it read from a file
> that's actually named '-' then? And other surprising ad-hoc
> behaviours...]
(a) you don't name a file '-' because, by convention, it's got a special
meaning to most Unix tools (it stands for stdin/stdout), or
(b) you escape the filename.
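For what it's worth, escaping alone doesn't help with '-': it's the
individual tools, not the shell, that treat a leading '-' specially. The
usual tricks are a './' prefix or the '--' end-of-options marker (a small
sketch, assuming POSIX-ish utilities):

```shell
cd "$(mktemp -d)"    # scratch directory for the demo
echo hello > ./-     # create a file literally named "-"
cat ./-              # ./ prefix: cat no longer sees a bare "-" (stdin)
rm -- -              # "--" tells rm that what follows is not an option
ls                   # directory is empty again
```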
Jim
On Sat, 08 Mar 2008 11:54:07 -0800, Darren New wrote:
> Here's a contest. Given the directory /tmp/stuff, delete all the files
> in that directory that end with ".tmp".
cd /tmp/stuff; find -type f | grep tmp$ | awk '{system("rm \"" $0 "\"")}'
Jim
Jim Henderson <nos### [at] nospamcom> wrote:
> cd /tmp/stuff; find -type f | grep tmp$ | awk '{system("rm \"" $0 "\"")}'
Why does it need to be so complicated?
find /tmp/stuff -name "*.tmp" -exec rm {} \;
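On GNU find the same job can skip spawning /bin/rm entirely; a sketch
(note that -maxdepth and -delete are GNU extensions, and -maxdepth 1
keeps it to the one directory as in the original puzzle):

```shell
# Delete every regular file ending in ".tmp" directly under /tmp/stuff,
# letting find unlink them itself (no rm process per file).
# POSIX-portable alternative, batching many names per rm invocation:
#   find /tmp/stuff -name '*.tmp' -exec rm -- {} +
find /tmp/stuff -maxdepth 1 -type f -name '*.tmp' -delete
```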
--
- Warp
Jim Henderson wrote:
> On Sat, 08 Mar 2008 11:54:07 -0800, Darren New wrote:
>
>> Here's a contest. Given the directory /tmp/stuff, delete all the files
>> in that directory that end with ".tmp".
>
> cd /tmp/stuff; find -type f | grep tmp$ | awk '{system("rm \"" $0 "\"")}'
Great. Now try it on a directory with the following names in it:
-rf\n.tmp (where the \n means newline, of course)
<.tmp
.xyz.tmp
hip"hop.tmp
hop'hip.tmp
this.tmp;that.tmp
this\bthat.tmp (where the \b means backspace)
this\bthat.tmp (where the \b is two characters, backslash and b)
Two for zero... ;-)
Altho I expect "find . -name '*.tmp' -exec rm {} \;" might work, but
only because you know the name ends in '.tmp'. You might be able to
manage '-exec rm ./{}' and make it work too. I haven't tested that. I do
know it'll be an order of magnitude slower due to all the invocations of
/bin/rm.
For a real fun time, try "delete all the files in the directory without
deleting the directory". Or "copy them to the /tmp/keep directory"
(which, as far as I can tell, xargs doesn't help with).
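With a GNU userland, the NUL-separated route does cover the copy case
too: find's -print0 feeds xargs -0, and GNU cp's -t flag names the
destination up front so xargs can append the sources. A sketch
(-print0, -0, and cp -t are GNU/BSD extensions, not POSIX):

```shell
mkdir -p /tmp/keep
# NUL separators survive newlines, quotes, backslashes, and leading
# dashes in file names; "cp -t DIR" puts the destination first so
# xargs can tack the source files onto the end of the command line.
find /tmp/stuff -maxdepth 1 -type f -name '*.tmp' -print0 \
  | xargs -0 cp -t /tmp/keep
```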
(I work with big directories, and big files. Often I'll have a program
that runs 3 hours, then takes half an hour to delete its input files. Or
individual files that take 2 minutes of disk I/O to delete. Or empty
directories that take several minutes to delete because they used to
have six or seven million files in 'em. And of course, the machine
becomes completely unusable during any sort of operation like that.)
--
Darren New / San Diego, CA, USA (PST)
"That's pretty. Where's that?"
"It's the Age of Channelwood."
"We should go there on vacation some time."
Warp wrote:
> Why does it need to be so complicated?
> find /tmp/stuff -name "*.tmp" -exec rm {} \;
> It *does* work, assuming you have a sufficiently modern shell (and not
> a 20 years old sh). Modern shells are rather smart at escaping what needs
> to be escaped when you write "*".
In other words, if you completely bypass the shell, it can be made easy
to work. Otherwise you get things like "no such user John.tmp" when you
try to delete "~John.tmp". And if you want to delete all the files in
the directory, you have to add more escaping. And if you want to scp
them to a different server, you really don't want to be typing the
password for each one, so you either have to use xargs (with the -0
switch, which is why that's there!) or manage to tar them up somehow,
which has the same problems.
Anyway, my point was that it's kind of unobvious to get right, not that
it couldn't be done. I generally just break out Tcl for such a task
(since Tcl doesn't reparse things repeatedly without you asking
explicitly), as it's easier than trying to come up with the right list
of flags to the various programs.
Yet every time I mention the kinds of problems it causes, the UNIX
dweebs *I* know will fight to the death their need to be able to put
backspaces and vertical tab characters into file names. You might as
well try to convince a FORTH programmer that it isn't necessary to have
the ability to start the name of a function with a closing parenthesis.
</rant> :-)
Having the shell doing your expansion has caught me other times, too...
% cat a b c >d
Can't create d: permission denied
% sudo !!
Can't create d: permission denied
Of course, the real line was much longer, so a simple "su" meant
retyping the line (or using copy-and-paste to retype it for you) instead
of just !! or up-arrow. :-)
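The second "permission denied" there isn't sudo's doing: the ">d"
redirection is performed by the calling, unprivileged shell before sudo
even starts, so the open of "d" fails exactly as before. Two common
workarounds, sketched (assuming a sudo setup that permits them):

```shell
# Let a root shell perform the redirection itself:
sudo sh -c 'cat a b c > d'

# Or keep cat unprivileged and have a root tee do the writing:
cat a b c | sudo tee d > /dev/null
```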
However, I'm glad Windows finally forced people writing Unix shells to
start dealing with funky file names as a regular occurrence. ;-) I
noticed that the shells started learning to do proper quoting (like for
command-line completion) right around the time SAMBA shares with
MSWindows servers got popular. :-)
--
Darren New / San Diego, CA, USA (PST)
"That's pretty. Where's that?"
"It's the Age of Channelwood."
"We should go there on vacation some time."
Darren New <dne### [at] sanrrcom> wrote:
> In other words, if you completely bypass the shell, it can be made easy
> to work. Otherwise you get things like "no such user John.tmp" when you
> try to delete "~John.tmp".
Only if you write "~John.tmp" by hand instead of letting the shell do
the proper expansion.
> And if you want to delete all the files in
> the directory, you have to add more escaping.
Well, if "rm *" works (ie. there aren't too many files), you don't.
If there are too many files, I suppose you'll have to use the 'find'
trick.
> And if you want to scp
> them to a different server, you really don't want to be typing the
> password for each one
If you do a "scp *" it won't ask the password for each one.
>, so you either have to use xargs (with the -0
> switch, which is why that's there!) or manage to tar them up somehow,
> which has the same problems.
What's the problem with tarring them? "tar -cvf files.tar ."
> Anyway, my point was that it's kind of unobvious to get right
Granted, the only problem I have seen so far is when there are way
too many files in the directory, in which case "*" expand to too many.
However, I haven't seen any problems with special characters so far.
> Yet every time I mention the kinds of problems it causes, the UNIX
> dweebs *I* know will fight to the death their need to be able to put
> backspaces and vertical tab characters into file names.
The only valid point you have presented is the too-many-files argument,
which can be a real problem for the * expansion. So far I haven't seen
any real problem with special characters.
> Having the shell doing your expansion has caught me other times, too...
> % cat a b c >d
> Can't create d: permission denied
> % sudo !!
> Can't create d: permission denied
> Of course, the real line was much longer, so a simple "su" meant
> retyping the line (or using copy-and-paste to retype it for you) instead
> of just !! or up-arrow. :-)
What's wrong with getting the command from the command history with
the up cursor, going to the beginning of the line with ctrl-a and then
adding the "sudo" there?
> However, I'm glad Windows finally forced people writing Unix shells to
> start dealing with funky file names as a regular occurrence. ;-) I
> noticed that the shells started learning to do proper quoting (like for
> command-line completion) right around the time SAMBA shares with
> MSWindows servers got popular. :-)
Many shell scripts still fail to use the basic "$@" for the correct
expansion of all the parameters (and instead use $*, which is wrong).
Not all unix geeks know all the right tools either.
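The difference only bites once a parameter contains whitespace; a
minimal demonstration (sh/bash):

```shell
count() { echo $#; }                  # print how many arguments we got

set -- "one arg" "two words here"     # two positional parameters

count "$@"    # "$@" keeps each parameter intact    -> prints 2
count $*      # unquoted $* re-splits on whitespace -> prints 5
```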
--
- Warp
Darren New <dne### [at] sanrrcom> wrote:
> (I work with big directories, and big files. Often I'll have a program
> that runs 3 hours, then takes half an hour to delete its input files. Or
> individual files that take 2 minutes of disk I/O to delete. Or empty
> directories that take several minutes to delete because they used to
> have six or seven million files in 'em. And of course, the machine
> becomes completely unusable during any sort of operation like that.)
Isn't that a clear sign that a redesign of your programs is in order? ;-)
--
- Warp