POV-Ray : Newsgroups : povray.off-topic : Questionable optimizations
Questionable optimizations (Message 15 to 24 of 44)
From: Doctor John
Subject: Re: Questionable optimizations
Date: 19 Jul 2009 08:45:18
Message: <4a63155e@news.povray.org>
Darren New wrote:
> Doctor John wrote:
>> At a guess that would be:
>> http://www.theregister.co.uk/2009/07/17/linux_kernel_exploit/
> 
> Yes, but since that wasn't the point of my question and people might
> reasonably assume I'm bashing on Linux if I posted a link to it, I
> thought I'd omit it.  The question was why the compiler felt the need
> for *that* optimization in the first place. When would it ever be a good
> idea?
> 

Actually, I was going to ask the group the same question but you just
beat me to it :-) and you could hardly assume that I was in any way
anti-Linux.
What bothers me (apart from the unwanted optimisation) is why Torvalds
et al have chosen to remain silent on this.
Another thing to remember is that it is not only the kernel code that is
the problem here, it's also gcc - so comments from the Stallman camp
would also be appropriate.

John
-- 
"Eppur si muove" - Galileo Galilei



From: clipka
Subject: Re: Questionable optimizations
Date: 19 Jul 2009 09:20:00
Message: <web.4a631cc82c54829feecd81460@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> But I'm trying to figure out why you would add optimizations
> specifically to improve the performance of code you are optimizing *because*
> you know it is flawed.

No, optimizers don't optimize code *because* it's flawed, just maybe *although*
it's flawed.

Or, possibly more particularly fitting the intention in this case: Although it
is *redundant*.

After all, that's a crucial part of optimization: it allows programmers to
write source code that is more self-descriptive and more self-defensive, yet
runs at the speed of hand-optimized code.

For instance, one might write:

    if (i <= LOWER_BOUND) {
        return ERR_TOO_LOW;
    }
    // very long code here
    if (i > LOWER_BOUND && i <= UPPER_BOUND) {
        // (1)
        return OK;
    }

The test for i>LOWER_BOUND is, of course, perfectly redundant. But explicitly
stating it may help...

(a) to make it easier to see for the casual reader that at (1), i>LOWER_BOUND is
always true; after all, they may not have read the part before the very long
code, or may have forgotten about the i<=LOWER_BOUND test by then; and

(b) to make sure that at (1), i>LOWER_BOUND is true even when someone tampers
with the remainder of the code and happens to remove the first check.


> Right.  I think the comments on the report of it that I read implied that
> the assignment was added later, and the problem is that in C, the
> declarations all had to be at the top.
>
> So someone wrote
>     blah * xyz;
>     if (!xyz) return failure;
>     ... use xyz ...
>
> Someone else came along and added a declaration
>     blah * xyz;
>     another pdq = xyz->something;
>     if (!xyz) ....

That may indeed explain how the code came to be there in the first place.

> Instead, they should have said
>     blah * xyz;
>     another pdq;
>     if (!xyz) ...
>     pdq = xyz->something;

Yesss. Absolutely.

I must confess that I used to do it otherwise, too. It was PC-lint that taught
me not to, by constantly pestering me about it. And developing embedded
software for the automotive industry, I had to obey it: MISRA rules forbid the
use of *any* language construct that officially leads to undefined or
unspecified behavior.


> Agreed. The problem is that the compiler optimized out good code based on
> bogus code. My only question is whether I'm missing something here, because
> such an optimization seems really a bad idea.

Optimizers aren't designed to detect bogus code - they're designed to speed up
things.

Even compilers mostly do a pretty poor job of detecting bogus code.

That's why you need static code-analysis tools, specifically designed to
identify such bogosities.


> > There's even some reason to argue that if the programmer happily dereferences a
> > pointer without checking it for NULL, why shouldn't the compiler assume that
> > other provisions have been made that it cannot be NULL in the first place?
>
> That's exactly what the compiler did. It looked, saw the dereference of the
> pointer, then the check that the pointer isn't null, and optimized out the
> check for the pointer being null. But the fact that the programmer said to
> check for NULL at a point where all code paths have already dereferenced the
> pointer would seem to be at least a warning, not an "oh good, here's an
> optimization I can apply."

As I already pointed out above, it may also have been code that the developer
left in there just for clarity, to be removed later, or whatever, *expecting*
the compiler to optimize it away.


> I'm surprised and dismayed when the kernel gets code checked in that's
> syntactically invalid,

At this point, in a good commercial project the developer would already get his
head chopped off - by colleagues who *do* their homework and therefore hit this
stumbling block when trying to compile their own changes to do their own module
testing (and who, being good developers, first suspected themselves of having
done something wrong, and spent hours tracking down the problem, until
ultimately identifying their colleague as the culprit and being *not* amused).

Checking in code you never actually compiled yourself? Hey, haven't done our
homework, have we?!?

> then released,

... and at this point it would be the build manager's head to roll if it's a
single-platform project or the particular code applies to all projects - or the
test team leader's head if it's intended to be a portable thing and the code is
activated only on certain target platforms (unless of course the code is a fix
for an exotic platform that isn't available in-house).

In free-software projects with all contributors being volunteers, unfortunately
there's no authority to chop off some heads. After all, if you treat the
volunteers too harshly, off they go to someplace more relaxed.



From: clipka
Subject: Re: Questionable optimizations
Date: 19 Jul 2009 09:50:00
Message: <web.4a6324022c54829feecd81460@news.povray.org>
Doctor John <joh### [at] homecom> wrote:
> What bothers me (apart from the unwanted optimisation) is why Torvalds
> et al have chosen to remain silent on this.

What? And lose ground on their "safest operating system on earth" territory?

I did believe that, too, but hearing this news and seeing how the "top hats"
chose to deal with the issue, I guess it's time to start thinking outside the
box once again: the Linux developer community, too, wants to "sell" its
products in a way - both its kernel and its FSF ideals.

Now, after always claiming that free software is the superior development
approach, they prove that some of its drawbacks are potentially more serious
than they would like them to be.



From: Warp
Subject: Re: Questionable optimizations
Date: 19 Jul 2009 10:11:48
Message: <4a6329a4@news.povray.org>
clipka <nomail@nomail> wrote:

> Doctor John <joh### [at] homecom> wrote:
> > What bothers me (apart from the unwanted optimisation) is why Torvalds
> > et al have chosen to remain silent on this.

> What? And lose ground on their "safest operating system on earth" territory?

> I did believe that, too, but hearing this news and seeing how the "top hats"
> chose to deal with the issue, I guess it's time to start thinking outside the
> box once again: the Linux developer community, too, wants to "sell" its
> products in a way - both its kernel and its FSF ideals.

> Now, after always claiming that free software is the superior development
> approach, they prove that some of its drawbacks are potentially more serious
> than they would like them to be.

  Aren't you exaggerating a bit?

  Do you know how many serious security holes are found each year in Linux?
I think the number is in the *hundreds*. A few of these bugs each year provide
a way to get root privileges by running a program exploiting the bug. It has
happened dozens of times in the past, and it will happen in the future. This
is just another one of those.

  (And before anyone says anything, no, Windows is not better. Windows is
year after year always at the top of the list of most security flaws found
during the year.)

  You *can't* expect the OS to be completely safe.

-- 
                                                          - Warp



From: Darren New
Subject: Re: Questionable optimizations
Date: 19 Jul 2009 11:32:42
Message: <4a633c9a$1@news.povray.org>
Warp wrote:
>   Do you know how many serious security holes are found each year in Linux?

And I don't think this one is even in a kernel that was released into major 
distributions, for that matter. :-)

>   (And before anyone says anything, no, Windows is not better. Windows is
> year after year always at the top of the list of most security flaws found
> during the year.)

Is this still true?  I think there's more flaky 3rd-party code, but not 
necessarily stuff that comes with Windows. Certainly not as much as it used 
to be.

-- 
   Darren New, San Diego CA, USA (PST)
   "We'd like you to back-port all the changes in 2.0
    back to version 1.0."
   "We've done that already. We call it 2.0."



From: Darren New
Subject: Re: Questionable optimizations
Date: 19 Jul 2009 11:40:39
Message: <4a633e77$1@news.povray.org>
clipka wrote:
> Optimizers aren't designed to detect bogus code - they're designed to speed up
> things.

Sure, but it sounded like an optimization that only works on bogus code, 
which seemed like a waste of the optimizer writer's time.

Macros make a good explanation why, tho.

> As I already pointed out above, it may also have been code that the developer
> left in there just for clarity, to be removed later, or whatever, *expecting*
> the compiler to optimize it away.

I think this is different. Your cases, sure.

> At this point, in a good commercial project the developer would already get his
> head chopped off

Well, to be fair, they all know it's bogus. It's been at 0.9.x for like five 
years. :-) I'm always amused at open source folks who won't recognise that 
the first thing they give to the public is 1.0 regardless of how you number it.

> Checking in code you never actually compiled yourself? Hey, haven't done our
> homework, have we?!?

Crap like this would be completely untenable before Google, really.

> test team leader's head if it's intended to be a portable thing and the code is
> activated only on certain target platforms (unless of course the code is a fix
> for an exotic platform that isn't available in-house).

This happened to be some mips-specific assembly. Not exactly exotic, but 
then why are you changing that file if you don't have a mips chip to test it 
on in the first place?

-- 
   Darren New, San Diego CA, USA (PST)
   "We'd like you to back-port all the changes in 2.0
    back to version 1.0."
   "We've done that already. We call it 2.0."



From: Warp
Subject: Re: Questionable optimizations
Date: 19 Jul 2009 11:56:16
Message: <4a634220@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> >   (And before anyone says anything, no, Windows is not better. Windows is
> > year after year always at the top of the list of most security flaws found
> > during the year.)

> Is this still true?  I think there's more 3rd-party flakey code, but not 
> stuff that comes with Windows necessarily. Certainly not as much as it used 
> to be.

  I don't know. I haven't been reading such reports lately. One would think,
though, that new OS = new bugs (unless Vista isn't really "new").

  But I still think it's fair to say that Linux is safer than Windows. Why?
For the simple reason that Linux is not such a popular *target* for attacks
and malware as Windows is. For example, basically 100% of email virii and
http exploits have targeted Windows. I'd bet at least 99.99% of more
traditional virii out there work only on Windows (and the older ones in
DOS). Almost 100% of malware (spyware, adware, trojans, rootkits...) target
Windows. While I have no idea how popular Windows as a target is among
crackers, I bet it's well over half of them for the simple reason that
Windows is way more widespread than Linux. (The only place where other
systems might rival Windows as a target of crackers is in the web servers
and other such servers, because there other systems are more popular than
in desktop computers.)

  Of course, Linux is in no way safe from attacks either. In fact, the Linux
box of a good friend of mine got hijacked by a hacker some years ago (and it
remained so for a good while - several weeks or even months, I think - without
my friend noticing). However, usually these cases are direct attacks by
individual hackers, rather than a massive attack by a self-spreading program.
You are much less likely to get your computer hacked by someone directly than
by a self-spreading program (assuming you are running an OS supported by that
program). While I can't be sure that my Linux box hasn't been hijacked without
me noticing, I'd say it's pretty unlikely.

-- 
                                                          - Warp



From: Darren New
Subject: Re: Questionable optimizations
Date: 19 Jul 2009 13:29:43
Message: <4a635807$1@news.povray.org>
Warp wrote:
>   I don't know. I haven't been reading such reports lately. 

Same here.

> One would think,
> though, that new OS = new bugs (unless Vista isn't really "new").

Well, it's not really new. It's built on NT, 2000, 2003, etc. And it has a 
whole bunch of new security stuff added (Defender, UAC, secure desktops (or 
whatever they call the thing that prevents the shatter attack), etc.).

I'd have to start taking the SANS stuff out of the junk folder, but my 
memory is that most of the attacks are against web-based services, with 
root access on either OS very rare, and most of those being privilege 
escalations. Maybe one attack every week or two on either kernel from 
outside.

>   But I still think it's fair to say that Linux is safer than Windows. Why?

I agree. It used to be Windows had about 2x the code and 2x the attacks found.

> For the simple reason that Linux is not such a popular *target* for attacks
> and malware as Windows is.

Yeah. UNIX had way, way more exploits back before Windows was a popular 
internet presence. Every week there would be reports of credit card lists 
being stolen from ISPs, viruses attacking sendmail, etc. Then it was 
Netscape's servers. *Then* it was Windows, around the '98 timeframe.

And before that, floppy boot-sector viruses ran rampant amongst 
microcomputer OSes, but I never heard of any of those being done for money.

> email virii

I saw a great rant from someone who actually knows Latin about "virii". 
"Virii" is apparently the plural of some completely unrelated Latin word, 
like "voice" or "people" or something.  "Virus" is apparently already a mass 
noun, not unlike "stuff".

> I bet it's well over half of them for the simple reason that
> Windows is way more widespread than Linux.

I think it's three things. 1 - There's more Windows desktops. 2 - People 
with Linux desktops tend to be more clued and/or have someone administering 
them that is clued. 3 - I suspect that most Linux machines are actually 
servers that are quite locked down and rarely doing anything not 
specifically planned.

I've run Windows servers whose only job was running the database, or answering 
calls from credit card terminals, etc., and it was trivial to lock them down 
enough that you didn't have to worry much about exploits. It's when you 
actually put someone in front of the keyboard who will surf web pages, run 
stuff people email to them, and so on, that you get lots of infections.

Of course there are some break-ins, but I understand that most of them are 
patched before they're actually exploited, and it's the people who didn't 
patch that are the problem.

> (The only place where other
> systems might rival Windows as a target of crackers is in the web servers
> and other such servers, because there other systems are more popular than
> in desktop computers.)

That, and web servers have value in and of themselves. You can either 
disrupt a rival, steal credit cards, etc.  Hacking an individual desktop 
machine only gives you what's on that machine (i.e., one person's financial 
data) or one node of a bot-net.

> However, usually these cases are direct
> attacks by individual hackers, rather than being a massive attack by a
> self-spreading program.

Now, yes. It used to be way more common for even proprietary UNIXes to get 
hit by viruses and worms than it is now for Windows, methinks.

-- 
   Darren New, San Diego CA, USA (PST)
   "We'd like you to back-port all the changes in 2.0
    back to version 1.0."
   "We've done that already. We call it 2.0."



From: clipka
Subject: Re: Questionable optimizations
Date: 19 Jul 2009 13:40:01
Message: <web.4a6358d72c54829feecd81460@news.povray.org>
Warp <war### [at] tagpovrayorg> wrote:
>   (And before anyone says anything, no, Windows is not better. Windows is
> year after year always at the top of the list of most security flaws found
> during the year.)

True, but the superiority of Linux crumbles in my eyes if the responsible
people brush aside security holes that easily. I had always expected the Linux
community to have a basic mentality of "oops, right, our mistake; we'll fix
that of course", or at least "oops, right, this is problematic; we'll work
around that of course". But now they show that they, too, are more like "well,
that's not our fault; we won't fix it".

And knowing (through obvious proof) that the Linux kernel code isn't checked
with professional tools (or rather, probably is, but the results seem to be
brushed off and not taken seriously unless proven to be exploitable) doesn't
convince me of Linux' alleged superior security either.

I'm not saying "they're worse than Microsoft" - all I'm saying is "they're no
better".

Which is to say, "they're worse than what they claim and are perceived to
be". And as we all know, overestimating a system's security is a bad thing. If
you run two systems which are equally secure from a technical point of view but
one is perceived as more secure, that one will actually pose the higher security
threat.

(And what I'm also saying is that I think the commercial approach is
*potentially* better suited to produce secure software.)



From: clipka
Subject: Re: Questionable optimizations
Date: 19 Jul 2009 14:05:00
Message: <web.4a635ff52c54829feecd81460@news.povray.org>
Warp <war### [at] tagpovrayorg> wrote:
>   But I still think it's fair to say that Linux is safer than Windows. Why?
> For the simple reason that Linux is not such a popular *target* for attacks
> and malware as Windows is. For example, basically 100% of email virii and
> http exploits have targeted Windows. I'd bet at least 99.99% of more
> traditional virii out there work only on Windows (and the older ones in
> DOS). Almost 100% of malware (spyware, adware, trojans, rootkits...) target
> Windows. While I have no idea how popular Windows as a target is among
> crackers, I bet it's well over half of them for the simple reason that
> Windows is way more widespread than Linux. (The only place where other
> systems might rival Windows as a target of crackers is in the web servers
> and other such servers, because there other systems are more popular than
> in desktop computers.)

I dare to disagree - I'd even postulate that Linux poses a *higher* security
risk than Windows.

Why?

Because Windows has its highest popularity on Desktops. Yeah, that makes great
targets, and a great number of them to set up bot networks.

But Linux systems, being the more popular among Web servers and such, are
typically a good deal closer to the infrastructure.

If you can infiltrate the very infrastructure of the web, that makes
infiltrating the end-user computers much easier.

So if some infiltrated Windows systems would be an inflammation, I'd liken some
infiltrated Linux systems to sepsis.

Note that Web servers have already been infiltrated as meta-targets in order to
infiltrate end-user computers; if these attacks become more common and
sophisticated (and I expect they will), I'd care more about a secure Linux
kernel than about a secure Windows kernel.




Copyright 2003-2023 Persistence of Vision Raytracer Pty. Ltd.