that traditional OSes hold back the more sophisticated (as in, far from
machine language) languages.
http://www.artima.com/lejava/articles/azul_pauseless_gc.html
Traditional file system interfaces probably do too. For example, if you
wanted something like a virus scanner that was watching your executables, I
suspect that "scan the file when it gets passed to exec()" is probably a
much more common implementation than "scan each block between the time it is
paged in and the time the code branches to it". (Indeed, I don't know
how you'd even do the latter on Linux or Windows or whatever.)
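A minimal sketch of that coarse-grained hook in Python: scan the whole file once, then hand control to it. The signature set and the `scan`/`checked_exec` names are made up for illustration; real scanners match byte patterns and hook the kernel rather than wrapping exec() in user space.

```python
import hashlib
import os

# Toy signature "database": a single known-bad SHA-256. Purely
# illustrative; real scanners match patterns, not whole-file hashes.
KNOWN_BAD = {hashlib.sha256(b"malicious payload").hexdigest()}

def scan(path):
    """Return True if the file passes the toy signature check."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest() not in KNOWN_BAD

def checked_exec(path, argv):
    """Scan once at exec time -- the common, coarse design. The finer
    alternative (checking each page between page-in and the branch into
    it) has no user-space hook on mainstream kernels."""
    if not scan(path):
        raise PermissionError("refusing to exec %s: signature match" % path)
    os.execv(path, argv)
```

The point being that the scan happens exactly once, at a well-defined boundary, because that's the only boundary the OS exposes.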
It's interesting that this sort of stuff is starting to get to the point
where people will be willing to break with compatibility at some level.
Phones, game consoles, set-top boxes, and eventually probably "enterprise"
or "cloud" type servers will all be willing to consider a different
operating system that puts limits on compatibility with previous languages
and libraries.
--
Darren New, San Diego CA, USA (PST)
Serving Suggestion:
"Don't serve this any more. It's awful."
Darren New <dne### [at] sanrrcom> wrote:
> Traditional file system interfaces probably do too. For example, if you
> wanted something like a virus scanner that was watching your executables, I
> suspect that "scan the file when it gets passed to exec()" is probably a
> much more common implementation than "scan each block between the time it is
> paged in and the time the code branches to it", for example. (Indeed, I
> don't know how you'd even do that latter on Linux or Windows or whatever.)
On the subject of virus scanners in particular, I'd say that the very
need to have such scanners is a symptom of fundamentally bad OS design.
Of course this isn't an original idea of mine, as such an idea has been
expressed numerous times by people more knowledgeable than me (and probably
even linked to in this very newsgroup in the past).
The unix philosophy of OS design has always been a step or two closer
to the safer design (with respect to computer viruses and other malware)
than the typical DOS/Windows design (and that of other similar OSes in the past).
The reason for this is that unixes have always been designed to be
multi-user operating systems while DOS/Windows has been designed to be
a single-user OS with no regard to security. The very need to handle
multiple users automatically brings forth the need for security: You should
not be able to access other users' data without permission, and especially
you shouldn't be able to access the superuser's data without permission.
This causes security to be built into the system from the ground up.
The DOS/Windows design always took basically the exact opposite approach:
Whatever the user wants to run or do, the OS allows. It's not the system's
task to stop the user doing what he wants. Unfortunately it took over 20
years for Microsoft to rid itself of this mentality (for some reason MS
has always been very slow to adopt certain ideas). NT had security, but
it wasn't even intended for normal users. It wasn't until XP that some
*semblance* of security was introduced (yet the mentality
of the regular user being by default the superuser was still there, and
probably 99% of XP users out there still use their machine with superuser
privileges). This made the spreading of viruses and malware *trivial*.
Not that the unix design is perfect, but at least viruses, worms and
other malware have always been, and still are, extremely rare in unix
systems in comparison (basically the only relatively successful worms
in the unix world have exploited bugs in the systems to spread themselves,
rather than relying on the users; fix the bug, and the worm stops; however,
in the single-user OS's it requires a very significantly more radical
change in design than just fixing a few bugs).
--
- Warp
Warp <war### [at] tagpovrayorg> wrote:
> The DOS/Windows design always took basically the exact opposite approach:
> Whatever the user wants to run or do, the OS allows. It's not the system's
> task to stop the user doing what he wants. Unfortunately it took over 20
> years for Microsoft to rid itself of this mentality
I think the quintessential example of this is email viruses.
If in the mid and early 90's you had talked about "email viruses",
especially to unix people, they would have laughed. Well, the very
concept *should* be just as ludicrous today as it was in the early 90's.
Advice like "never open attachments from emails sent by unknown people"
should not have to exist.
However, then came Microsoft. Some people today might not realize this,
but the first email viruses didn't actually exploit any bugs at all. They
just abused Microsoft's software design. Back then Microsoft's ideology
was still the antiquated "whatever the user wants to do, do". This included
"if the user wants to open an email attachment, open it with whatever
handler the system has configured for that file type". And yes, if the
attachment was an executable, simply opening it would execute it.
This was *by design*. It was not a programming error (at
most one could argue it was a design oversight).
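As a hedged sketch (the handler table and function names here are invented for illustration, not Microsoft's actual code), the difference between the two dispatch policies is just where the fall-through goes:

```python
import os

# Hypothetical handler table mapping file extensions to viewers.
VIEWERS = {".txt": "text-viewer", ".jpg": "image-viewer"}

def legacy_open(filename):
    """'Whatever the user wants to do, do': anything without a
    registered viewer falls through to being executed."""
    ext = os.path.splitext(filename)[1].lower()
    if ext in VIEWERS:
        return "view with " + VIEWERS[ext]
    return "execute"   # an .exe attachment runs on open -- by design

def allowlist_open(filename):
    """The defensive design: unknown types are refused, never run."""
    ext = os.path.splitext(filename)[1].lower()
    if ext in VIEWERS:
        return "view with " + VIEWERS[ext]
    return "refuse"
```

Both behave identically for known-safe types; only the default differs, and the default is the whole ballgame.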
As said, Microsoft is, for some reason, quite slow at learning certain
things, and it seems that they still hadn't developed the "what if"
instinct. (In this particular case, a basic question a developer could have
asked would have been "hey, what if someone sends someone else a program
as attachment, and the program deletes everything from the hard drive?"
Even if someone at Microsoft did ask such questions, they were obviously
ignored.)
Of course, when Microsoft later put a few barricades into their email
program to slow this down, the email viruses started exploiting bugs in
the software.
Naturally bugs are something we have to live with, but this *still* shows
the fundamentally wrong mentality in OS design: Even if the email software
is buggy, it should *still* be impossible for a virus to spread by abusing
this bug. The OS should be designed in such way that it's just not possible.
Virus scanners are fighting the symptoms, not the fundamental problem.
--
- Warp
Warp wrote:
> On the subject of virus scanners in particular, I'd say that the very
> need to have such scanners is a symptom of fundamentally bad OS design.
While I agree it would be nice not to need such things, I don't think many
of the common OSes could do without it. Something like Singularity, where you
can't run code that does something "unsafe", where you have to reboot after
installing a new executable, and where every executable declares in the
manifest what system resources it'll need? Sure, much less virus-prone.
> The reason for this is that unixes have always been designed to be
> multi-user operating systems while DOS/Windows has been designed to be
> a single-user OS with no regard to security.
And yet, I got a virus from a web page while running in a non-privileged
account under Vista. (The first malware I myself ever contracted on
any system since 8-bit computers, I'll mention.) Nowadays, malware attacks
individual accounts. You hit a web site, some bug in Firefox lets some piece
of Javascript write some garbage to some hidden directory, and you have
malware. Bazinga.
It's not like UNIX or IBM big iron never had a worm or virus. They just
weren't trying to hide at the time. I daresay if something like the Morris
worm got a toe-hold nowadays, it would be some time before someone noticed it.
> You should
> not be able to access other users' data without permission, and especially
> you shouldn't be able to access the superusers' data without permission.
> This causes security to be built into the system from the ground up.
But this has been the case with NT forever, and since XP, people haven't
been running superuser. The problem is that people bitch about the security,
complain they can't set the clock without typing a password, etc.
> The DOS/Windows design always took basically the exact opposite approach:
> Whatever the user wants to run or do, the OS allows.
That's kind of what happens when you don't have virtual memory mapping.
> This made the spreading of viruses and malware *trivial*.
It's still pretty trivial, methinks. You don't need to be superuser to
spread the kind of malware that spreads these days. People aren't looking to
take down your machine. They're looking to install a firefox extension that
records your bank logins and posts them to a hacker's web site. No admin
privileges needed for that at all.
> rather than relying on the users;
Nah. It's in three parts: (1) unix was and still is used primarily by people
who understand how computers work at least a little; (2) when people broke
into unix systems, they got 50,000 accounts, so they didn't really need to
propagate as much; and (3) the actual problem with trojans (which is where
most malware comes from these days) was solved before networking was ubiquitous.
For example, that's where the whole "NT 3.5 has C2 security" stuff came from:
there was an actual keystroke (C-A-D) that would uninterruptibly ensure
you're talking to a specific program, e.g. the login prompt. UNIX had no
such feature - anything the login program could do, so could a user-level
program, except for the actual logging in part. It's why "." isn't in the
path by default on UNIX (any more). There are, basically, 101 fixes in UNIX
for security that was broken in spite of being multi-user. It's just that
UNIX is old enough and had multi-user *early* enough that *those* kind of
fixes got put in place before networking was common. Otherwise, I suspect
you'd see way more worms and such in UNIX, for the same reason that even now
XP and Vista and etc have more malware for them than UNIX does.
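The '"." in the path' trap is easy to demonstrate in a few lines of Python, with a temp directory standing in for the attacker's directory (the script contents and lookup are illustrative only):

```python
import os
import shutil
import tempfile

# The attacker drops a program named like a common command into a
# directory the victim will have early in their PATH (historically ".",
# i.e. whatever directory they happen to cd into).
demo = tempfile.mkdtemp()
trojan = os.path.join(demo, "ls")
with open(trojan, "w") as f:
    f.write("#!/bin/sh\necho gotcha\n")
os.chmod(trojan, 0o755)

# With the demo dir standing in for "." at the front of the search path,
# command lookup finds the trojan before the real /bin/ls ever gets a look:
lookup = shutil.which("ls", path=os.pathsep.join([demo, "/bin", "/usr/bin"]))
```

`lookup` resolves to the trojan, which is exactly why "." left the default PATH.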
> in the single-user OS's it requires a very significantly more radical
> change in design than just fixing a few bugs).
Sure, but Windows hasn't been single-user for 10+ years.
--
Darren New, San Diego CA, USA (PST)
Serving Suggestion:
"Don't serve this any more. It's awful."
Warp wrote:
> The OS should be designed in such way that it's just not possible.
The problem is people, really. You put in code that says "don't accept
executables." Then you put in code that says "Hey, you downloaded this, do
you really want to run it?" Then you put in code that says "Don't accept
executables inside zip files."
If it went far enough, people would send out "here's a zip file with the
password xyz. Unpack it with that password, rename hello.jpg to hello.exe,
and run it to get an important message from your bank" and someone would do it.
> Virus scanners are fighting the symptoms, not the fundamental problem.
True. And the symptom is that software is too complex for someone to know
all its interactions in modern systems, in part because software doesn't
come as a package. This is starting to change, but it's still not enforced.
It's more of a convenience feature than a security feature.
If you said "Firefox can only run executable code that you downloaded with
firefox and was signed by the firefox development key", you'd be golden.
Unless, of course, it's an interpreter.
I think fundamentally you're up against the halting problem, and the more
you lock down the system, the more people who *do* know what they're doing
(or who want to do something regardless of whether they know what they're
doing or not) will be annoyed.
--
Darren New, San Diego CA, USA (PST)
Serving Suggestion:
"Don't serve this any more. It's awful."
Darren New <dne### [at] sanrrcom> wrote:
> Warp wrote:
> > The OS should be designed in such way that it's just not possible.
> The problem is people, really. You put in code that says "don't accept
> executables." Then you put in code that says "Hey, you downloaded this, do
> you really want to run it?" Then you put in code that says "Don't accept
> executables inside zip files."
> If it went far enough, people would send out "here's a zip file with the
> password xyz. Unpack it with that password, rename hello.jpg to hello.exe,
> and run it to get an important message from your bank" and someone would do it.
Didn't seem to be such a problem in the unix world. And even if that
kind of social engineering caused someone to execute a program they
received by email, it would still be limited to that user's account.
The system itself and other users would be safe. It's a whole different
mentality. It's hard to spread viruses like that.
--
- Warp
Darren New <dne### [at] sanrrcom> wrote:
> > You should
> > not be able to access other users' data without permission, and especially
> > you shouldn't be able to access the superusers' data without permission.
> > This causes security to be built into the system from the ground up.
> But this has been the case with NT forever, and since XP, people haven't
> been running superuser. The problem is that people bitch about the security,
> complain they can't set the clock without typing a password, etc.
They wouldn't if Microsoft hadn't taught them the bad habits.
If the very first version of DOS had had a similar account/password
system as unixes, and this strict mentality had been dragged along (and
improved) in all subsequent versions of DOS and Windows, people today
would not complain because they would take it for granted, as something
obvious.
> > The DOS/Windows design always took basically the exact opposite approach:
> > Whatever the user wants to run or do, the OS allows.
> That's kind of what happens when you don't have virtual memory mapping.
The mentality prevailed well after the 80386 became the de-facto standard.
(And, in fact, even the 80286 supported some type of memory protection.)
> > in the single-user OS's it requires a very significantly more radical
> > change in design than just fixing a few bugs).
> Sure, but Windows hasn't been single-user for 10+ years.
Well, how long did it take Microsoft to *finally* get some semblance
of security into their desktop OS? (And the major reason why it took them
so long is not because of technical difficulties, but simply because they
just didn't bother.)
--
- Warp
Warp wrote:
> Didn't seem to be such a problem in the unix world.
No, because most people running UNIX software knew the basics of how
computers work. They'd recognise this as a scam, rather than just another
series of opaque rules you have to follow to get something done on this damn
machine.
> And even if that
> kind of social engineering caused someone to execute a program they
> received by email, it would still be limited to that user's account.
Except that's why they took "." out of the path, and why setuid files aren't
allowed to be writable, and why writing to them turns off setuid, none of
which were security features even as late as V7. Exactly because enough
people fell for it, they made things inconvenient for developers in order
to keep those not paying attention from running a trojan.
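A toy audit of the writable-setuid combination those fixes forbid (pure illustration; `dangerous_setuid` is a made-up helper name, and a real audit would walk the whole filesystem):

```python
import os
import stat
import tempfile

def dangerous_setuid(path):
    """Flag the combination the fix forbids: a setuid file that anyone
    can write to (plant a payload, then wait for it to run with the
    owner's privileges)."""
    st = os.stat(path)
    return bool(st.st_mode & stat.S_ISUID) and bool(st.st_mode & stat.S_IWOTH)

# An owner is free to mark their own file setuid; combined with
# world-writable permissions, that's the hole.
fd, victim = tempfile.mkstemp()
os.close(fd)
os.chmod(victim, 0o4777)   # setuid + world-writable: the bad combination
```

Modern kernels additionally clear the setuid bit when an unprivileged process writes to such a file, which is the "writing to them turns off setuid" fix in action.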
> The system itself and other users would be safe.
Except if there's only one user on the system, like there often is with
Windows desktop machines, you're still screwed if your web browser starts
sending your bank passwords to russian mafia guys.
> It's a whole different
> mentality. It's hard to spread viruses like that.
It only takes one hole. How widely do you think the Morris worm could have
spread if its author had been paid a million dollars to make it hibernate
on each computer? He'd write an executable somewhere, call it something
benign, start it up as root, then let it sit and listen for a particular
string to show up in a spam mail before DDoSing some networked computer.
I think what you're seeing is more that by the time it was *profitable* to
write a virus, Windows was already the best vector both in popularity and
user-naivety. Granted, it was pretty common back in the early days of
Windows to have a virus or whatever spread, but it was also much more common
to hear of break-ins at commercial servers to steal credit cards and such.
--
Darren New, San Diego CA, USA (PST)
Serving Suggestion:
"Don't serve this any more. It's awful."
Warp wrote:
> They wouldn't if Microsoft hadn't taught them the bad habits.
Probably true. But that is left over from when Windows ran on machines that
weren't even capable of running UNIX and for which having a multi-user OS
made no sense.
> If the very first version of DOS had had a similar account/password
> system as unixes,
... then it wouldn't have run on an 8086, and MS would be broke.
> would not complain because they would take it for granted, as something
> obvious.
It's hard to say. Most of the other systems of the day didn't have it either.
>>> The DOS/Windows design always took basically the exact opposite approach:
>>> Whatever the user wants to run or do, the OS allows.
>
>> That's kind of what happens when you don't have virtual memory mapping.
>
> The mentality prevailed well after the 80386 became the de-facto standard.
Backwards compatibility is a bitch, indeed.
>> Sure, but Windows hasn't been single-user for 10+ years.
>
> Well, how long it took for Microsoft to *finally* get some semblance
> of security into their desktop OS?
Backwards compatibility is a bitch.
> (And the major reason why it took them
> so long is not because of technical difficulties, but simply because they
> just didn't bother.)
Sure. Welcome to the commercial market.
But this is kind of off the topic now, methinks. Or at least drifting into
another discussion.
UNIX isn't safe because it has had multi-user stuff longer. (Safer. Far from
"safe".) One slip by the superuser can screw things up. Anyone with the root
password can do anything they want and steal any data they want from anyone
else on the machine. Tricking the superuser into running your code, or
mounting a disk you have without turning off setuid on it, or a server with
a buffer overflow, or anything like that can corrupt the system in ways just
as hard to figure out.
Contrast with something like Singularity, where you explicitly list every
program you're going to run, every file they're going to access, there's no
way to run code that isn't signed, no way to get someone to overrun a buffer
to run code that wasn't in the executable to start with, and each
program gets its own set of permissions, orthogonal to the users running
it. *That* is a significant advance in operating system security.
--
Darren New, San Diego CA, USA (PST)
Serving Suggestion:
"Don't serve this any more. It's awful."
Warp wrote:
> Didn't seem to be such a problem in the unix world.
Thinking on it, there are a whole bunch of "patches" made in the UNIX world
to account for bad security. The whole thing with "xhost" and the magic
cookies. Taking "." out of the default path. Sticky bits on directories.
Shadow password files. The vipw command. The -print0 option on find.
Poisoned DNS caches (which affected everyone, really). All *kinds* of stuff
in UUCP. All kinds of race conditions for files in /tmp/. Why people
switched from rcp to scp. Why people switched from http to https. Why FTP
runs in a chroot jail, and still occasionally failed at that when
misconfigured. Why it used to be easy to find thousands of /etc/passwd
files on the web when search engines first came out. Why Kerberos used to
let you download the whole list of encrypted passwords for off-line cracking
before you identified yourself.
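The -print0 entry, for instance, exists because a Unix filename may contain a newline but never a NUL byte, so newline-delimited file lists are ambiguous while NUL-delimited ones round-trip exactly. A quick Python illustration:

```python
import os
import tempfile

# Two real files, one with an embedded newline in its name (legal on
# Unix; the only bytes a filename cannot contain are '/' and NUL).
d = tempfile.mkdtemp()
for name in ("safe.txt", "evil\ninjected.txt"):
    with open(os.path.join(d, name), "w"):
        pass

names = sorted(os.listdir(d))

# A newline-delimited listing (plain `find`) looks like three files;
# a NUL-delimited one (`find -print0`) recovers exactly the real names.
newline_list = "\n".join(names)   # splits into 3 bogus "records"
nul_list = "\0".join(names)       # splits back into the 2 real names
```

That ambiguity is precisely what let attackers smuggle extra filenames (like a trailing /etc/passwd) into naive `find | xargs rm` pipelines.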
It's just that most of the breaks these changes fixed happened before
networking was really ubiquitous, and before taking advantage of them was
worth huge amounts of money.
Yes, MS was somewhat late to the party, but it's not like UNIX escaped being
broken into over, and over, and over again, simply because it has always
been multi-user. It's only relatively recently that people in general
started worrying about these things on a systematic basis, because it has
only been relatively recently that breaking in was worth more than getting
your college class grades changed. ;-)
Oh, and by the way:
http://en.wikipedia.org/wiki/Christmas_Tree_EXEC
So, yeah. Would never happen on a *real* system.
--
Darren New, San Diego CA, USA (PST)
Serving Suggestion:
"Don't serve this any more. It's awful."