Patrick Elliott wrote:
> Uh. Doing that requires shutting it off, then rebooting, then defragging
> the drive, then turning it on again..
If you don't have enough space to put a pagefile on your machine that's
contiguous, then you're going to get fragmented files and all. Yep, yep, yep!
> Well, let's just say that, at least
> one person trying this had it render the OS unbootable. I would rather
> let the damn thing just handle it itself, which should include imho,
> occasionally using some idle time to restructure the file, so it's
> contiguous (like duh!). Near as I can tell, it isn't doing that at all
> as part of normal operations.
It defrags everything else that way. Just not the pagefile. Because, you
know, the page file is usually open, and it's pretty easy to defrag it at
boot time if you want.
Of course, if you don't have enough free space to make a block of disk big
enough to hold a page file, then no, it's not going to defrag.
> Got to be about the one and only reason I am looking at getting 7, so it
> doesn't have the memory limits XP did.
The only XP memory limit I know of was running into the 4G boundary. And
apparently that's a licensing thing - seems XP x86 is happy to use however
much memory you put in there, except that Microsoft tells it not to.
--
Darren New, San Diego CA, USA (PST)
I ordered stamps from Zazzle that read "Place Stamp Here".
Darren New wrote:
> Patrick Elliott wrote:
>> Uh. Doing that requires shutting it off, then rebooting, then
>> defragging the drive, then turning it on again..
>
> If you don't have enough space to put a pagefile on your machine that's
> contiguous, then you're going to get fragmented files and all. Yep, yep,
> yep!
>
Got enough now. Already defragged it once. Went from 33,000+ fragments
to 1,000, but.. it would still be nice to have more.
Still, the point is, if you have real memory free, I see no reason why
you couldn't page in some of the file that isn't "in use", page it back
out someplace saner, etc., and defrag it that way. Ironically, it's one
thing I almost miss from the old Win3.11 days, where you could use
Norton's defragger to defrag everything "including" the page file, and
shift files you use a lot closer to the start, and unscatter
directories, *and* consolidate free space. I get that Windows defrag now
does most of those, but it does them damned inefficiently, and no matter
how many times you "force" it to consolidate files, it will flat out
*refuse* to consolidate the free space, even when there is no sane
reason to leave a handful of files scattered willy-nilly over the
remaining disk space.
And, that is the problem. If you need to defrag something like the page
file, during boot, how do you do that, if it won't move the stupid files
that are in the way? :(
Actually reminds me of the ice truck issues at work. Big days for
shoppers we bring in a truck. The normal method of handling this seems
to have been to spend 3-4 hours trying to get early customers to move
their damn cars, so we could park the thing. This year we spent 2 hours,
at night, waiting for people to get a hint and stop parking where we
were trying to rope off, so that we just had to move the ropes the next
day. This, obviously, worked a whole lot better. lol
> The only XP memory limits I know of was running into the 4G boundary.
> And apparently that's a licensing thing - seems XP x86 is happy to use
> however much memory you put in there, except that Microsoft tells it not
> to.
>
Yeah. I know. But, the hack requires changing the boot.ini file. They
managed to make the OS smart enough to boot anyway, if it's
damaged/missing (mine is missing for some reason), but didn't include
any way to rebuild it, if you lost/mangled it. So.. No file, no clue how
to write it, and therefore, no means to override the setting.
--
void main () {
    if (Schrödingers_cat is alive or version > 98) {
        if (version == "Vista") {
            call slow_by_half();
            call DRM_everything();
        }
        call functional_code();
    }
    else
        call crash_windows();
}
Get 3D Models, 3D Content, and 3D Software at DAZ3D!
http://www.daz3d.com/index.php?refid=16130551
Patrick Elliott wrote:
> Got enough now. Already defragged it once. Went from 33,000+ fragments
> to 1,000, but.. it would still be nice to have more.
You just need to have enough space to put it there. :-)
> Still, point is, if you have memory free, open or not, I see no reason
> why, if you have free real memory, you couldn't page in some of the file
> that isn't "in use", page it back out someplace saner, etc. and defrag
> it that way.
Of course you could. As I said, the "sane" reason is that not enough people
want this to justify the difficulty of implementing it. And yes, it's
difficult. Probably much more so than you think.
> Ironically, it's one thing I almost miss from the old
> Win3.11 days, where you could use Norton's defragger to defrag
> everything "including" the page file, and shift files you use a lot
> closer to the start, and unscatter directories, *and* consolidate free
> space.
Yeah, when you can shut down all other processes and do it with the machine
offline, it's a lot easier.
> it will flat out *refuse* to consolidate the free
> space, even when there is no sane reason to leave a handful of files
> scattered willy nilly over the remaining disk space.
There is a sane reason, actually. You just don't know what that reason is.
:-) One possibility is that they're protected because they're branches of
files carrying around the encryption key information. Another possibility is
that they're locked open for writing.
> And, that is the problem. If you need to defrag something like the page
> file, during boot, how do you do that, if it won't move the stupid files
> that are in the way? :(
Copy those files to a different disk, delete them, defrag, copy them back.
>> The only XP memory limits I know of was running into the 4G boundary.
>> And apparently that's a licensing thing - seems XP x86 is happy to use
>> however much memory you put in there, except that Microsoft tells it
>> not to.
>>
> Yeah. I know. But, the hack requires changing the boot.ini file.
Different hack, I suspect.
> They
> managed to make the OS smart enough to boot anyway, if its
> damaged/missing (mine is missing for some reason), but didn't include
> any way to rebuild it, if you lost/mangled it. So.. No file, no clue how
> to write it, and therefor, no means to override the setting.
LMGTFY. http://mirror.href.com/thestarman/asm/mbr/bootini.htm
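For anyone rebuilding it from scratch, a minimal boot.ini looks roughly like the sketch below. This is a hypothetical example, not taken from that page: the ARC path (the `multi(0)disk(0)rdisk(0)partition(1)` part) and the switches depend entirely on your machine, and the 4G-boundary business involves more than just adding `/PAE`, so treat it purely as a shape to imitate.

```ini
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS

[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /PAE
```

The `default=` entry must match one of the ARC paths listed under `[operating systems]`, or NTLDR will complain at boot.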
--
Darren New, San Diego CA, USA (PST)
I ordered stamps from Zazzle that read "Place Stamp Here".
Patrick Elliott wrote:
> that are in the way? :(
I can show you how to get rid of most or all of those, btw.
It's probably either:
1) Encrypted file system data forks,
2) volume shadow copies (i.e., system restore files),
3) the USN journal
none of which can be moved by the normal defrag mechanism.
--
Darren New, San Diego CA, USA (PST)
I ordered stamps from Zazzle that read "Place Stamp Here".
Patrick Elliott wrote:
> On a semi-side note. Had to defrag mine, which Windows doesn't normally
> allow. Page Defrag utility from sysinternals works nice (though I need
> to run it again, now that I succeeded in defragging all the files I
> couldn't, because the page file was in 33,000 pieces... O.o It runs
> before anything but the basics starts up, just after the initial loading
> screen, so pagefile.sys isn't yet in use.
>
> You would think, even with the slight overhead, trying to keep this
> thing in one chunk, as much as possible, would have been useful... Sigh!
You can use the defrag utility if you restart using Safe Mode + Command
Prompt. This might allow you to defrag the page file since lots of stuff
is being bypassed.
-Mike
Patrick Elliott schrieb:
> Still, point is, if you have memory free, open or not, I see no reason
> why, if you have free real memory, you couldn't page in some of the file
> that isn't "in use", page it back out someplace saner, etc. and defrag
> it that way. Ironically, its one thing I almost miss from the old
> Win3.11 days, where you could use Norton's defragger to defrag
> everything "including" the page file, and shift files you use a lot
> closer to the start, and unscatter directories, *and* consolidate free
> space.
Remember how, /theoretically/, you were supposed to actually /pay/ for
Norton Utilities back in those days?
Well, here's some news for you: There are still defraggers around for
sale, and they are said to perform better than the defragger MS is
giving away for free with their OS. ;-)
> I get that Windows defrag now does most of those, but it does
> them damned inefficiently, and no matter how many times you "force" it
> to consolidate files, it will flat out *refuse* to consolidate the free
> space, even when there is no sane reason to leave a handful of files
> scattered willy nilly over the remaining disk space.
The question is, what is the most efficient distribution of empty space
after a defrag? One huge consecutive chunk?
Actually, that's rarely the case. That is only good for drives
containing a huge collection of immutable files, like some archive. As
soon as you expect the contents of the files to change, it will actually
be more efficient to have some free space after each file, to allow for
it to grow without fragmenting /again/.
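That trade-off is easy to demonstrate with a toy model. The allocator below is entirely made up for illustration (nothing like NTFS's real placement logic): files are laid out left to right, optionally with a few free blocks of slack after each one, and then every file grows by a few blocks.

```python
# Toy block-allocation model (illustrative only, not how NTFS works):
# compare packing files tight vs. leaving slack after each file.

def count_fragments(extents):
    """Count discontiguous runs in a sorted list of block numbers."""
    frags = 1
    for a, b in zip(extents, extents[1:]):
        if b != a + 1:
            frags += 1
    return frags

def simulate(slack, sizes, growth):
    """Lay files out with `slack` free blocks after each, then grow
    every file by `growth` blocks. Growth goes into the file's own gap
    if it fits; any overflow lands past the end of the layout, which
    creates a new fragment."""
    disk_pos = 0
    files = []
    for size in sizes:
        files.append(list(range(disk_pos, disk_pos + size)))
        disk_pos += size + slack
    end = disk_pos  # first block past the laid-out region
    for f in files:
        take = min(growth, slack)              # fits in the adjacent gap
        f.extend(range(f[-1] + 1, f[-1] + 1 + take))
        for _ in range(growth - take):         # overflow lands far away
            f.append(end)
            end += 1
    return [count_fragments(f) for f in files]

sizes = [10, 10, 10]
print(simulate(slack=0, sizes=sizes, growth=4))  # -> [2, 2, 2]
print(simulate(slack=4, sizes=sizes, growth=4))  # -> [1, 1, 1]
```

Packed tight, every file ends up in two pieces after growing; with enough slack, every file stays contiguous. The cost, of course, is that the slack eats free space up front.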
So the strategy employed by Windows' free defragger is probably not too
bad - and don't forget it's a tool you get with the OS for free. (Not
developed by MS themselves, btw.)
clipka schrieb:
>
> Question is, what is the most efficient distribution of empty space
> after a defrag? One huge consecutive chunk?
>
> Actually, that's rarely the case. This is only good for drives
> containing a huge collection of immutable files, like some archive. As
> soon as you expect the contents of the files to change, it will actually
> be more efficient to have some free space after each file, to allow for
> it to grow without fragmenting /again/.
... furthermore, Windows' defrag apparently /does/ move files, but it
tries to arrange them in different sections, probably related to how
they are actually used/modified, or how large they are. Something along
those lines.
clipka wrote:
> soon as you expect the contents of the files to change, it will actually
> be more efficient to have some free space after each file, to allow for
> it to grow without fragmenting /again/.
Interestingly enough, Windows actually does keep track of which files change
and which don't. Plus, it's probably pretty easy to guess in most cases. EXE
and DLL files? Pack 'em tight. Mailbox files? You probably want to leave some
space.
> (Not developed by MS themselves, btw.)
That may have been true a long time ago, but I don't expect it's true any more.
--
Darren New, San Diego CA, USA (PST)
I ordered stamps from Zazzle that read "Place Stamp Here".
Darren New schrieb:
>> (Not developed by MS themselves, btw.)
>
> That may have been true a long time ago, but I don't expect it's true
> any more.
Vice versa actually: They used to develop their own defrag, but then (I
think starting with XP) switched over to some lightweight edition of a
third-party tool. Raised a lot of fuss over here in Germany due to that
company's involvement with Scientology.
Well, maybe they acquired the developer of that tool by now, don't know...
clipka wrote:
> Darren New schrieb:
>
>>> (Not developed by MS themselves, btw.)
>>
>> That may have been true a long time ago, but I don't expect it's true
>> any more.
>
> Vice versa actually: They used to develop their own defrag,
No, they used Executive Software's Diskeeper stuff. MS developed all the
APIs and such. Executive Software just wrote the user interface part (and of
course provided info on how the API should be designed). I'm pretty sure
nobody outside MS wrote software that frobs the NTFS file system to
rearrange blocks. (Granted, MS might have hired an outside company to
develop that, but it wasn't sold independently, so I don't count that.)
As for the defraggers on pre-NTFS file systems, I wouldn't really guess, but
Executive Software made the UI for the NT defragger.
--
Darren New, San Diego CA, USA (PST)
I ordered stamps from Zazzle that read "Place Stamp Here".