Darren New wrote:
>> FTP, no matter how you do it, risks all of the problems,
>
> Not really.
>
> The only difference is that with bittorrent, the protocol is designed
> that the only way to do a download is by requesting chunks. In FTP,
> being able to restart a download, or take only the middle of a file, is
> an option.
>
Going to shorten this and just say that what you describe is
interesting, but not what you "see" as a layman actually having to deal
with the result. I couldn't care less, for example, if HTTP has a mess
of things in it; it **still** can't deal with a server that runs into a
serious enough problem that it needs to restart. As for FTP resumes,
those only work if the damn server supports them. Ironically, half the
"big" companies on the net disable them, so you end up starting at the
"first" block of the file every time you resume. Restart on those is
"literally" restart. And I won't even get into the numerous idiot things
they do on web sites, intended to do everything from making download
managers harder to use to verifying who is downloading, which can do
anything from making "all" downloads from the site impossible, unless
you are using "generic" in-browser support with no plugins at all, to
refusing you because the "client" making the DL request doesn't match
the client that initiated the link to download it. Mind, all that latter
stuff is pure nonsense, given the web design and the lack of any
"consistent" way to prevent a) multiple connections via FTP, when that
is the intent of all the hoops you jump through, or b) whatever else
they are trying to do.
All I know is, FTP is the least reliable way I know of without a
manager that is *far* more robust than the one in IE, or Firefox, or
*any* other browser I have ever used, and HTTP isn't much better in
"some" cases. :( We need something where an overloaded server doesn't
mean a) having to come back and retry 3 hours later, when you might be
at work (or every day, in hopes the load drops), or b) your 5-hour (or
worse, 5-day, if you have a slow connection) download failing 50 minutes
in, every single time, FTP/HTTP or whatever, with the server refusing to
resume where it left off. Oh, and the one **big** issue imho: if you
have glitchy wiring, or other issues, FTP/HTTP can land you "bad" files,
and short of running a hash on it, which only some sites even provide
(as a separate download), you have no damn way of knowing if the 4GB ISO
you downloaded, or the 6GB of game files, or whatever, "got" to your
machine intact. Nothing like trying to download the same installer 10
times and having it fail with "This executable is either corrupt or the
wrong format" every single time, each attempt an hour-long download.
What little error correction exists in FTP and HTTP is at the packet
level, and it simply *doesn't work* well enough to be certain the file
will arrive intact, when it seems to work at all.
Like I said, unless you need a streaming protocol, there are "huge"
issues with FTP *and* HTTP when it comes to having them be both certain
to be reliable and certain to complete. That the torrent might go
looking some place else if the main source fails is beside the point.
It *can*. The closest thing to that *at all* for FTP/HTTP is something
like GetRight's support for "search known download sites for a link to
the same named file", which is worthless if it's not *actually* the same
file, or version.
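And that's the part chunked protocols actually get right: every piece carries its own expected hash, so a chunk fetched from *any* mirror can be verified the moment it arrives, instead of trusting that a same-named file is the same file. A toy sketch of the idea (the piece size is arbitrary here, and this is not the real .torrent wire format):

```python
import hashlib

PIECE_SIZE = 256 * 1024  # bittorrent uses fixed-size pieces; size chosen arbitrarily

def piece_hashes(data, piece_size=PIECE_SIZE):
    """Hash a blob piece-by-piece, like a .torrent file's piece list."""
    return [hashlib.sha1(data[i:i + piece_size]).hexdigest()
            for i in range(0, len(data), piece_size)]

def piece_ok(piece, expected_hash):
    """Verify one downloaded piece, no matter which peer it came from."""
    return hashlib.sha1(piece).hexdigest() == expected_hash
```

A bad piece gets thrown away and re-requested, possibly from a different source; a bad FTP/HTTP transfer just hands you a broken file.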
--
void main () {
If Schrödingers_cat is alive or version > 98 {
if version = "Vista" {
call slow_by_half();
call DRM_everything();
}
call functional_code();
}
else
call crash_windows();
}
<A HREF='http://www.daz3d.com/index.php?refid=16130551'>Get 3D Models,
3D Content, and 3D Software at DAZ3D!</A>