A Conversation for Dead Websites

Unnecessary frustration

Post 1

starlet

I started this Conversation in the hope a) that somebody with more rounded knowledge could tell me why only some of the downloadable files on the net will resume 'where you left off' if the connection is broken and restarted, and b) to find out whether there are any other frustrating yet easily solvable problems encountered on the net.

My naive understanding is that there must be a part of TCP/IP designed to deal with this, as the problem would have been around well before the WWW was invented.

As for broken links - why don't the search engines detect them? Or do they do this, but only irregularly?

I think the internet is way too cool a thing to suffer these minor flaws.


Unnecessary frustration

Post 2

Researcher 192821

a) It depends on whether the server you are getting the file from supports resuming or not... some do, some don't.
For FTP servers it's the REST (restart) command; HTTP servers can do it too, via the Range request header, but only if they choose to support it. A further problem is that the server doing the resuming has no way to know whether the partial file fragment on the user's box is good or garbage, so site owners will often make you download it again anyway, if only to boost the stats of how many hits they get, from back when dot-commers took that stuff seriously, i.e. as a means to raise VC money... that was longer than I thought it would be!
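For the HTTP case, resuming boils down to telling the server which byte you already have. A minimal sketch in Python (the function name and structure are my own illustration, not anything from the post):

```python
import os

def resume_headers(path):
    """Build the HTTP request headers needed to resume a partial
    download, based on how many bytes are already on disk at `path`."""
    have = os.path.getsize(path) if os.path.exists(path) else 0
    if have == 0:
        return {}  # nothing downloaded yet: send a plain GET
    # Ask the server for everything from byte `have` onward.
    return {"Range": "bytes=%d-" % have}
```

If the server honours the range it answers 206 Partial Content and you append the body to the file; if it answers a plain 200, it ignored the range and you have to start over. A HEAD request showing `Accept-Ranges: bytes` is a reasonable hint that resuming will work.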
b) Local ISP DNS server failures are the biggest bane of my life... getting "the site's not there" when it really is, but your DNS is timing out so the name never gets resolved.
When/If this happens, open a DOS prompt on your box and type:
C:\>nslookup www.thesiteyouwant.com 198.6.1.142

198.6.1.142 is the IP address of one of the "great DNS servers in the sky", otherwise known as cache07.ns.uu.net - it's just much larger and more reliable than others I've seen...
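Under the hood, asking a particular server is just sending a small UDP packet to its port 53. As an illustration only, here is a hand-rolled sketch of the RFC 1035 query packet that nslookup constructs (real code would use a resolver library):

```python
import struct

def build_dns_query(hostname, query_id=0x1234):
    """Build a minimal DNS query packet asking for the A record of
    `hostname`, suitable for sending over UDP to any resolver you
    choose -- e.g. an alternative to your ISP's flaky one."""
    # Header: id, flags (0x0100 = recursion desired), 1 question,
    # 0 answers, 0 authority records, 0 additional records.
    header = struct.pack(">HHHHHH", query_id, 0x0100, 1, 0, 0, 0)
    # Question name: each label is length-prefixed, ending in a zero byte.
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00"
    # QTYPE 1 = A record, QCLASS 1 = IN (internet).
    question = qname + struct.pack(">HH", 1, 1)
    return header + question
```

You would then `sendto()` that packet at `(server_ip, 53)` with a UDP socket and parse the reply; note that the resolver IP quoted in the post above is decades old and may well not answer any more.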

As for broken links, they *do* get detected and removed, but I think there's a misunderstanding of scale here. After all, there are billions of pages with hundreds of billions of links... to put that in perspective, if it were all on paper it would be a pile roughly ten thousand kilometres high. It doesn't matter how many robots you have scanning that lot; it still takes time...
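When a crawler does revisit a page, deciding that a link is "broken" is mostly a matter of triaging the HTTP status it gets back. A rough, hypothetical classifier (the categories here are my own sketch, not any search engine's actual policy):

```python
def classify_link(status_code):
    """Rough triage a crawler might apply to a link's HTTP status:
    2xx/3xx -> 'alive' (the link resolves, possibly via redirect),
    404/410 -> 'dead' (Not Found / Gone), and anything else, such
    as a temporary 5xx server error -> 'retry' on a later crawl."""
    if 200 <= status_code < 400:
        return "alive"
    if status_code in (404, 410):
        return "dead"
    return "retry"
```

Even with logic this simple, the crawler still has to fetch every one of those hundreds of billions of links to apply it, which is where the time goes.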

:o)

