A Conversation for h2g2 Feedback - Feature Suggestions

Offline version of h2g2

Post 1

Thonis

On the future features suggestion list there is a mention of an offline version of the Guide. (I think it was named "h2g2 desktop".)

The basic idea was that you could download the entire Guide and view/read it offline, and then later update it from the web with new and updated entries.

I was wondering if anyone is working on this.

A long time ago someone mentioned something about a PDA version. This is still (in my opinion) very much wanted, as you could then keep the Guide in your back pocket. (Where it is supposed to be.)

The trouble with this is that you are required to be connected to the internet for the Guide to work. And that isn't always a good option (cost, speed, etc.).

Now, I don't know how much information is stored in the Guide, or how much space it would all take to store. But the best solution would be to be able to retrieve the whole Guide and keep it on your PDA (or another computer), to be accessed at any time.

- Just trying to get the Guide to the people... where they need it. :-)


Offline version of h2g2

Post 2

There is only one thing worse than being Gosho, and that is not being Gosho

Even if 'the entire Guide' is not yet measured in terabytes, I doubt that any of us have a hard drive big enough for it :-P

Unless you're running a particularly hefty RAID array.


Offline version of h2g2

Post 3

Thonis

Does anyone have an idea of how big the Guide is?

After all, storing text isn't all that space-consuming.


Offline version of h2g2

Post 4

Oberon2001 (Scout)

Do you mean the EG? Cos that might just be possible... still big though, cos a fair few of the entries have pictures on them.
Oberon2001


Offline version of h2g2

Post 5

SEF

Jim Lynn would know, or could ask the database directly. Spelugx probably has a fair idea about the EG from collecting stats on article size. If you look at the info page you'll see how many EG articles there are. Take a guess or a sample for the average size and multiply it up to get the size of just the Edited Guide. You'll also see the most recent article numbers. Throw away the checksum digit at the end of the top one and that is roughly how many articles there are of all types across all DNA sites (including repeats, deletions and empty personal spaces).
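As a sketch of that sample-and-multiply approach (the 6168 entry count is the one quoted later in this thread; the sample sizes and the A-number are made-up placeholders, not real measurements):

    # Sample-and-multiply estimate of the Edited Guide's size.
    # The sample sizes and A-number below are hypothetical placeholders.
    eg_entries = 6168                        # EG entry count from the info page
    sample_bytes = [24_000, 31_000, 27_500]  # sizes of a few sampled entries
    avg = sum(sample_bytes) / len(sample_bytes)
    print(f"EG text estimate: {eg_entries * avg / 2**20:.0f} MiB")  # ~162 MiB

    # SEF's count trick: drop the trailing checksum digit of the most
    # recent article number to count articles of all types.
    latest_a_number = 2345678                # hypothetical A-number
    print(f"articles of all types: ~{latest_a_number // 10}")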


Offline version of h2g2

Post 6

SEF

Oops, simulpost. Most pictures are of the pointless kind, so there's no need to include them.


Offline version of h2g2

Post 7

GreyDesk

I remember Jim talking about this once; I think he said that it was about 1 GB. Now, with the passage of time and all the new DNA sites and stuff, it's obviously much bigger. 3 GB maybe?


Offline version of h2g2

Post 8

Zak T Duck

Can you actually see people, even with broadband, downloading over 3 GB of data? Nope! Besides, we've all seen the bandwidth problems encountered when someone tried to download the entire site, and that was just a couple of search-engine bots. Imagine that, but a heck of a lot worse.
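For a sense of scale (a rough sketch; the 512 kbit/s connection speed is an assumption about broadband of the day, not a figure from this thread):

    # How long a 3 GB download takes over an assumed 512 kbit/s line.
    size_bytes = 3 * 2**30            # 3 GB
    bytes_per_sec = 512_000 / 8       # 512 kbit/s = 64,000 bytes/s
    hours = size_bytes / bytes_per_sec / 3600
    print(f"~{hours:.0f} hours")      # about 14 hours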

If you want your offline Guide for some inexplicable reason, persuade BBC Worldwide that they've got a potential moneyspinner if they release it on DVD-ROM every 12 months, like M$ do with Encarta.


Offline version of h2g2

Post 9

Thonis

Some quick work with my calculator:

The info page said there were 6168 Edited Guide entries.
For simplicity, say they are all 65,000 characters long.
(And 65,000 characters is a lot of text.)
That leaves you with "only" about 383 MB of raw data.

And also: let's include an image on every edited entry.
Let's give them a 50 KB image per entry.
That (6168 × 51,200) gives you another 301 MB of data.

This makes a total of about 684 MB of data for the Edited Guide.
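The same sum as a quick sketch (assuming one byte per character and binary megabytes, which is what the figures above imply):

    # Reproducing the back-of-envelope sum: 6168 entries, 65,000
    # characters of text and one 50 KB (51,200-byte) image each.
    entries = 6168
    text = entries * 65_000
    images = entries * 51_200
    MB = 2**20                                       # binary megabyte
    print(f"text: {text / MB:.0f} MB")               # ~382 MB
    print(f"images: {images / MB:.0f} MB")           # ~301 MB
    print(f"total: {(text + images) / MB:.0f} MB")   # ~684 MB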


Offline version of h2g2

Post 10

SEF

You are definitely overdoing the images in that guesstimate. Not that many EG entries have them at all (up to 4/5 of the early ones, 1/5 of the later ones and 1/2 now), and very few are of any importance to the entry. Also, the images are supposed to be limited to 20 KB and are often less.


Offline version of h2g2

Post 11

Spelugx the Beige, Wizard, Perl, Thaumatologically Challenged

I also think you're overestimating the size of the edited articles: I've (pretty reliably) calculated an average edited article size of only 28.6 KiB (for the text only). Between 25 and 30 percent of edited articles have a picture. I'll assume 20 KiB per picture.

So assuming 6168 edited articles, I make it:

172 MiB of article text.
30-36 MiB of pictures.

It's also a rather short download over broadband, assuming no server delays and infinite bandwidth at the BBC end (both of which are assumptions easily demonstrated to be false).
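Checking those figures, and the "short download" claim, as a quick sketch (the 512 kbit/s line speed is an assumed figure, not from this thread):

    # spelugx's estimate: 28.6 KiB of text per entry, a 20 KiB picture
    # on 25-30% of the 6168 entries, downloaded at an assumed 512 kbit/s.
    entries = 6168
    text_mib = entries * 28.6 / 1024                  # ~172 MiB
    pics_lo = 0.25 * entries * 20 / 1024              # ~30 MiB
    pics_hi = 0.30 * entries * 20 / 1024              # ~36 MiB
    seconds = (text_mib + pics_hi) * 2**20 / (512_000 / 8)
    print(f"{text_mib:.0f} MiB text, {pics_lo:.0f}-{pics_hi:.0f} MiB pictures")
    print(f"~{seconds / 60:.0f} minutes at 512 kbit/s")  # just under an hour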

spelugx


Offline version of h2g2

Post 12

Rho

> assuming no server delays and infinite bandwidth at the BBC end

Which infinity?

Rho


Offline version of h2g2

Post 13

Spelugx the Beige, Wizard, Perl, Thaumatologically Challenged

hmm, err, hmm, ...

No larger than C (the cardinality of the continuum).


Offline version of h2g2

Post 14

R. Daneel Olivaw -- (User 201118) (Member FFFF, ARS, and DOS) ( -O- )

Why not Aleph-sub-20? Maybe Aleph-sub-21?


Offline version of h2g2

Post 15

Spelugx the Beige, Wizard, Perl, Thaumatologically Challenged

Whatever throughput meter one uses to perform the measurement, one will get a finite-precision number which approximates the true throughput. This means the domain of the measurement is only Aleph_0 in size (i.e. the same size as the set of integers or naturals). Now, the actual throughput I suppose could be irrational (erm...), so that gets us up to C. But I can't really see how it could be any bigger than this...

Anyway, that's all about the domain of the possible levels of bandwidth. I doubt the bandwidth itself could be infinite...
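For the curious, the argument above in standard cardinal notation (a restatement, not a rigorous claim):

    \[
    |\{\text{finite-precision readings}\}| \le \aleph_0 ,
    \qquad
    |\mathbb{R}| = \mathfrak{c} = 2^{\aleph_0} ,
    \]

so even a real-valued throughput ranges over a set of cardinality at most \(\mathfrak{c}\).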

