A Conversation for Computer Programming
Accuracy
The Mystery Researcher Started conversation Jul 4, 1999
Moderately amusing, if a little low on real information.
I do, however, agree completely with your point about the Y2K issue, despite being one of the worst breeds of programmer - a consultant.
Don't take this personally as it is not intended as such.
Accuracy
26199 Posted Jul 23, 1999
Surely the lowest form of computer language is QBASIC? Other than that, the article's pretty accurate...
But it could use some stuff on what it's like being a programmer, spending three bloody hours looking for what turns out to be a single instead of a double equals sign (yup, I learned C the hard way...)
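(For anyone who hasn't had the pleasure, here is a minimal C sketch of that particular bug - the variable names are invented purely for illustration.)

    #include <stdio.h>

    int main(void)
    {
        int count = 0;

        /* Intended: compare count with 10.  Actually written: assign 10
         * to count; the value of the assignment (10, which is non-zero)
         * makes the test true every single time, and the compiler will
         * accept it, at most with a warning. */
        if (count = 10)
            printf("count is ten?\n");

        /* What was meant: */
        if (count == 10)
            printf("count really is ten\n");

        return 0;
    }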
Perhaps you should take a look at my article on Jargon. It doesn't mention programmers as such... but the general idea is definitely there.
As for making more than they're worth... I hope so. I really do. 'Cause a computer programmer is what I'm going to be, and I reckon I'm worth a fair bit anyway...
VBASIC
Chips Posted Jul 26, 1999
I am 11 and I have learnt VB. I realised from that article that if you want to be a better programmer you have to go for C++, so I am! (VC++)
TIM
[email protected]
Y2K
Is mise Duncan Posted Oct 8, 1999
Once upon a time memory was very, very expensive. If you used a two-digit date field you would save two bytes per date record, and if (say) you were dealing with a bank transaction file the saving would be absolutely huge. Computer Weekly (or Computer Geeky, as my flatmates refer to it) did a rough estimate and worked out that, in many industries, the saving realised by using the two-digit year field more than paid for the cost of converting it to four digits now that we need to and memory is cheap.
As for the problem not being real - I'm sorry, but it is very real. A lot of systems will fail in some manner (mainly the ones I wrote) - the trick is only fixing the ones that are "important". We will find out soon enough.
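(A rough sketch of the saving Duncan describes, assuming a purely hypothetical transaction file with the date held as text; the ten-million-records figure is invented for illustration.)

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        /* The same transaction date held as text, the two ways. */
        const char two_digit[]  = "991231";    /* YYMMDD   - 6 bytes */
        const char four_digit[] = "19991231";  /* YYYYMMDD - 8 bytes */

        printf("2-digit year date field: %zu bytes\n", strlen(two_digit));
        printf("4-digit year date field: %zu bytes\n", strlen(four_digit));

        /* Two bytes per date: a (hypothetical) file of ten million
         * transactions a day saves roughly 20 MB of once-expensive storage. */
        long records = 10000000L;
        printf("saving over %ld records: %ld bytes\n", records, records * 2L);
        return 0;
    }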
Y2K
26199 Posted Oct 12, 1999
That makes perfect sense... apart from one thing. Computers use binary... so three (hexadecimal) digits can store any number from zero to 4095... now, I can see why they wouldn't want to do this... but two hexadecimal digits - a single byte - could easily represent 1900-2155, which seems to me to be a pretty reasonable range...
So it seems the digits are stored in Binary Coded Decimal... that is, one four-bit nibble (which could hold 0-15) being used to store one decimal digit (0-9). This is usually done for reasons of processing speed... but doesn't it waste quite a lot of memory?
And as for using a byte for each digit... surely they wouldn't be *that* stupid???
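(A worked check of those ranges in C - a sketch of the two ways a single byte can be read, not any particular system's format.)

    #include <stdio.h>

    int main(void)
    {
        /* One byte read as a plain binary offset from 1900:
         * values 0..255 cover the years 1900..2155. */
        unsigned char binary_year = 255;
        printf("binary: 1900 + %u = %u\n",
               (unsigned)binary_year, 1900u + binary_year);

        /* The same byte read as BCD: one decimal digit per 4-bit nibble,
         * so the largest legal value is 0x99, i.e. 99 -> 1999. */
        unsigned char bcd_year = 0x99;
        unsigned tens = bcd_year >> 4, units = bcd_year & 0x0F;
        printf("BCD:    1900 + %u%u = %u\n",
               tens, units, 1900 + tens * 10 + units);
        return 0;
    }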
Y2K
Is mise Duncan Posted Oct 12, 1999
They (we?) have been that stupid, by (amongst other things) storing the date as a char(6) - six characters rather than eight, which is exactly your two-byte saving. I didn't write that one myself, but I have seen it.
Alternatively, storing the whole date as a plain number between 000101 (hex 65) and 991231 (hex F1FFF) rather than between 00000101 (still hex 65) and 19991231 (hex 1310ABF) saves a byte per date, since the six-digit form fits in three bytes where the eight-digit form needs four. I've never seen this one in the wild, but it could happen.
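(A quick C sketch of that arithmetic, for the curious; the dates are simply the endpoints Duncan quotes.)

    #include <stdio.h>

    int main(void)
    {
        /* 31 December 1999 stored as a plain integer, without and
         * with the century spelt out. */
        unsigned long yymmdd   = 991231UL;
        unsigned long yyyymmdd = 19991231UL;

        printf("YYMMDD   = %lu = hex %lX (20 bits, fits in 3 bytes)\n",
               yymmdd, yymmdd);
        printf("YYYYMMDD = %lu = hex %lX (25 bits, needs 4 bytes)\n",
               yyyymmdd, yyyymmdd);

        /* The catch: 1 January 2000 as YYMMDD is 000101, i.e. just 101,
         * which compares as earlier than every date in the 1900s. */
        printf("1 Jan 2000 as YYMMDD = %lu\n", 101UL);
        return 0;
    }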
Y2K
26199 Posted Oct 12, 1999
When it comes down to it, the day/date system is pretty useless... we should just number them 0 to 364, it'd be a whole lot easier...
By char(6)... you mean a string six bytes long? *shudder* Talk about inefficient...
Y2K
Is mise Duncan Posted Oct 13, 1999
For the king of inefficiency you need to look at Sybase. The technical definition of a "datetime" field says that:
...holds a date between Jan 1, 1753 and Jan 1, 9999 in an eight-byte field; 4 bytes for the number of days since (or before) Jan 1, 1900 and 4 bytes for the time of day to 1/300th of a second.
Now, if my maths is right, 4 bytes gives us 4294967295 days, which divided by 365.25 comes to about 11758979 and a half years... so it could easily hold anything from Jan 1, 1753 to Jan 1, 11760732 - by which time I hope Sybase will be no more. Three bytes would allow us up to the year 47686, which is quite acceptable. The 4 bytes for 300ths of a second would do us for a full 3976 hours, rather than the 24 we currently have in an Earth day; however, in this case 3 bytes would be too few.
Perhaps I can sue them for loss of memory?
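(Duncan's arithmetic does check out; here is a small C sketch for anyone who wants to rerun it, treating both 4-byte fields as unsigned, as the post does.)

    #include <stdio.h>

    int main(void)
    {
        const double max_u32     = 4294967295.0;  /* 2^32 - 1 */
        const double max_u24     = 16777215.0;    /* 2^24 - 1 */
        const double days_per_yr = 365.25;

        /* The 4-byte (and hypothetical 3-byte) day counts, in years. */
        printf("4-byte day count: %.0f days ~ %.1f years\n",
               max_u32, max_u32 / days_per_yr);
        printf("3-byte day count: %.0f days ~ %.1f years\n",
               max_u24, max_u24 / days_per_yr);

        /* The 4-byte count of 1/300-second ticks, in hours, against the
         * number of ticks a 24-hour day actually needs. */
        printf("4-byte tick count covers %.1f hours\n",
               max_u32 / 300.0 / 3600.0);
        printf("ticks needed per day: %.0f (too many for 3 bytes)\n",
               24.0 * 3600.0 * 300.0);
        return 0;
    }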
Y2K
Nick Roberts Posted Dec 4, 2004
Computers whose designs predate the early 1970s did arithmetic in Binary Coded Decimal (BCD) rather than in pure binary. Two four-bit 'nibbles' can only hold the BCD numbers 00 to 99, which is why it became the norm, in commercial data processing, to represent dates in the form DDMMYY, where the YY was 00 to 99, representing 1900 to 1999.
I think one of the things that particularly embarrassed the people responsible for commissioning expensive computer systems, in the run-up to the year 2000, was that it became clear almost none of them had seen fit to actually /specify/ that the systems they were commissioning should operate beyond the year 1999 (or, more precisely, that they should correctly process dates beyond the year 1999).
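(As a small illustration of that layout - a sketch of packed BCD, not any particular machine's record format.)

    #include <stdio.h>

    /* Pack a value 0..99 into one BCD byte: tens digit in the high
     * nibble, units digit in the low nibble. */
    static unsigned char to_bcd(unsigned value)
    {
        return (unsigned char)(((value / 10) << 4) | (value % 10));
    }

    int main(void)
    {
        /* 31 December 1999 as DDMMYY in three BCD bytes - compact, but
         * with nowhere at all to put a century. */
        unsigned char date[3] = { to_bcd(31), to_bcd(12), to_bcd(99) };
        printf("DDMMYY in BCD: %02X %02X %02X\n",
               (unsigned)date[0], (unsigned)date[1], (unsigned)date[2]);
        return 0;
    }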