By QuickFix
#78042
picstart wrote: Reminds me of the year 2000 problem. An extra two bytes for dates would have saved millions of dollars, but coders actually believed saving bytes was more important.

Y2K actually stems from the 1960s and 1970s, when computer memory was extremely expensive and dropping the "useless" two extra characters (14 or 16 holes) from every date on a punch card also made more room for more important data.

Remember that the first computers were only meant to compute things (hence their name), not to actually store data.
Programmers at that time could never have imagined we would still be using their work (and its data) in this day and age.
(Depending on your age: would you have believed, when you were young, that in 2018 we would all carry portable computers giving us access to the entire world?)

I'm not defending the programmers of those days, just trying to put everything in perspective. ;)
By picstart
#78057
There is always a good reason for coders to get the wrong answer. If storage were purely the issue, then one byte would have been sufficient for 256 years. Sure, 256 years later there would be an issue, but AI would already have replaced most coders and the Star Trek quantum computer would fix everything in a nanosecond.
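For illustration only, here is a minimal sketch in plain C (not code from either post) of the one-byte scheme described above: store the year as an unsigned offset from an assumed base year, which covers a 256-year window. The base year of 1900 is an assumption, not something the original post specifies.

```c
#include <stdio.h>
#include <stdint.h>

#define BASE_YEAR 1900  /* hypothetical base year; any base gives a 256-year window */

int main(void) {
    uint8_t stored = 2018 - BASE_YEAR;   /* 118 fits easily in one byte (0..255) */
    printf("decoded year: %d\n", BASE_YEAR + stored);          /* 2018 */
    printf("last representable year: %d\n", BASE_YEAR + 255);  /* 2155 */
    return 0;
}
```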
Storing the number of seconds elapsed since the birth date of UNIX (the 1970 epoch), or since 1900 as NTP does, in a 32-bit integer is the next coders' Y2K: a signed 32-bit counter runs out in January 2038. The numbers assigned to the days of the week also need to be standardized.
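A minimal sketch in standard C (no ESP-specific APIs assumed) of what that 32-bit limit means: the largest value a signed 32-bit seconds counter can hold corresponds to 19 January 2038, and one tick later the value wraps negative, which naive code interprets as a date back in 1901.

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void) {
    /* Largest second a signed 32-bit time_t can represent. */
    time_t last = (time_t)INT32_MAX;                  /* 2,147,483,647 s after 1970-01-01 */
    printf("last 32-bit second: %s", ctime(&last));   /* around Tue Jan 19 2038 (local time) */

    /* One second later: the value no longer fits in 32 bits and, when forced
       back into an int32_t, typically wraps to a large negative number. */
    int32_t wrapped = (int32_t)((int64_t)INT32_MAX + 1);  /* usually -2,147,483,648 */
    time_t after = (time_t)wrapped;
    /* On platforms that accept negative time_t values, this prints a date in
       December 1901; others may reject it, which is its own kind of failure. */
    printf("one second later:   %s", ctime(&after));
    return 0;
}
```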
Science and engineering are more thoughtful: imagine two different definitions of the volt. International standards dictate a single definition.
There are no international standards for coding. Rarely, if ever, are electrical constraints specified by coders. This means code can cause electrical mischief for the unsuspecting, especially if the instructions that drive electrons in and out of pins are buried deep.