Year 2010 bug wreaks havoc on German payment cards
Son of Y2K also hits SpamAssassin, Symantec
A delayed Y2K bug has bitten hard at some 30 million holders of German debit and credit cards, making it impossible for them to use automatic teller machines and point-of-sale terminals since New Year's Day.
Multiple news agencies said the outage stemmed from card chips that couldn't recognize the year 2010. The DSGV, an association representing German banks, said engineers were working diligently to fix the problem, but a full resolution might not come until Monday.
The outage affected 20 million EC, or electronic cash, cards, which act as debit cards, and 3.5 million credit cards, according to the DSGV. A separate bank association known as BDB said about 2.5 million of its cards suffered from the same problem and another 4 million cards issued by Germany's cooperative banks were at least partially affected.
The reports are the latest to involve the inability of computers to properly handle the 2010 date. Just after midnight on New Year's Day, Symantec's Endpoint Protection Manager stopped accepting updates after it was hit by its own 2010 date bug. Soon after the first of the year, SpamAssassin began blocking huge amounts of legitimate email because the messages included the year 2010 in their headers, a date so far in the future that the spam filter assumed they had to be junk.
Kaspersky software also experienced massive update problems on December 30, according to support forums, but it's not clear the new year had anything to do with them.
Re: titular thingamidurby
It takes the same amount of code to add, subtract, increment, or decrement BCD numbers as it does normal binary numbers, because these operations are implemented at the processor level as single instructions.
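To see what that single instruction is doing under the hood, here is a rough C model of the decimal-adjust fix-up that hardware like the x86 DAA instruction performs after an ordinary binary add (a minimal sketch of the idea; the function name is mine, and the real thing happens in silicon, not C):

    #include <stdio.h>

    /* Add two packed-BCD bytes (two decimal digits each), applying the
       nibble fix-up that decimal-adjust hardware performs after a binary add. */
    unsigned bcd_add(unsigned char a, unsigned char b) {
        unsigned sum = a + b;
        if ((a & 0x0F) + (b & 0x0F) > 9)  /* low digit overflowed past 9 */
            sum += 0x06;
        if ((sum & 0x1FF) > 0x99)         /* high digit overflowed past 9 */
            sum += 0x60;
        return sum;                       /* bit 8 acts as the decimal carry */
    }

    int main(void) {
        printf("0x%02X\n", bcd_add(0x38, 0x45) & 0xFF);  /* prints 0x83, i.e. 38 + 45 */
        return 0;
    }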
Back in the "good old days" of 8 bit processors BCD was more efficient when it came to displaying the results. This is because all processors had ADD and SUBTRACT instructions, a few might have MULTIPLY and I dont know of any that had a DIVIDE instruction (but no doubt I will be corrected). To divide 2 integers you had to use multiple instructions within a loop so it cost a lot of processor cycles.
A four-digit number takes two bytes to store whether it's BCD or a normal binary number, but when it came to outputting the result for a normal number you had to do the following (C sketch after the list):
Divide by 10 and use the remainder as the units digit
Divide the quotient by 10 and use the remainder as the tens digit
Divide that quotient by 10 and use the remainder as the hundreds digit
Use what's left as the thousands digit.
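In C the repeated-division approach looks something like this (a sketch of the technique, not anyone's actual firmware; on an 8-bit CPU each / and % below would itself expand into a multi-instruction loop):

    #include <stdio.h>

    /* Print a binary value 0-9999 by repeatedly dividing the running
       quotient by 10 and peeling off the remainder as the next digit. */
    void print_decimal(unsigned n) {
        char digits[4];
        for (int i = 0; i < 4; i++) {
            digits[i] = '0' + n % 10;  /* remainder = next decimal digit */
            n /= 10;                   /* continue with the quotient */
        }
        for (int i = 3; i >= 0; i--)   /* most significant digit first */
            putchar(digits[i]);
        putchar('\n');
    }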
On the other hand, to display a four-digit BCD number you had to:
AND the LSB with 0x0f and use the result as the units digit
Shift the LSB right four bits and use the result as the tens digit
AND the MSB with 0x0f and use the result as the hundreds digit
Shift the MSB right four bits and use the result as the thousands digit
Each of the above operations on the BCD number may well be implemented on the processor as a single instruction, so it was very efficient. The normal number, by contrast, needs three divides, which is very expensive.
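The BCD version needs nothing but masks and shifts, which is the whole attraction (again a C sketch of the technique; the names are mine):

    #include <stdio.h>

    /* Print a four-digit packed-BCD value held in two bytes,
       one decimal digit per 4-bit nibble; no division required. */
    void print_bcd(unsigned char msb, unsigned char lsb) {
        putchar('0' + (msb >> 4));    /* thousands: high nibble of MSB */
        putchar('0' + (msb & 0x0F));  /* hundreds:  low nibble of MSB  */
        putchar('0' + (lsb >> 4));    /* tens:      high nibble of LSB */
        putchar('0' + (lsb & 0x0F));  /* units:     low nibble of LSB  */
        putchar('\n');
    }

Calling print_bcd(0x20, 0x10) prints 2010, with each nibble dropping straight out as a digit.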
Elementary my dear Watson
My guess is that the processor has the ability to use BCD (Binary Coded Decimal) representation, with each digit in 4 bits. This would mean that, for example, the decimal number 76 would be represented by the hex number 0x76 (decimal 118), with four bits for the 7 and four bits for the 6.
Now some poor programmer has used an instruction to store the year (in this case 10) as BCD, so it is being stored internally as 0x10 (or 16 in decimal). However, when other subroutines access the date they treat it as an ordinary binary number and so read it as the year 16 instead of the year 10. The years 0-9 would be unaffected, as they are represented identically either way.
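The suspected mix-up is easy to demonstrate (a hypothetical illustration of this theory, not the actual card code):

    #include <stdio.h>

    int main(void) {
        unsigned char year = 0x10;  /* the year "10" written as packed BCD */
        /* a routine that forgets the encoding reads the raw binary value: */
        printf("read as binary: %d\n", year);                             /* 16 */
        /* decoding the nibbles properly recovers the intended year: */
        printf("decoded as BCD: %d\n", (year >> 4) * 10 + (year & 0x0F)); /* 10 */
        return 0;
    }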
How fast they can fix this will depend on how much of their code treats it as BCD and how much doesn't. Since BCD is rarely used today, my guess is that the only code using BCD is the part that stores the date.
Now, since I'm wearing my deerstalker and smoking my pipe (my New Year's resolution regarding the opium still stands), I will speculate as to who is responsible. This is the type of task that is given to new interns fresh out of college/university, i.e. "Write a subroutine to get the date from there and store it over here." Unfortunately, new interns tend not to know very much, so (s)he probably scanned the instruction set and used the first instruction (s)he came across that would do the job. Unfortunately for them, it happened to be a BCD instruction.
A definite fail for the programmer.
And you just proved his point. First, the date he put in was in a format unknown to us: it could have been DD-MM-YY, MM-DD-YY, YY-MM-DD or YY-DD-MM. He then says the date format everyone should use is YYYY-MM-DD. But given the illogical way Americans write dates, they would assume it was actually YYYY-DD-MM. The format you applied to the date he provided was MM-DD-YY, whereas if he was actually writing it as a date it would most probably have been DD-MM-YY, since it wouldn't have been in American format. But again, if it was in the format he thinks everyone should use, you and everyone you asked would once more have been wrong.
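The ambiguity is easy to show with a made-up date string (hypothetical; not the date from the earlier post, which isn't reproduced here):

    #include <stdio.h>

    int main(void) {
        const char *s = "10-01-02";  /* three two-digit fields, no hint which is which */
        int a, b, c;
        if (sscanf(s, "%d-%d-%d", &a, &b, &c) == 3) {
            printf("as DD-MM-YY: day %d, month %d, year %d\n", a, b, c);
            printf("as MM-DD-YY: day %d, month %d, year %d\n", b, a, c);
            printf("as YY-MM-DD: day %d, month %d, year %d\n", c, b, a);
        }
        return 0;
    }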