I know that a lot of the Y2K effort/scare was somehow centered on COBOL, deservedly or not. (Heck, I saw a minor Y2K bug in a Perl script that broke on 1/1/2000.)
What I'm interested in is: was there something specific to COBOL as a language that made it susceptible to Y2K issues?
That is, as opposed to merely the age of most programs written in it, the consequent need to skimp on memory/disk usage on old hardware, and the fact that nobody anticipated those programs would survive for 30 years?
I'm perfectly happy if the answer is "nothing specific to COBOL other than age" - merely curious, knowing nothing about COBOL.
Yes and no. In COBOL you declared a numeric variable by stating exactly how many digits it could hold; for example,
`YEAR PIC 99` declared the variable `YEAR` such that it could only hold two decimal digits. So yes, it was easier to make that mistake than in C, where you would have an `int`, `short`, or `char` as the year and still have plenty of room for years greater than 99. Of course, that doesn't protect you from `printf`ing `19%d` in C and still having the problem in your output, or from making other internal calculations based on the assumption that the year would be less than or equal to 99.
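To make that concrete, here's a minimal sketch (the program name and the `WS-YEAR` field are made up for illustration) of how a `PIC 99` field silently drops everything but the last two digits of a year; it should compile with a modern compiler such as GnuCOBOL (`cobc -x`):

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. TWO-DIGIT-YEAR.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * A two-digit year, declared just like the PIC 99 field above.
       01  WS-YEAR    PIC 99.
       PROCEDURE DIVISION.
       MAIN-PARA.
      * A numeric MOVE keeps only the digits that fit in the field,
      * so the century is silently thrown away.
           MOVE 1999 TO WS-YEAR
           DISPLAY "1999 is stored/printed as 19" WS-YEAR
           MOVE 2000 TO WS-YEAR
           DISPLAY "2000 is stored/printed as 19" WS-YEAR
           STOP RUN.
```

Running it prints `1999 is stored/printed as 1999` followed by `2000 is stored/printed as 1900`, which is the Y2K bug in a nutshell: the declaration itself, not just the programmer's arithmetic, bakes in the two-digit assumption.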