I remember COBOL, though fortunately, I never had to code in it.
I think that people who make statements like this significantly underestimate the complexity and ambiguity of natural language. There is a tendency to think that all terms are as well-defined as they are in mathematics.
Certainly, understanding language at the level of James Joyce or most "art" poetry is beyond the reach of current AI, and even everyday language is nearly as complex semantically.
I read a book about musical interpretation which claimed that the notation conveys only about 35% of the composer's intent; the performer has to supply the rest by learning about the context. Where the author got that number is anyone's guess, but it seems about right for understanding language as well. Of course, spoken language is even more ambiguous and incomplete than written language, which in turn is way more complicated than any coding language.
As an illustration, the dictionary used by WL has about 35,000 words and is roughly at the level of one of those cheap paperback dictionaries students used to use. To make any sense of English in its full expression, you really need the equivalent of the Oxford English Dictionary, which covers over 600,000 words, plus historical context and multiple definitions for many of them. As I recall, the definition for "is" runs for several pages.
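If you want a rough sense of the gap, here is a minimal Wolfram Language sketch. I'm assuming DictionaryLookup and WordList are the word lists in question, and the exact counts vary by version:

    (* rough size of the built-in English word lists; counts differ by version *)
    Length[DictionaryLookup[]]        (* words in the spell-checking dictionary *)
    Length[WordList["KnownWords"]]    (* words with curated WordData entries *)
    WordData["set", "Definitions"]    (* even a short, common word has many senses *)

The point isn't the exact numbers, just that whatever comes back is nowhere near the scale or depth of the OED's coverage.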
One of the primary complaints of programmers (not just newbies) is that the program did what they typed, not what they meant. We can tell a person "you know what I mean" and most of the time they do, despite syntax errors and ambiguity, because of a shared cultural history and years of experience. I don't think that a computer will have an equal level of understanding until it can share the same cultural history and experience.
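A toy Wolfram Language illustration of "typed, not meant" (the classic = versus == slip, assumed here just for the sake of example; the mistake itself isn't WL-specific):

    x = 3;
    If[x == 5, "big", "small"]   (* meant: compares x with 5 and gives "small", since x is 3 *)
    If[x = 5, "big", "small"]    (* typed: assigns 5 to x; the condition is 5, not True or False, so If stays unevaluated *)

A human reader glosses right over the missing character and supplies the intent; the evaluator, with no shared context to fall back on, cannot.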