For some reason I remember an odd feature of PL/1: Areas and offsets. If I am remembering correctly, you could allocate structures in an area and reference them by offset within that area. That stuck in my mind for some reason, but I never found a reason to use it. It struck me as a neat way to persist pointer-based data structures. And I don't remember seeing the idea in other languages.
Maybe the reason it stayed with me is that I worked on Object Design's ObjectStore. We had a much more elegant and powerful way of persisting pointer-based structures, but an area/offset idea could have given users some of the capabilities we provided right in the language.
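If I'm remembering the mechanism right, the trick is that links inside the area are stored as offsets from the area's base rather than as raw pointers, so the whole block survives being written out and reloaded at a different address. A minimal C sketch of the idea (the names and layout here are mine, not PL/1's):

```c
#include <stdio.h>
#include <stdint.h>

/* A toy "area": a contiguous block in which every cross-reference is
   a byte offset from the base, never a raw pointer, so the block can
   be copied, persisted, or mapped at a new address and the links
   still resolve. Offset 0 plays the role of null. */
typedef uint32_t Offset;

typedef struct {
    uint8_t bytes[4096];
    Offset  used;          /* starts at 8 so offset 0 can mean null */
} Area;

typedef struct {           /* a linked-list node living inside an area */
    int    value;
    Offset next;
} Node;

static Offset area_alloc(Area *a, Offset n) {
    if (a->used + n > sizeof a->bytes) return 0;   /* out of space */
    Offset off = a->used;
    a->used += n;
    return off;
}

#define DEREF(a, off) ((Node *)((a)->bytes + (off)))

int main(void) {
    Area a = { .used = 8 };

    /* Build the list 1 -> 2 entirely out of offsets. */
    Offset first  = area_alloc(&a, sizeof(Node));
    Offset second = area_alloc(&a, sizeof(Node));
    *DEREF(&a, first)  = (Node){ .value = 1, .next = second };
    *DEREF(&a, second) = (Node){ .value = 2, .next = 0 };

    /* A bitwise copy simulates reloading the area at another address;
       the offsets remain valid even though the base has moved. */
    Area b = a;
    for (Offset o = first; o != 0; o = DEREF(&b, o)->next)
        printf("%d\n", DEREF(&b, o)->value);       /* prints 1 then 2 */
    return 0;
}
```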
We take this for granted now, but at the time it was revolutionary. Partly we've done it through things like mandating Unicode and IEEE 754, but nowadays most of our languages also encourage portability. We think very little of moving an application from Windows on x86_64 to Linux on ARMv8 (apart from the GUI mess), but back when Cobol was being created, you normally threw your programs away (“reprogramming”) when you went to a new machine.
I haven't used Cobol in anger in 50 years (40 years since I even taught it), but for that emphasis on portability, I am very grateful.
You need special custom numerical types to come even close in, say, Java or C++ or any other language.
I guess you mean:
>digest -> digits
>loosing -> losing
Is that the same as BCD (Binary Coded Decimal)? IIRC, Turbo Pascal had that as an option, or maybe I am thinking of something else; sorry, it's been many years.
1100 in “regular” binary is 12 in decimal.
0001 0010 in BCD is 12 in decimal.
i.e., BCD is an encoding.
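For the curious, here's a minimal C sketch of packed BCD, two decimal digits per byte (the function names are mine):

```c
#include <stdio.h>
#include <stdint.h>

/* Pack a non-negative integer into packed BCD: two decimal digits per
   byte, least significant pair first. Returns bytes written, or 0 if
   the buffer is too small. */
static size_t to_bcd(uint64_t n, uint8_t *out, size_t cap) {
    size_t i = 0;
    do {
        if (i == cap) return 0;
        uint8_t lo = (uint8_t)(n % 10); n /= 10;   /* low nibble  */
        uint8_t hi = (uint8_t)(n % 10); n /= 10;   /* high nibble */
        out[i++] = (uint8_t)(hi << 4 | lo);
    } while (n > 0);
    return i;
}

static uint64_t from_bcd(const uint8_t *in, size_t len) {
    uint64_t n = 0;
    while (len--)   /* walk from the most significant byte down */
        n = n * 100 + (in[len] >> 4) * 10 + (in[len] & 0x0F);
    return n;
}

int main(void) {
    uint8_t buf[10];
    size_t len = to_bcd(12, buf, sizeof buf);
    /* prints "12 -> 0x12 -> 12": the BCD byte for 12 is literally 0x12,
       i.e. the hex digits read as the decimal digits */
    printf("12 -> 0x%02X -> %llu\n", (unsigned)buf[0],
           (unsigned long long)from_bcd(buf, len));
    return 0;
}
```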
High-precision numbers are more akin to the decimal data type in SQL, or maybe bignum in some popular languages. They differ from (say) float in that you are not losing information in the least significant digits.
You could represent high-precision numbers in BCD or regular binary… or little-endian binary… or ternary, I suppose.
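To illustrate the "not losing information" point, here's a small C comparison (using a scaled 64-bit integer as a stand-in for a proper decimal type):

```c
#include <stdio.h>

int main(void) {
    /* Binary floating point cannot represent 0.10 exactly, so error
       accumulates in the least significant digits... */
    double dollars = 0.0;
    for (int i = 0; i < 1000000; i++) dollars += 0.10;
    printf("double: %.6f\n", dollars);   /* slightly off 100000.000000 */

    /* ...whereas counting cents in a scaled integer (the idea behind
       decimal/fixed-point types) stays exact for the same workload. */
    long long cents = 0;
    for (int i = 0; i < 1000000; i++) cents += 10;
    printf("scaled: %lld.%02lld\n", cents / 100, cents % 100);  /* 100000.00 */
    return 0;
}
```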
My main beef, however, is that the last sentence in the section seems to suggest that the birth of Haskell left SML to die on the vine because suddenly everybody wanted only pure, lazy FP. That's just wrong. The reality is that these two branches of Functional Programming (strict/impure and lazy/pure) have continued to evolve together to the present day.
I put the blame solely on the management of Borland. They had the world-leading language, and went off chasing C++ and "Enterprise" instead of just riding the wave.
When Anders gave the world C#, I knew it was game over for Pascal, and also for Windows native code. We'd all have to get used to waiting for compiles again.
Would I be wrong in saying that SQL has what feels to me like a very COBOL-y syntax? By which I mean: I know it is not directly related to COBOL, but someone definitely looked at COBOL's clunky attempt at natural language and said "that, I want that for my query language".
I feel that the article should have made this a lot clearer, as so many people code in languages along the APL -> Matlab / R (via S) -> NumPy family tree.
Pascal, particularly the Delphi/Object Pascal flavor, is also still in widespread use today.
As an aside, the article you linked to is pretty obvious AI slop, even aside from the image ("blockchin infarsucture" and all). Some of the details, like claims that MIT is offering COBOL programming classes or that banks are using COBOL to automatically process blockchain loan agreements, appear to be entirely fabricated.
No.
You have to put this relative to projects started in other languages, at which point new projects started in COBOL are even less than a rounding error; the share probably wouldn't come out as anything other than 0 in a float.
edit: for ancient Greek to become a dead language, will we be required to burn all of the books that were written in it, or can we just settle for not writing any new ones?
Same with a programming language - if no one is writing code in it, it's dead.
(There are a few other threads with a smaller number of comments.)
"COBOL was one of the four “mother” languages, along with ALGOL, FORTRAN, and LISP."
Lisp isn't as widely used as, say, Python, but it's still something a lot of people touch every single day.