COBOL's promise was that it was human-like text, so we wouldn't need programmers anymore. A lot like "low code" platforms, and now LLM generated code.
The problem is that the average person doesn't know how to explain & solve a problem in sufficient detail to get a working solution. When you get down to breaking down that problem... you become a programmer.
The main lesson of COBOL is that it isn't the computer interface/language that necessitates a programmer.
> COBOL's promise was ... we wouldn't need programmers anymore ... average person doesn't know how to explain & solve a problem
COBOL wasn't intended to be used by an "average" person but rather those with deep domain knowledge. They would know the business processes so well that they could transcribe it in COBOL with little or no need to learn how the computers worked. In some ways similar to analysts/data folks using SQL to communicate with databases.
While at it, let me share a few more aspects off the top of my head.
COBOL, and 4GLs in general, were primarily intended for building business applications: payroll, banking, HRMS, inventory management and so on. Even within that, the emphasis was more on batch processing, to reduce the burden on people doing routine bulk operations like reconciliation.
COBOL harks back to a time when there was no dedicated DBMS software, which is why you see so much focus on how files are organised, and the extensive verbs around files that somewhat resemble SQL today.
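For flavour, here is a minimal sketch of that pre-DBMS keyed file access (the file, record and field names are invented; roughly what GnuCOBOL accepts in free format, e.g. compiled with cobc -x -free):

identification division.
program-id. customer-lookup.
environment division.
input-output section.
file-control.
    select customer-file assign to "customers.dat"  *> an indexed file instead of a DB table
        organization is indexed
        access mode is random
        record key is cust-id.
data division.
file section.
fd customer-file.
01 customer-record.
   05 cust-id    pic 9(6).
   05 cust-name  pic x(30).
procedure division.
    open input customer-file
    move 123456 to cust-id          *> set the key, then read the matching record
    read customer-file
        invalid key display 'no such customer'
        not invalid key display cust-name
    end-read
    close customer-file
    stop run.

The SELECT ... ASSIGN / RECORD KEY / INVALID KEY machinery is doing roughly what a keyed lookup does in SQL, just welded into the language rather than handed off to a database.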
Getting anything you can use to construct a work plan, never mind a detailed feature list, out of clients can be a dark art.
To the point that I have repeatedly experienced, close to the end of a project, the client going “What do you mean you don’t handle a case I have failed to mention for the entire duration of the project?”
"The transaction consists of a credit stub and a debit stub. If the debit stub is missing and is of type X then we do A and if it is of type Y then we do B."
How to know what flavour the missing item was? Absolutely no mention of that...
The fact that they "know" a missing stub would have a type is because they actually have some more information than they let on, and this information is only known by the expert. For example, they know if the submission was from party A, it must be type X.
But that fact might not ever be recorded in the computer system, in a way that the old business process would've had a record of.
And this is just one small example - imagine something more complex!
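To make that concrete, the rule the expert finally articulates might amount to a handful of lines (data names invented for illustration):

if debit-stub-present = 'N'
    evaluate submitting-party
        when 'A' move 'X' to stub-type   *> party A only ever sends type X
        when 'B' move 'Y' to stub-type
        when other perform ask-the-expert-again
    end-evaluate
end-if

The hard part isn't writing those lines; it's getting the expert to admit that the 'when other' branch even exists.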
So realistically, the job of a programmer is to force an expert to articulate all of their assumptions. IMHO, the best way to do it is to be sitting with the expert, and observe exactly what they do.
If they do, then they by definition become a domain expert. It's just that this takes a while, and projects usually don't give enough time for such to take place unfortunately.
Ha! As far as I remember it was almost exactly this when we interrogated them (but it's been a while).
Or you give them a prototype of the program, and see what they complain about?
Occasionally, you'll randomly get something they accept - but only for a few weeks until they come across some missing capability for some other thing they never told you about.
Yes, I wasn't entirely serious.
Though you can get pretty far by doing some roleplay, where you pretend to be the computer/system (perhaps put up paper screen to make it easier to roleplay, and pass messages written on paper) and have the expert interact.
When you are solving a real problem, you will still receive complaints, but they will be much more constructive.
COBOL dates back to 1959, much earlier than the 4GLs, and the cited 1992/1999 articles make the point that 4GLs were poised to replace the likes of COBOL and FORTRAN, when in fact those dinosaurs (or rather nautili, since they're still living) turned out to outlive the 4GLs, except SQL (when counted as a 4GL).
But SQL has the exact same problem. Except for very trivial scenarios, you can't just be an expert and plop your expertise into a SQL query. You have to learn how to use SQL to use SQL.
As with any other tool one has to learn it to effectively use it. Some find the learning curve not worth it and stick with Excel which is OK. But the thing is even Excel has to be learned to make full use of its potential.
Ultimately every abstraction is leaky. There will never be a solution where you never need to understand how computers work under all circumstances. But my impression is that you can go a lot further in Excel before the stuff going on behind the scenes starts to get in your way? From what I have seen, Excel itself is more likely to get in your way before not knowing how computers work does.
I intuit this also is an intrinsic limit to LLM based approaches to "you don't need them expensive programmers no more"
with LLMs magically "generating the solution" you move the responsibility for concise expression of the problem up the ladder.
and then you "program" in prompts, reviewing the LLM-proposed formalization ("code").
In other words, the nature of "programming" changes to prompt engineering. Alas, you still have to understand formal languages (code)...
so there'll always be plenty to do for humans who can "math" :-)
Maybe this is a reflection of local conditions, I'm not sure, but it doesn't seem like the truly revolutionary changes require the solution to find a problem. It was immediately clear what you could do with assembly line automation, or the motor car, or the printing press.
To elaborate: in the bad old days you had one big engine, e.g. a steam engine, driving shafts and belts all around the factory. There was a lot of friction, and this was dangerous. So you had to carefully design your factory around these constraints. That's the era of multi-story factories: you used the third dimension to cram more workstations closer to your prime mover.
With electricity, even if you have to make your own, you just need cables, and you can install small electric motors for every task on every workstation. Now your factory layout becomes a lot more flexible, and you can optimise for e.g. material flow through your factory and for cost. That's when factories became mostly sprawling one-story buildings.
I simplify, but figuring all of that out took time.
"Let’s look at the brief history of computers. Best way to understand it’s probably an analogy. Take the electric motor. The electric motor was first invented in the late 1800s. And when it was first invented, it was only possible to build a very, very large one, which meant that it could only be cost-justified for very large applications. And therefore electric motors did not proliferate very fast at all.
But the next breakthrough was when somebody took one of these large electric motors and they ran a shaft through the middle of a factory and, through a series of belts and pulleys, brought…shared the horsepower of this one large electric motor on 15 or 20 medium-size workstations, thereby allowing one electric motor to be cost-justified on some medium-scale tasks. And electric motors proliferated even further then.
But the real breakthrough was the invention of the fractional-horsepower electric motor. We could then bring the horsepower directly to where it was needed and cost-justified it on a totally individual application. And I think there’s about 55 or so fractional-horsepower motors now in every household."
If GenAI now was like early electricity, we would know what we wanted to use it for, even if we weren't there yet. That isn't what it looks like to me, but I'd be curious to know if that's just where I'm sitting, metaphorically speaking.
Every company I have worked for had more work than hands for programming and other knowledge work. Capacity is valuable. Does anyone here see GenAI teams being spun up for "management" by a human? Or do we see fancy Google search / code completion?
I was talking about the need to re-imagine and re-organise how factories work, not about the physical factories themselves. So it's more like a 'software' problem.
> Does anyone here see GenAI teams being spun up for "management" by a human? Or do we see fancy Google search / code completion?
How would the two cases look different? If you have a human worker that uses GenAI to help her complete tasks (via something like fancy auto-completion of text, code etc) that previously took a whole human team, that's exactly what 'spinning up a team of GenAI for management by a human' would look like, wouldn't it?
It's just our framing that's different, and perhaps who that human is: you take someone who's familiar with the actual work and give her the tools to be faster, instead of taking someone who's more familiar with the meta-level work of managing humans.
I suspect that's because managing humans is a rather specialised skill in the grand scheme of things, and one that doesn't help much with telling GenAI what to do. (And, human managers are more expensive per hour than individual contributors.)
---
In any case, I agree that GenAI at the moment is still too immature to be trusted with much on its own. I hope more globally optimising AI like AlphaGo etc comes back in style, instead of essentially 'greedy' contemporary GenAI that just produces one token after another.
1) Every human programmer becomes the surgeon in Fred Brooks's surgical team model (https://en.wikipedia.org/wiki/The_Mythical_Man-Month#The_sur...) and AI provides the rest. In effect, all working human programmers are software architects in the sense that they exist in large companies. The unstated assumption here is that going from vague user input to a solution is roughly equivalent to AGI, and so is further out than anything on the immediate horizon.
2) GenAI is used as a sort of advanced template/snippet/autocomplete system.
The first one is a fundamental paradigm shift. Professional programmers don't cease to exist, but the profession becomes inherently smaller and more elite. The bar is higher and there isn't room for many perfectly intelligent people who work in the field today.
The second one is a force multiplier and is helpful, but is also a much more banal economic question, namely whether the tool generates enough value to justify the cost.
I have no complaint either way and I'm definitely interested in the next step beyond what we've seen so far. The hype implies that the first branch above is where everything is headed, hence the "death of programming as a profession" type articles that seem to be making the rounds, but that isn't what I've seen day-to-day, which is what prompted the original thought.
But I think a more subtle, harder-to-see aspect, that may well be bigger than all those forces, is a general underestimation of how often the problem is knowing what to do rather than how. "How" factors in, certainly, in various complicated ways. But "what" is the complicated thing.
And I suspect that's what will actually gas out this current AI binge. It isn't just that they don't know "what"... it's that they can in many cases make it harder to learn "what" because the user is so busy with "how". That classic movie quote "Your scientists were so preoccupied with whether they could, they didn't stop to think if they should" may take on a new dimension of meaning in an AI era. You were so concerned with how to do the task and letting the computer do all the thinking you didn't consider whether that's what you should be doing at all.
Also, I'm sure a lot of people will read this as me claiming AI can't learn what to do. Actually, no, I don't claim that. I'm talking about the humans here. Even if AI can get better at "what", if humans get too used to not thinking about it and don't even use the AI tool properly, AI is a long way from being able to fill in that deficit.
This is some years ago, but a friend of mine, trained in a 4GL that was still a procedural programming language, went somewhere that was using a higher level, model-based generation of code based on that language. It turned out they still needed a few people who understood how things worked beneath the hood.
I am deeply skeptical that human-language level specifications will ever capture all the things that really need to be said for programming, any more than they do for mathematics. There are reasons for formalisms. English is slippery.
On one hand you’re correct in that there will always be a need for programmers. I really doubt there will be a great need for generalist programmers though. The one area that may survive is the people who’re capable of transforming business needs and rules into code. Which requires a social and analytical skillset for cooperating with non tech people. You’ll also see a demand for skilled programmers at scale and for embedded programming, but the giant work force of generalist developers (and probably web developers once Figma and similar lets designers generate better code) is likely going to become much smaller in the coming decades.
That is basically what the entire office workforce is facing. AI believers have been saying for years that AI would do to the office what robots did to the assembly line, but now it actually seems like they're going to be correct.
So, if AI follows suit, we will witness the dumb (but very knowledgeable) AI start to supplant workers with questionable results; and then someone (or a team) will make a discovery to take it to the limit and it’ll be game over for large swaths of jobs.
This works for them because an MVP typically isn't a lot of code for what they need, and LLMs have a limited scope within which they can generate something that works.
The author mentions "4GLs" were all the rage in the early 1990s, but I doubt that that was true outside of the mainframe world. The 4GL movement, as a conscious movement, seems to have always been highly mainframe oriented (the Wikipedia article mentions reducing the amount of punched cards necessary for a program as initial goals). By the 1990s you could categorize many languages as 4GL, but I doubt this term was used with any enthusiasm outside of the mainframe world. It was the opposite of a buzzword.
1992 wasn't too long ago. Linus Torvalds had already released Linux, and Guido van Rossum was already working on Python. Perl was already gaining popularity, and Haskell had also seen its first versions released. The forefront of technology was already shifting from expensive workstations to consumer-grade PCs, and language designers gave little thought to 4GL concepts, even when they happened to design something that could qualify as a 4GL for personal computers (e.g. dBase, HyperTalk, AppleScript).
I agree that human-like text is a bad idea for most use cases of programming, but I think this is not why the 4GL movement failed; in fact most 4GLs weren't more "natural language-like" than the 3GL COBOL. I think the main problem was that the 4GL movement never really defined a new generation, or anything useful at all. The previously defined generations of languages introduced revolutionary changes: translation from friendlier assembly language to machine code (2GL) and compilation (3GL). The only change we can properly extract from the loose definition of 4GL is "put more features that used to be external routines or libraries directly into the language".
This approach worked out relatively well when the language was domain-specific. This is how we got some of the most successful 4GLs, like SQL, R and MATLAB. These languages bake data tables, statistics and linear algebra directly into their syntax and became very useful in their own niches. The concept of a general-purpose 4GL, on the other hand, was always destined to boil down to an overly bloated language.
But, yes, I agree that aside from the generally more verbose and sometimes unwieldy syntax, there wasn't really that much to it in practice. I did work with FoxPro, and the reason why it was popular was not because you had to write things like "ACTIVATE WINDOW", but because it had many things baked directly into the language that nicely covered all the common tasks a pre-SQL data-centric app would need - e.g. a loop that could iterate directly over a table.
It's interesting that we have largely abandoned this approach to software development despite its amazing productivity in that niche. I guess a large part of it is because custom-made software is much less common in general than it used to be.
Then VB 4.0 started to get popular around 1996 and ruled the roost...
So many technologies... does anyone remember 'SUPRA' from that era? (I think it was supposed to be a 4GL language/interface for mainframe databases.)
From my perspective, the standard libraries of languages like Python and Java, as well as effective package managers such as pip or npm or cargo, have raised the bar so high that it is difficult for old, specialist languages to compete in most cases. Although the security problems of the package managers give me some pause.
In my experience, most 4GL languages were aimed at microcomputers and did reasonably well. Others have mentioned FoxPro and dBase; 4D and FileMaker also slot nicely into this category. IMHO, they had great success in the back office of small businesses.
I have seen some effort to force SQL into this category, perhaps with the idea that a SQL database with stored procedures technically meets the 4GL definition.
The more things change, the more they are the same.
This blows my mind, since it seems like a fairly low level/terse language compared to more modern domain specific languages.
But in some sense they were dead right... since (I assume) that what "programming" meant at the time was being able to write raw machine code by hand on paper, and have it work - something few people can or need to do nowadays
I have heard others and myself describe COBOL in many ways, most involving creative expletive phraseology which would make a sailor blush, but "low level/terse language" is a new one to me.
> But in some sense they were dead right... since (I assume) that what "programming" meant at the time was being able to write raw machine code by hand on paper ...
LISP and Fortran predate COBOL IIRC.
I didn't mean to imply COBOL was anything close to the first programming language, only that I was speculating what 'programming' generally meant within computer culture at the time. I was not around at that time- but I strongly suspect that directly writing machine code and/or assembly was still common practice throughout the entire 1950s, whereas it is far less common nowadays.
I wonder what year Fortran overtook assembly and became the most popular programming language during that era? I suspect it was well after COBOL came out. Surely there is a lag time for any new programming language to become commonplace.
I couldn't find any data on that, but I was able to find that C was released in 1972, but took until 1982 to overtake Fortran, and until 1985 to overtake Pascal. I often forget how slow new things propagated through the culture in pre-internet times.
It depends on what type of programming was being done. What we now call "kernels" and "device drivers" were largely authored in assembly languages well into the 80's (depending on the machine).
> I wonder what year Fortran overtook assembly and became the most popular programming language during that era?
Fortran got its name from "formula translator" (or equivalent, depending on the source), so it quickly gained traction in the scientific/mathematics domain. As an aside, there are variants of BASIC which have much of the semantic mathematical capabilities of Fortran yet with better I/O support IMHO. None come close in performance AFAIK.
> I suspect it was well after COBOL came out.
COBOL took off in the business world, as it was intended to do, and remains prevalent in at least the insurance and banking industries to this day.
> ... I was able to find that C was released in 1972, but took until 1982 to overtake Fortran, and until 1985 to overtake Pascal.
K&R C was viewed by many as "portable assembly code", with Unix helping it to gain acceptance. ANSI C largely replaced use of K&R C in the late 80's.
Depending on the OS, anywhere from a small amount of "shim code" (such as interrupt handlers and MMU interaction) up to most of the OS remained in assembly well into the late 80's.
This is all to say that:
> ... directly writing machine code and/or assembly was still common practice throughout the entire 1950s
Is still common practice in OS development, albeit to a lesser degree, if directly writing machine code is omitted.
HTH
Correct. Fortran, LISP, and COBOL were invented in ‘57, ‘58, and ‘59, respectively.
Yes, but the ideas behind LISP were older still: Church's lambda calculus was conceived in the 1930s.
A bit like writing enchantments to force demons to do your bidding.
But without the cool chanting and poetic language; just like cyberpunk was realized without the vivid imagery and neon lights :(
The 21st century never ceases to disappoint. It’s a cheap, low budget and dystopian version of what we imagined.
When you're graduating students from high school who go into college as engineering hopefuls who can't solve X - 2 = 0 for X, what hopes does the average individual have for solving programming problems?
All the source code is available and theoretically I could make changes and compile it up. The language itself is basically just plain procedural code but with SQL mixed right in -- somewhat like dBase or FoxPro but worse. I think the compiler produces C code which is then compiled with a C compiler, but it's been a while since I looked into it. It requires a version of Korn shell for Windows as well.
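For anyone who hasn't seen that style, it's the same general idea as embedded SQL in COBOL, where the query sits inline in the procedural code and host variables get a colon prefix. This is just an illustrative fragment with invented table/column names (it needs an SQL precompiler such as DB2's or ocesql), not that vendor's 4GL:

working-storage section.
exec sql begin declare section end-exec.
01 ws-cust-id    pic s9(9) comp.
01 ws-cust-name  pic x(30).
exec sql end declare section end-exec.
procedure division.
    move 42 to ws-cust-id
    *> the precompiler rewrites the exec sql block into library calls
    exec sql
        select cust_name
          into :ws-cust-name
          from customers
         where cust_id = :ws-cust-id
    end-exec
    display ws-cust-name.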
I have no evidence to say that Apple uses Linux, but businesses gotta business, so it isn't a big bet to make.
Doubtful. Surely they would know macOS is XNU-based?
I work for Red Hat in the kernel maintenance team.
Edit: I just realised you were doubting the correctness of the claim that it was BSD-based, not whether I knew it.
No, you just switched usages of the word "base" mid-conversation so you could say other people are wrong?
This conversation isn't advancing anyone's understanding. It's just pedantry.
The original assertion was: "Apple is BSD based". While we did move to assume Apple means macOS (iOS, et. al), we stayed the course with the remainder. There is nothing about macOS that is BSD-based. Containing some BSD code does not imply that it is the base. macOS also contains curl code. Would you say macOS is curl-based?
Regardless, what you may have missed is the additional context the followed: "not Linux". The parallel to Linux in macOS is XNU. Therefore, if other systems are Linux-based as we are to infer from the original comment, then macOS is XNU-based, not BSD-based. Yes, XNU contains some BSD code, but it is not BSD. It is very much its own distinct kernel maintained independently of BSD-adjacent organizations.
> This conversation isn't advancing anyone's understanding. It's just pedantry.
It could advance someone's understanding if they were open to seeing their understanding advance. I understand not everyone is accepting of new ideas and that many fear learning something new.
A decent chunk of the kernel was directly lifted from FreeBSD (and in bizarrely stubborn '90s-era design philosophy fashion, glued to Mach); some older stuff from NeXT came from earlier BSD codebases. I see 155 files in the ssh://git@github.com/apple-oss-distributions/xnu.gi repo that still have FreeBSD CVS version tags on them for some reason.
There is no curl code in the kernel. (If you want to be truly pedantic, and I see that you do, there is one shell script in that repo that assumes curl is installed.)
Sure, just as we already discussed at the very beginning of our exchange. Glad you were able to learn something from our discussion, even if it has taken you an astoundingly long time to get there. Here I was starting to think you were one of those who reject learning. I am happy to learn that you're just slow.
But they were not.
https://redmonk.com/sogrady/2024/09/12/language-rankings-6-2...
The initial hype has died off and that's OK. The hype cycle is inevitable for all languages. Also, predictions rarely happen, mostly because the landscape has changed. Mainstream programming languages can no longer die like Cobol did.
E.g., Java has been dying ever since 2001, surviving the dotcom bubble, .NET, the P in LAMP, Ruby, JS, or Go. Python was supposed to die on its version 3 migration, with people supposedly moving to Ruby.
FWIW, Scala is the world's most popular FP language, it has good tooling, and libraries, and Scala 3 is a wonderful upgrade.
Personally I feel that scala has too much in the language and the compiler is too slow. The tooling is pretty good but it is finicky and ends up getting slow and/or unreliable with larger projects. Even if I were to restrict myself to a small subset of scala, I would still be unsatisfied with the long compile times which was the primary reason I decided to move on.
I don't know if I agree with your contention that languages can't die like COBOL. I think you can relatively easily keep a legacy scala system up, put it in maintenance mode and write new features/products in something else. That is what I expect is already happening with scala and that this trend is likely to accelerate. Keep in mind also that Martin Odersky is nearing retirement age and it's really hard to imagine scala without him. He has much more power/control than the head of most languages.
Again, look at Java.
Ofc, there's always the question of what happens with a market that isn't constantly growing due to zero-interest rates phenomenon. I guess we'll see, but IMO, that's problematic for newer languages, not established ones.
I too am a contributor of very popular libraries and am very familiar with the ecosystem. One thing to keep in mind is that the language's culture has evolved. When I picked up Scala, back in 2010, the Future pattern and Future-driven libraries were all the rage. Whereas nowadays people prefer alternatives, which now include blocking I/O (Loom), with Future-driven libs being a risk going forward.
> overwhelmingly driven by existing large enterprise codebases
That happens with all mainstream languages, but it's a feedback cycle. The more popular a language is (in large enterprise codebases), the more it will get used in new projects, for obvious reasons. People want to get shit done and to have good ROI and maintenance costs. Therefore, the availability of documentation, tooling, libraries, and developers helps, in large and small projects alike.
And yes, Java is quite fresh, IMO.
It seems to me the more popular a language, the more poorly written libraries are found in it, which soon starts to draw people away from what is popular to a new language that has a limited library ecosystem thinking they can fix the mistakes they saw last time and make a name for themselves in the process. Lather, rinse, repeat.
We've had a potential client ask for a PoC in Java 8, to integrate with their current system... But yeah, our product is deployed with Java 11 and since some dependencies have issues with 18, we'll likely stay that way for a few more years
Nearly all companies I worked for were developing new systems, tools, etc. Rarely I was doing maintenance on "existing larger enterprise systems".
Nowadays it is increasingly niche. Like COBOL there is still a lot of perl code out in the wild.
I wrote a _lot_ of Perl, starting with Perl4 cgi scripts in the mid 90s, then Perl5 and FastCGI and Apache ModPerl. I loved it as a language. But by the time I left that gig in 2008, nobody wanted Perl any more. I mostly drifted around PHP, Python, Ruby, and Javascript for a few years until moving away from full time coding and up (sideways?) into leadership and mentoring roles.
Interestingly I got _into_ the Perl gig when I bailed on a COBOL maintenance gig where it was clear nobody was at all interested in listening to how their 10+ year old custom COBOL warehouse management app (written by the company the boss's sister used to own) running on EOLed Wang minicomputers - was completely incapable of dealing with 4 digit dates for Y2K. I jumped ship to make that somebody else's problem.
Still, if you buy a brand new mac today, most of the executable scripts in the system are written in perl.
You can check it yourself by running:
file -bL /bin/* /usr/bin/* | cut -d' ' -f1 | sort | uniq -c | sort -n
As of 2024, macOS is essentially a Perl operation. The good thing with Bash etc. is that they are so bad you won't, and when you do it anyway, at least you get some whiplashes for it.
There are many others, of course, but those are the teams at places people have heard of, off the top of my head. It's far from dead.
What does seem to be dying are the framework-centric Play and Akka apps, and the non-Airflow raw Spark jobs out there. Now, a lot of that is because they were framework jobs that happened to originate in the Scala ecosystem - Scala was largely incidental and was chosen because of founding project members' preferences or due to the need to develop a commercial market, imho.
Examples?
The problem however is that I can't be bothered to roll out a JDK, and secondly if I did it might encourage someone else to start writing Java again internally. Risky payoff...
It does make me wonder about millions and millions of lines of Java out there; Java has more or less eaten the enterprise space (for better or worse), but is there any reason to think that in 30-40 years the only people writing Java will be retirees maintaining old banking systems?
it’s not even esoteric and difficult, just a lot of it without much structure visible to you.
The mainframe is turning into a middleware layer running on Enterprise Linux. We've containerized the mainframe at this point, and I mean that directly: e.g. running JCL, multiple CICS regions, all in COBOL that originated on z/OS, now running in k8s on amd64.
Even ignoring the needs of the super high end customers like banks (eg, cpus in lockstep for redundancy), being able to write your app and just know that inter-node message passing is guaranteed, storage I/O calls are guaranteed, failover and transaction processing is guaranteed, just raises the bar for any contender.
K8s is wonderful. Can it make all the above happen? Well, yes, given effort. If I'm the CTO of an airline, do I want to shell out money to make it happen, risk it blowing up in my face, or should I just pay IBM to keep the lights on, kick the can down the road, and divert precious capital to something with a more obvious ROI? I think their "no disasters on my watch/self preservation" instinct kicks in, and I can't really blame them.
HN thread:
Other places stopped development 20 years ago and surrounded the mainframe with now legacy middleware. A lot of the “COBOL” problems with unemployment systems during COVID were actually legacy Java crap from the early 2000s that sat between the mainframe and users.
But that's the thing, we are at the point when "keep paying IBM" isn't the acceptable answer anymore.
HN is not the place to seek authoritative experience with something like COBOL.
A lot of these services are completely transparent to the application, but that doesn't mean they are totally transparent to the entire programming staff. The system configuration and programming is probably more complicated (and lower level usually, certainly YAML hasn't really caught on in the Mainframe world outside of the Unix environment) all things considered than something like k8s.
So that's where a lot of the complications come in to play. Every application migration is going to necessarily involve recreating in Kubernetes or some other distributed system a lot of those same automations and customizations that decades worth of mainframe systems programmers have built up (many of whom will no longer be around). And however bad the COBOL labor shortage really is, the shortage of mainframe assembly programmers and personnel familiar with the ins and outs of the hardware and system configuration is 10x worse.
It should also be noted that not everywhere that has a mainframe has this issue. There is a wide disparity between the most unwieldy shops and the shops that have done occasional migrations to new LPARs and cleaned up tech debt and adopted new defaults as the operating system environments became more standardized over time. In the second case where a shop has been following the more modern best practices and defaults and has fewer custom systems lying around, ... the amount of effort for a migration (but also in a lot of ways, the motivation to take on a migration project) is lessened.
The cases where some company is absolutely desperate to "get off the mainframe" tend to be the cases where the tech debt has become unmanageable, the catch-22 being that these are also the cases where migrations are most likely to fail, for all of the reasons mentioned above.
That applies everywhere.
Your parent comment has managed to stuff a mainframe in a container and suddenly hardware is no longer an issue. COBOL is well documented too, so all good, and so too will be the OS they are emulating. I used to look after a System/36 and I remember a creaking book shelf.
The code base may have some issues but it will be well battle tested due to age. It's COBOL, so it is legible and understandable, even by the cool kids.
If you lack the skills to engage with something then, yes, there will be snags. If you are prepared to read specs, manuals and have some reasonable programing aptitude and so on then you will be golden. No need for geniuses, just conscientious hard workers.
It's not rocket science.
The problem is -- it's very smart and unique, while organizations that have this kind of a problem don't want to depend on unique set of skills of a few highly capable individuals. Everything needs to be boring and people have to be replaceable.
In this paradigm, vendor Java with AWS lock-in is a cost, but in-house fancy stuff with COBOL on k8s done by smart people in house is worse -- it's a risk.
I'm working at one. You wouldn't believe the stories.
Will you let me know some of the names in the space so that I can research more? Some cursory searching only brings up some questionably relevant press releases from IBM.
IBM Wazi As A Service is supposed to be more affordable than the self hosted version and the Z Development and Test Environment (ZD&T) offering. ZD&T is around $5000 USD for the cheapest personal edition, so maybe around $2500-3500 USD per year?
If you’re a bank, you run COBOL. Estimates are 95% of ATM transactions go through a COBOL program.
But it doesn’t have to run on a mainframe! We’re adding COBOL to the GNU Compiler Collection. www.cobolworx.com.
I expect Ada will capture 0.05% of the market for the next 100 years.
Ada definitely does seem pretty cool from the little bit I have read about it. I’m not sure why it’s fallen by the wayside in favor of C and its derivatives.
This is much the same reason I'm highly skeptical of Rust as a replacement systems language to C. A multitude of very talented folk have been working on writing a second Rust compiler for years at this point. The simplicity and ease of bootstrapping C on any platform, without any special domain skills, was what made it absolutely killer. The LLVM promise of being easily ported just doesn't hold true. Making an LLVM backend is outrageously complicated in comparison to a rigid, non-optimizing C compiler, and it requires deep knowledge of how LLVM works in the first place.
In theory, something like Rust could do the job instead, but they'd still have to verify the entire chain. Rust is for the rest of us to get something half as reliable as that while also being able to write more than two lines of code per day.
Most of the new code from the past few years has been in Kotlin though.
I've got a soft spot for it as well because I actually used it. At work. On a PC. In the 90s. My assignment was to figure out how to get data into it, for which I ended up writing a routine that operated on floating point numbers as vectors of 1s and 0s and swapped the bits around to convert from Microsoft to IEEE format. While wearing an onion on my belt, of course.
Other newer array languages exist too - https://aplwiki.com/wiki/Running_APL if want to explore the current space.
It feels like we're getting into that space already.
It can also be upgraded in smaller chunks and finding enough developers for the tool is an important metric corporate is looking at.
If anything, banks are actively optimizing for developer experience to make sure 60% of new hires don’t run away in the first year. If anything, banks are better at navigating those kinds of structural risks; they were just slow to recognize that such risks exist.
If you have an episode of existential anxiety because of dat AI eating mijn job, getting a union job in a bank is a way to hedge this particular risk.
Um oh yeah, the reason we're hiring 20-year-olds is because we want to ensure we have lifelong support for the new system we're writing. Not because they're cheaper, they're still idealistic and naive, they'll work long hours for foosball tables and stacks, or anything like that...
Uhm... loyalty is punished and workers need to change jobs to keep 'market rate' wages. So dunno about that.
I think it is more about that newcomers to the job market are easier to abuse.
Well I hope they’re wise enough to not let any good employment attorneys catch wind because that’s blatantly illegal.
Discrimination is an almost "thought crime", meaning you can commit it entirely in your head. But the outcome is real. So it's very tough to spot, particularly when said discrimination also aligns with the most common societal biases.
Initially, and to some extent still now, it is verbose and, wording-wise, very similar to COBOL. Then somewhere in, I guess, the late 90s the OO paradigm wave came in, and it got "OO ABAP" with classes and methods. Now the cloud wave is influencing it, and ABAP has a new cloud flavor, "ABAP for Cloud", where most of the old constructs are not supported.
I don't think so. But it's pretty much guaranteed that a lot of the people who are complaining about COBOL today are writing systems that will be legacy in 30 years. And the generation of programmers then will be complaining about today's programmers.
Especially when I look at node or python with tons of external packages (.NET going the same way), I don't see a good long term future.
I am very much glad I wasn't alive at the time this was the state of the art.
Found the paper with original code here, it's for a Reinsch spline: https://tlakoba.w3.uvm.edu/AppliedUGMath/auxpaper_Reinsch_19...
I started programming in COBOL (circa 1990) and took the tutorial just for fun earlier this year.
I kinda suspect that if Java is still around in 30 years, what we call Java will be - at best - vaguely recognizable.
COBOL is alive in that it keeps changing from era to era, to the point modern COBOL looks rather little like the 1950s COBOL everyone instinctively thinks about when they heard the term. It's as if we were still programming in Algol because Java had been called Algol-94 or something.
MULTIPLY A BY B GIVING C ON SIZE ERROR STOP RUN.
> EBCDIC: /eb´s@·dik/, /eb´see`dik/, /eb´k@·dik/, n. [abbreviation, Extended Binary Coded Decimal Interchange Code] An alleged character set used on IBM dinosaurs. It exists in at least six mutually incompatible versions, all featuring such delights as non-contiguous letter sequences and the absence of several ASCII punctuation characters fairly important for modern computer languages (exactly which characters are absent varies according to which version of EBCDIC you're looking at).
It also does a reasonable job of generating working COBOL. I had to fix up just a few errors in the data definitions as the llm generated badly sized data members, but it was pretty smooth. Much smoother than my experiences with llm's and Python. What a crap shoot Python is with llm's...
Ruby is nice.
They feel great to type out when you're in the flow, but coming back and reading them grates on my nerves. Seeing the condition first means I load a logical branch into my mental context. Seeing the condition after means I have to rewrite the context of what I just read to become part of a logical branch, and now the flow of reading is broken.
And in any event it’s a very natural language pattern if you know what I mean.
Natural language patterns are conversational, and / or use pacing to create emphasis and imply meaning.
With code, we aren't reading a natural language. Injecting natural language conventions amongst things like some_string.chars.each { |c| ... } is entirely unnecessary and unhelpful in my not very humble opinion.
The infix if form is as readable as, if not more readable than, the prefix if in cases where only a single statement is guarded.
I mean we could code without any pesky natural language at all by using some kind of lambda calculus with de Bruijn indices, but I think most people would find that considerably less readable.
You had me at MULTIPLY A BY B
Kemeny and Kurtz described Fortran as "old-fashioned" in 1968! <https://dtss.dartmouth.edu/sciencearticle/index.html>
I am managing an ERP system implemented / went live in 2016. It's working on modern P10 hardware, which was released in 2021. The ERP system is continually updated by the vendor and customized by the client.
Even for COBOL running on an actual mainframe, which I think most HNers would think of 1970s dinosaur, most of the actual machines in production would be pretty new. IBM z16 was launched in 2022.
So they are "legacy systems" in the sense they're not written on a javascript framework which was launched last week, running on lambda instances in AWS :). But they are not "OLD" systems, as such.
The new Telum II processor (and certainly this also implies another big ISA update and new hardware cycle) was announced at Hot Chips just a few weeks ago for example. See:
https://chipsandcheese.com/2024/09/08/telum-ii-at-hot-chips-...
Indeed, mainframes are hard to get access to, and require a training by themselves, I have worked on Linux and Windows for years, and development on a Mainframe has nothing in common :-)
I think the problem of COBOL is not only the lack of COBOL developers, it is the lack of expertise in COBOL environments, because they have become obsolete (both on mainframe and proprietary tooling for Linux/Windows). By providing a modern environment on Linux for COBOL, our goal is to solve the hardest part of the problem, as learning COBOL itself is not so hard for existing open-source developers...
But what I remember most: the two other students were mainframe programmers, and they were just as baffled by my world as I was by theirs. It really was an entirely different computing paradigm, although 30 years later I probably have enough experience to make more connections than I could then.
Cobolworx is indeed working on a gcc frontend for COBOL. It's an impressive work (that was presented at FOSDEM this year), but less mature than GnuCOBOL, and tied to gcc, whereas GnuCOBOL can work with any C compiler (llvm, msvc, etc.) by translating COBOL to C.
Though we designed SuperBOL to work with GnuCOBOL, it could also be used with GCOBOL once it is officially stable.
If we can call a technology dead once no new business is built on it, then I think we can safely call COBOL dead (and the IBM 390x aka Z/OS platform along with it, for which "COBOL" is usually a proxy).
But if we say that anything still being used in production is not dead, then of course COBOL is alive and significantly more alive than many other things which are younger than it.
But this shouldn't really be taken as a positive point for COBOL or the mainframe ecosystem. It's simply a fact of life that organizations tend to stick with the first thing that works, and for the types of entities involved in the first wave of digitalization (e.g. governments, banks, airlines) that was usually an IBM mainframe along with the software that runs on it.
The problem with killing the mainframe is that no other platform really exists that can handle the amount of simultanous IO that you can get on a mainframe. Our mainframe easily processes 100m transactions per hour, with room to spare. And keep in mind that those transactions are for the most part synchronous, and will result in multiple SQL transactions per transaction.
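For scale, 100 million per hour is roughly 28,000 synchronous transactions per second, sustained, each fanning out into several SQL statements.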
Yes, eventual consistency is a thing, but it's a very bad match with the financial world at least, and maybe also military, insurance or medical/health. You can of course also partition the workload, but again, that creates consistency issues when going across shards.
Also, COBOL is far from dead, but it's slowly getting there. I don't know of a single bank that isn't actively working on getting out of the mainframe, though all projections i've seen says that the mainframe and COBOL will be around until at least 2050.
Give that a thought. That's 26 years of writing COBOL. Considering that COBOL programmers are also highly sought after, and usually well paid, one could literally still begin a career as a COBOL programmer today and almost get a full work life worth of it.
So in that sense, not much has really changed, and for the target market of the product, I don't think it makes sense as a good metric for whether the platform is dead or alive.
You don’t suppose any bank - or other large financial institution - might have standardised on Cobol for their core business flows/processes? In which case a new business-unit or “internal startup” team (e.g. a new category of insurance product) might very-well have some part written in Cobol so it integrates with the rest of the bank - or at very-least might be built-on-top of the org’s existing Cobol-running infrastructure (i.e. Not written in Cobol, but still runs on Z/OS because there’s no budget for buying new commodity x86 racks and the people to manage and run them).
What I mean is that nobody starts a business today and says "Ok, we need an IBM mainframe running DB2 and we'll have a bunch of COBOL, ReXX, and PL/I programs for handling our business logic".
But it has happened at least a little within the past couple of decades, most notably with China but there have probably been other examples in Asia.
We saw exactly the case of a new business unit being created, and like most other units it wouldn't get direct access to the lowest layer, and interact instead with a saner level of API and modules in the language of their stack.
The grandpa could create (using CICS), a very reliable and performant service that would call other services inside the same transaction. The platform would handle all the complicated stuff, such as maintaining data integrity.
Try to write AWS Lambdas that call each other within the same transaction.
Vendor lock-in from a single vendor? Wildly expensive capex and opex? Impossibility for people to know any of the tech involved without you sending them on a course to learn about it or them already having experience with it?
> Try to write AWS Lambdas that call each other within the same transaction.
Why is that your comparison? Was deploying to the mainframe as simple as throwing a .zip with your code at an API that you could give access to developers?
Is this a trick question? The answer is 'yes' to all three.
For AWS, it isn't. Outside of a few narrow exceptions, there is no vendor lock-in into a single vendor. (A container that can run into Lambda can run into Google Cloud Run just fine).
There is no capex with AWS.
There's a free tier and it's freely accessible to anyone. Anyone, and I mean anyone, can start learning it if they want to.
Good luck getting access to a mainframe to play around to see how and what works. Or finding any useful tutorials from this century.
The grandpa developer delivered a function, not a .zip file. Nowadays the developer needs to deliver a .zip file -- because the developer is responsible for wrapping the function in something that executes the function -- often SpringBoot in corporate environment.
He could use AWS Lambdas, but that locks them in. Also, you need to worry about restart times, and price/performance is high, because there are many layers of virtualization.
But the biggest loss is that in "best-of-breed architecture" (bunch microservices running in Kubernetes) the developers have in practice no way of guaranteeing data integrity. Systems are in perpetually inconsistent state (called "eventual consistency") and we just pretend the problem does not exist.
The grandpa developer could develop functions that would call other functions, and all would be executed within a transaction. It would be within his means to maintain data integrity.
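Something like this, as a rough sketch (the program, commarea and flag names are invented): inside a CICS task, one program links to another and both run in the same unit of work, so the whole thing commits or backs out together.

*> call another service in the same task and the same unit of work
exec cics link
    program('PAYPOST')
    commarea(ws-payment)
end-exec
if ws-return-code = zero
    *> commit everything both programs touched
    exec cics syncpoint end-exec
else
    *> or back all of it out
    exec cics syncpoint rollback end-exec
end-if

Every recoverable resource either program touches (VSAM, DB2, MQ) is covered by that one syncpoint; that is the integrity guarantee being contrasted with a mesh of Lambdas.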
For the individual developer, the situation is much better. I can get my Django application to be hosted on fly.io in no time and it is quite cheap. I think the cost of running services is temporarily subsidized by influx of VC money, but that will eventually change.
The irony is that we already had a memory safe and stable language in Cobol that was easier to read and understand than Rust. But, no one wants to use it so it is "dead" but it runs everything that made the modern age possible.
RUST:

use std::io;

println!("Enter number: ");
let mut input_string = String::new();
io::stdin().read_line(&mut input_string).unwrap();
let number: i32 = input_string.trim().parse().expect("Please enter a valid number.");
let result = if number % 2 == 0 {
    "EVEN"
} else {
    "ODD"
};
println!("The number: {}", result);
COBOL:

display 'Enter number: '
accept number
if function mod(number, 2) = 0
    move 'even' to result
else
    move 'odd' to result
end-if
display 'The number: ', result
But I find that hard to believe. Does COBOL really solve all the same problems Rust is intended to solve? Is it as performant? Can it interface with native code from other languages in the same way? Does it have a usable and sane package manager built on top of a module system that facilitates composability and backward compatibility? Does it have a way to describe the shape of data and errors as ergonomically as Rust's algebraic data types?
Genuinely curious: as I said, I don't know COBOL. I'd find it extremely surprising if the answers to all these questions are "yes," though. Just as there are reasons COBOL is still used, there are also (good) reasons new languages have been created.
Do they solve all the same problems? No, for example COBOL lacks a modern concept of concurrency within a single program. COBOL's concurrency features are based on task-level parallelism, which involves dividing a program into multiple tasks that can be executed concurrently.
Is it performant? Yes. COBOL is highly efficient particularly in handling large datasets and complex business logic and its compilers are optimized for reliability and speed.
Can it interface with native code? Yes.
Does it have a package manager? No.
Does it describe shape of data? No. Data structures in COBOL are defined using fixed-length records.
Note: I'm not a COBOL expert. I did learn it in college, though.
And that underpins most of the critical infrastructure in your country.
identification division.
program-id.
even-or-odd.
data division.
working-storage section.
01 num pic 9.
01 result pic x(4).
procedure division.
display 'Enter number: '
accept num
if function mod(num, 2) = 0
move 'even' to result
else
move 'odd' to result
end-if
display 'The number: ', result
stop run.
It's peculiar to call out Rust's syntax specifically when, like most other languages these days, it is mostly C-like (though with a sprinkling of OCaml). And syntax aside, Rust and Cobol have wildly different goals, so "just use Cobol" doesn't suffice to obviate Rust's purpose for existing.

I guess my post is getting misread as "just use Cobol" when it was more of an XKCD-like reflection, e.g. why did we all do that / keep doing that. We done did Cobol, and Rust. And one is "dead", but not really, and now here we are.
COMPUTE SIZE-NEEDED = LENGTH OF OBJ + LENGTH OF VARTAB * NUM-ELEMENTS
ALLOCATE SIZE-NEEDED CHARACTERS INITIALIZED RETURNING VPTR
SET ADDRESS OF VARGRP TO VPTR
MOVE NUM-ELEMENTS TO OBJ
MOVE BUFFER(1:SIZE-NEEDED) TO VARGRP
SET VPTR TO ADDRESS OF BUFFER
FREE VPTR
The compiler did normally warn for data bounds checking, so I figured it would in this case. If that's not the case anymore then I'm wrong.
It is estimated that there is 800 billion lines of COBOL code in production systems in daily use. That is a bit more than 100 lines.
This was why Y2K genuinely scared everyone and was a very real problem. The only reason we can look back at it and laugh now is that an army of engineers sat down and rewrote it all in the nick of time.
> army of engineers sat down and rewrote it all in the nick of time.
No way did all get rewritten. Where source was available, fixes were applied and systems retested.
True drama ensued for programs for which the source was no longer obtainable.
The company I was at during that time had programs that had been in production since at least 1960.
The other effort that took place was attending to the systems during the midnight boundary with everybody either in the office or on call.
The other strong observation was that the risks were very much not understood, with exaggerations both extreme and dismissive. Also not discussed in the popular press at the time was the extent that most of these systems were not truly totally automated.
It's performant, you can't take away that.
It's just that nobody understands how the systems work and they're ossified. Those systems are going to be emulated until our grandchildren take over because nobody can understand them well enough to craft a replacement. Juuuust up until an LLM rewrites them for us.
[edit] I mean those airline systems are so old that they don't support special characters in names; passenger names are two fixed-length fields (first name, last name), and title and middle name just get appended together.
So you get LASTNAME/FIRSTNAMEMIDDLENAMENTITLE on your bookings. And each of those fields is truncated lol.
and of course flight numbers are fixed at 4 digits, so we're running out of those.
Not exactly a great ad.
If these new fangled languages are so great, one day they can be legacy code too. :P
Yeah, there are fewer engineers in COBOL which is why it pays BIG bucks now. They desperately need someone to maintain that massive infrastructure that has been built up over 75 years that cannot be replaced easily or quickly.
"A ship in harbor is safe, but that is not what ships are built for."
I didn't say I wanted to code in it, though. I'd prefer in no particular order Kotlin, Python, Go, C++, Rust, Perl, C#, Java, Zig, etc. Anything really over COBOL myself. I'm part of the problem.
But, if I was hard up for money and wasn't getting nibbles for jobs? I could see getting into COBOL because there is a lot of money in it and always work available.
My statement stands though: we need to do better when designing the syntax of our languages. Cobol is disliked, yet simple and readable. What does that say about our new languages? How hated are our "new" language remnants going to be when few of us are still around to maintain them 50-75 years from now? And how easy are they going to be to pick up?
Addendum: I guess it won't matter if the singularity comes and just writes it all for us, of course. Then it will all just be machine code and we won't need these "only human" translation layers any longer.
I think new Cobol has 'allocate' and 'free' though.
It's like saying no gardener should be allowed near a garden that would choose a shovel over a pair of shears. Both have a place.
https://x.com/grauhut/status/1000017084435312642
Translated:
> "I found some COBOL at a customer site. Fine. Mainframe. Nothing special. > The last comment is from 1985. > Written by my mother."
It's only new builds on someone else's computer that have this modern issue.
If enough stuff needs it, people will keep it running. Java 8 will probably be in the same boat eventually if/when Oracle finally drops support.
I guess deploying it on a newer OS might make it challenging to install, unless it is a freshly compiled build?
The bigger problem: COBOL was an open standard but none of the implementations were open source for ages (I haven’t looked at GNU COBOL in years, but I think this is no longer the case) so nobody was building new things or experience when they had to pay to get started.
The SO Developer Surveys give some info on the job market for COBOL as it appears on the average-salary-versus-years-of-experience graphs, which I like because there are as many stories or reasons to explain them as you can think of.
In 2023 there were 222 respondents who averaged 19 years of experience and an average salary of $75,500. In 2024 the exact number of respondents is not shown, but it is likely similar based on the color code of the point; the average experience, however, had dropped to 17 years.
Elsewhere in the graph my favourite open question is: how come the over 2,000 respondents mentioning Swift average over 11 years' experience in a language that's only been public for 10 years?
2024 https://survey.stackoverflow.co/2024/work#salary-comp-total-...
2023 https://survey.stackoverflow.co/2023/?utm_source=so-owned&ut...
Seems a shame that people report Objective-C experience as Swift experience to such a great extent. These surveys are not resumes...
Perhaps it just "proves" that all data in these charts is questionable.
Not to be too macabre, but we need to transfer the knowledge while the people who have it are still alive, can remember, and can teach others to pick up the torch. And, let us call it what it is: while those who remain still have the desire to make the effort to transfer that knowledge.
It is easy to look back on y2k and think well that wasn't a big deal, but the only reason it wasn't is because people tirelessly worked to stop it. It is a testament to their success.
Regarding y2k, Robert Bemer tried to warn people in 1971, with 29 years left to go. And Peter de Jager published his attention-getting article "Doomsday 2000" in 1993 (in Computerworld), with a mere 7 years left, which finally lit the fire under everyone's ass. Keep in mind, there were still many original COBOL programmers and mainframe experts left to talk to at that time. And there was a lot less code to change back then than there is now.
Voting tabulation, insurance, utilities, administrative systems, banking, ATMs, travel, healthcare, social security, point of sale, IRS, pension funds, TACTICAL NUKES, hotel bookings and payroll programs. More than 800 billion lines of COBOL code in production systems in daily use. For better or worse, it is the very bedrock of our modern society.
If you want to replace it with something that you want to maintain instead, that's fine too but we're running out of time.
"Danger, Will Robinson! DANGER!" https://www.youtube.com/watch?v=OWwOJlOI1nU
"Listen the nothing will be here any minute. I will just sit here and let it take me away too. They look... like... big... strong hands.... Don't they?" https://youtu.be/symP4QT7wLU?feature=shared&t=24
Nah. We need to not transfer that knowledge, because the problem will be solved when the house is on fire.
But do not worry: nothing will happen until then. If those people cared, they would work to replace all that cruft, not enhance it to fix 2038.
Old man reminiscence following, skip if you are not bored:
I worked with SNOBOL and I thought it would be a long-term programming language. I also want to think that I had some tiny, minuscule hand in the development of RIPscrip pre-Telegraphix; alas, it went the way of the dodo bird.
I think I have forgotten more programming languages than I can count on my hands. Yet, I see them in some part every day in newer languages, "discovered" by some expert. "What has been will be again, what has been done will be done again; there is nothing new under the sun."
One language has come to my aid for the last 30-ish years: Perl. It has saved me many times.
(I'll tell you a secret - in the deep, deep bowels of a very, very large, jungle-named company, servers still have tiny Perl scripts running some core functions. I discovered this when there was a problem that I had to deep-dive into. I made a recommendation to change a hard-coded variable. The answer was "it will take two weeks". Why? Because no one knew what it would do or could read Perl. It was a 30-second job, including SDLC. Think xkcd Dependency https://xkcd.com/2347/ )
COBOL developers are literally dying out which has made for a competitive market for remaining talent. I've heard of some large consultants charging over $500/hr to their clients for a COBOL developer!
My first job after college was a software shop organized in a "services" model, where clients would have to sponsor teams to do feature dev or support beyond initial onboarding. It's been a long time and my memory is hazy, but as I recall I was expected to bill ~40 hours a week to clients and if I only worked 40 hours that week (being OT exempt, this was always the goal), my hourly pay came out to between 10-20% of what the company billed the client.
So $500/hr on the bill and $45/hr on the paycheck both manage to sound plausible, even at the same company.
It's like the old joke about the engineer being asked for an itemized bill: "Chalk mark: $1. Knowing where to put it: $4,999."
But yeah, that stuff is never going away as far as I can tell. It's just too risky to rewrite those core systems, and many a boondoggle has tried and failed.
The third is that COBOL is only the tip of the iceberg. As soon as I spent time learning about the code I was being asked to look at, I got into decades of evolving programming practises. Modern COBOL is multithreaded and probably uses DB2 and relational data models. COBOL from thirty years ago is probably single-threaded, only runs right on high-clocked single-execution models, drops down to hand-written s390 assembler regularly, and uses VSAM files with non-relational data. Older code still will be sharing data simply by banging it into memory regions for other code to read out of, because that's how you got performance back in the day.
Trying to identify how you'd pull a function out of that and move it off is somewhere between extremely difficult and impossible. It's usually so complicated and expensive it's easier to try and hire people who want to apprentice as mainframe programmers and keep the current codebase running.
And that's why so many neo-banks/fintechs are eating the lunch of the established banks left and right, and the same goes for insurance. The "old guard" is unwilling to pay the costs not just of upgrading off of mainframes (aka the rewrite work itself)... but of changing their processes. That is where the real cost is:
When you have 213,000 employees like BoA has, and everyone needs at least 10 hours of training and 2 weeks until they're familiar enough with the new system to be fully productive, that's like 2 million man-hours just for training and 16 million hours in lost productivity, so assuming a $50/h average salary it's around 900 million dollars in cost. Unfortunately for the dinosaurs, the demands of both customers and (at least in Europe) regulatory agencies, especially for real-time financial transfers, push the old mainframe stuff to its limits, while at the same time banks don't want to cede more and more of that cake to PayPal and friends, who charge quite the sum for (effectively) lending money to banks.
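As a sanity check on that estimate, here's a quick back-of-the-envelope recomputation in Python; the employee count, hours and $50/h rate are the assumptions from the paragraph above, not audited figures:

    # Back-of-the-envelope recomputation of the retraining estimate above.
    # All inputs are the commenter's assumptions, not real BoA numbers.
    employees = 213_000
    training_hours_each = 10        # mandated training per employee
    ramp_up_hours_each = 2 * 40     # ~2 weeks of reduced productivity, counted in full here
    avg_hourly_cost = 50            # USD, assumed loaded rate

    training_hours = employees * training_hours_each   # ~2.1M hours
    ramp_up_hours = employees * ramp_up_hours_each     # ~17M hours
    total_cost = (training_hours + ramp_up_hours) * avg_hourly_cost

    print(f"training: {training_hours / 1e6:.1f}M h, ramp-up: {ramp_up_hours / 1e6:.1f}M h")
    print(f"estimated cost: ${total_cost / 1e9:.2f}B")   # ~$0.96B, same ballpark as the ~$900M above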
In contrast, all the newcomers start with greenfield IT, most likely some sort of more-or-less standard SAP. That one actually supports running unit and integration tests automatically, drastically reducing the chance of fuck-ups that might draw in unwanted regulatory attention.
Source: worked there for many years, and built some of those integration systems.
Most fintechs aren't banks and partner with a Real Bank™ to provide the actual bank accounts. Fintechs are under much less regulatory scrutiny (for now—that may be changing with recent, high-profile screwups) and can move with much more freedom regardless of the tech stack they've chosen.
I work at a company with a large COBOL codebase and this has been mentioned in a few presentations about our modernization efforts.
I bet LLMs can make working with COBOL a lot easier and more fun than it ever was. I bet that's true for a lot of legacy stuff.
Like others have said, what's valuable is an understanding of the business and the legacy cruft that comes with spending time working at these kinds of companies/banks/etc., rather than knowledge of COBOL.
Generalists are usually offshored and are cheap.
In any case, they would have to pay well by a large margin to justify working on dead boring legacy systems, too.
Nowadays, even if someone is right about something and most people are doing it wrong, nobody will care to even discuss it unless the person making the statement is one of maybe 3 top influencers in that field.
It is a bit of a reality check when words like 'grandpa' are linked to an article from 1992! My brain is expecting the article to be from the 60's, 70's... or possibly 80's.
My world view: it is hard to imagine that a child born in 2000 is 24 years old now. Their grandparents could be as old as I am if they had children (and their children did too) at a young age.
Then I read at the end that he was 91 when he passed. He did well! Likely around my Grandad's age - and he managed to last an extra 24 years on this planet!
I remember reading a book on COBOL in my younger days learning to program, alongside BASIC, C, and Pascal. I might still have it. Despite reading it, I have (fortunately, I guess) never actually programmed in it.
I do agree with the writer that using the word "dead" in the programming language world is unrealistic. Some would argue that there are popular, modern languages out there that are "dead" - but they might get a huge push for one reason or another in the future. Could COBOL find a new, niche spot?
Maybe.
The punchline is that this was in 2018.
For any cobol devs here, we at https://cobolcopilot.com would love to hear from you
Looking back, COBOL would have been a better technical choice back then. Dataflex's metadata-based dynamic UI and report generation saved some simple, repetitive work, but much more effort was wasted working around its limitations.
I'll have you know I was approached for a FileMaker project not too long ago!
For those who haven't heard about it, ABAP (Advanced Business Application Programming) is the name of SAP’s proprietary, fourth-generation programming language :) It's SAP's main language. It's a direct descendant of COBOL, I'd describe it as a COBOL with OOP extensions.
Since SAP's ecosystem is sneaking everywhere, COBOL in its modern, very close incarnation (ABAP), gains new space!
If in any doubt, check some ABAP code. It's not simply influenced by COBOL, it's COBOL.
So despite its long death, it still seems to be kicking about. I doubt we'll ever get rid of it.
The story of languages like COBOL isn't that a language can become too deeply embedded and too expensive to ever replace. It just means the replacement will happen at a higher level - the business itself - and will take more time as a result.
In '92 I was maintaining COBOL code for a custom written warehouse management system for a wholesale boat bits distributor. The company that wrote it had lost almost all their COBOL devs, and were all in on Windows NT application dev.
I hate to admit it to myself, but I am in fact _just_ old enough that I could have cs grad aged grandkids, if I'd had kids early and they'd also had kids early. :sigh:
There is an ongoing effort to rewrite it in Java. This will ultimately take years and cost hundreds of millions of dollars. There is still a small but shrinking team of graybeards who can actually maintain the code, which has to be reprogrammed every year to accommodate changes to the tax code.
See, e.g., IRS IT Strategic Plan documents, publicly available.
Guess COBOL is alive enough to warrant this kind of support.
The old guard mostly still prefers ISPF though, 'cause they've become really fast in it, not unlike how a Unix greybeard is gonna prefer something like vim.
I'm sorta torn on it. I like using the 3270 environment cause I can get around to different places a little easier than via vscode, but if I'm editing a lot of large files, it's nice to be able to see more code at once and have them open in multiple side by side tabs. You can do that in ISPF, but it's a little more unwieldy and you have less dynamic control over font size.
The only real upside was, COBOL is so wordy, it forced me to improve my typing speed!
We might all die, but COBOL will sit happy in its steel-reinforced nuclear bunker.
Your last sentence explains why ASM is a language. ASM assembles down to machine code.
Ada I've never heard of, so maybe that one's dead?
If they're able to write WebAssembly compilers for all these languages, then they'll probably live forever!
The only reason punchcards are "dead" is bc the machines are gone or mostly unavailable...
He's been that for ~5 years.
I don't think it's going away any time soon.
This breaks down with the fact that it's really difficult, outside of really constrained spaces, to turn a "what" specification into a high-performance implementation, so any practical language needs to let you give some degree of control in the "how", and as a result, any modern language is somewhere uncomfortably between the 3GL and 4GL in the paradigm, not fitting entirely well in either category.
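A toy illustration of that "what" versus "how" split (purely illustrative, not tied to any particular 4GL): the declarative line only states the result wanted, while the imperative loop spells out the mechanics, which is also where the knobs for performance tuning live.

    # Toy "what" vs "how" comparison (illustrative only).
    orders = [
        {"customer": "A", "amount": 120.0, "status": "paid"},
        {"customer": "B", "amount": 75.0,  "status": "open"},
        {"customer": "A", "amount": 30.0,  "status": "paid"},
    ]

    # "What": declarative, says nothing about iteration order, batching, or memory.
    total_paid = sum(o["amount"] for o in orders if o["status"] == "paid")

    # "How": imperative, the loop is explicit, which is where hand-tuning would start.
    total_paid_imperative = 0.0
    for o in orders:
        if o["status"] == "paid":
            total_paid_imperative += o["amount"]

    assert total_paid == total_paid_imperative == 150.0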
I think we’ve had at least 4 generations of that idea that reducing coding time will be a game-changer: the COBOL/SQL era of English-like languages promising that business people could write or at least read the code directly, 4GLs in the 80s and 90s offering an updated take on that idea, the massive push for outsourcing in the 90s and 2000s cutting the hourly cost down, and now LLMs in the form being pushed by Gartner/McKinsey/etc. In each case there have been some real wins but far less than proponents hoped because the hard problem was deciding what it really needed to do, not hammering out syntax.
There’s also a kind of Jevons paradox at work because even now we still have way more demand than capacity, so any productivity wins are cancelled out. At some point that should plateau but I’m not betting on it being soon.
It wasn't true, although they did make some operations easier in tangible ways.
Rust is a wholly different kind of thing -- not easier than, say, Java, but lots more control with better guarantees. It's more a systems programming language. 4GLs were all application-focused.
* https://www.youtube.com/watch?v=nMtOv6DFn1U
One reason COBOL systems have been around for so long is because they encoded business rules that need to be understood if you want to try to transfer them to a new system. From the podcast (~16m):
> Like when we're working in unemployment insurance, again during the pandemic, my colleague was talking with the claims processors week over week and we're trying to dissect it and figure out what's going wrong and clear this backlog and one of these guys keeps saying, “Well, I'm not quite sure about that answer. I'm the new guy. I'm the new guy.” And she finally says, “How long have you been here?” And he says, “I've been here 17 years. The guys who really know how this works have been here 25 years or more.”
> So think about. You know, going from doing some simple cool, you know, tech app, you know, easy consumer app to trying to build or fix or improve upon a system that is so complex that it takes 25 years to learn how to process a claim.
> That's sort of, I think, what needs to be on the table as part of this agenda is not just “can the tech be better?” But can we go back and simplify the accumulated like 90 years of policy and process that's making that so hard to make?
Also an observation on how decisions are sometimes made:
> And I think that there's a deep seated culture in government where the policy people are the important people. They do the important stuff and technology, digital is just part of implementation, which is not just the bottom of a software development waterfall. It's the bottom of a big rigid hierarchy in which information and power and insights only flows from the top to the bottom.
> And so it's problematic in part because the people who are doing the tech are really just sort of downstream of everything else and the power and ability and willingness to step up and say “Hey, we probably shouldn't do those 6,700 requirements, we should probably focus on these 200, get that out the door and then, you know, add edge cases later.” There's no permission really to say that.
I think there would be some value to closing that feedback loop to give legislators the signal "You know, what you're considering is actually pretty fuzzy conceptually... We're discovering while considering how to code it up that you probably don't actually have good, clear definitions for all the terms in this bill." But the biggest thing to remember about government IT is the clientele, which changes the approach from commercial / industry software.
Google can optimize for the common case. Google can cut the edge cases. Google can change APIs on a whim.
Google's users choose to be Google's users and can go elsewhere if they don't like it.
Government citizens don't have that choice. And in general, people don't lose access to their food if Google effs up. Or go without their legally-deserved unemployment pay. Or go to jail because their taxes were mis-calculated.
In the government space, the "edge cases" are human beings, alike in dignity. The rules and policies end up complicated because human beings are complicated. And yeah, it ends up being some messy software. Because you can't just decide to ignore the law when it's inconvenient to plumb the information that the client has a child under the age of 18 who is not a dependent because they're an emancipated minor, but said emancipated minor does have a child of their own, and the client is the primary caregiver for that child while her parent is in prison... from here to there in the dataset.
That's all very true, but nobody ever codifies that. When the data doesn't fit the constraints of the form that aims to handle a reasonable generalized case, you simply get a phone call from a human in the loop. That human has a supervisor, and you can also go to court when they write your name with E instead of É and try to bullshit you about some kind of ASCII/EBCDIC nonsense like it's real.
In the end you have one dataset which tells who is a child of whom, another telling who has custody rights, and a third one making sense of the amounts and recipients of childcare subsidies. Maintained by different departments, and eventually consistent, or maybe not.
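A tiny sketch of the reconciliation problem that implies (every dataset, field and record here is hypothetical): three systems that should agree about the same family, and a check that flags where they don't so a human can step in.

    # Hypothetical sketch: three department datasets that should agree but may not.
    parentage = {("child-42", "parent-7")}            # registry: who is a child of whom
    custody   = {("child-42", "parent-9")}            # courts: who holds custody rights
    subsidies = [("parent-7", "child-42", 180.0)]     # benefits office: who gets paid for whom

    def questionable_payments():
        """Flag payments where the payee isn't the registered parent or lacks custody."""
        issues = []
        for payee, child, amount in subsidies:
            registered = (child, payee) in parentage
            has_custody = (child, payee) in custody
            if not (registered and has_custody):
                issues.append((payee, child, amount, registered, has_custody))
        return issues

    print(questionable_payments())
    # -> [('parent-7', 'child-42', 180.0, True, False)]  ...and a human in the loop sorts it out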
I don't think it's about avoiding programming 6700 edge cases, but more so that when you have an excessive number of cases, it's likely an indication that something is being missed. that could be due to a bug in software or due to unclear definitions in the legislation.
in those cases, rather than attempting to program it exactly, it might be better to bring a human into the loop.
and to me, that could be the point of having a tighter feedback loop. because otherwise the developers will just do their best, which will be buggy or incomplete. because they can't not do their job.
There is no permission to say that because your requirements are often set in black-letter law, and you didn't buy the right kind of suit to be present where they were decided over the last 40 years.
Estimated popularity:
PHP: 100% (reference)
Perl: 30%
COBOL: 25%
BASIC: 10%
TCL: 8%
ColdFusion: 5%
(I consider all of these dead, or "in maintenance mode".)

JS is not too pretty either, but I understand many believe it's the only lang the browser understands.