Where to find 'Real' programmers online?

Sometimes I wonder: is that really for any practical reason, or is it really to stroke the vanity of the last few remaining guardians of "COBOL's legacy"?
:rolleyes:
It's probably because the code sprawl is so vast, and so boring, that no one else wants to do it.

The challenge is split into 2 parts:
  1. Learning COBOL (easy)
  2. Learning the existing system codebases (difficult)
I don't feel AI/LLMs are really up to task #2 yet. Sure, they can have a good crack at it, but no large corporation is going to risk even 1% of the code being defective (this is why they pay the legacy engineers high salaries).

What AI could do well is automatically document #2 in a way that allows a human to jump in and get bootstrapped quickly.
 
Yes, using AI as an analysis tool to help understand the existing code might be where it's most useful; that's what my friend said.
Indeed. And it is a really great tool for that. It's rarely brought up, because analysis, documentation and testing are seen as "boring" and don't feed the current "AI hype" on e.g. LinkedIn, but that really is the more realistic outcome. AI is already pretty good at generating tests to ensure the newbie humans are well guided.
 
Sometimes I wonder: is that really for any practical reason, or is it really to stroke the vanity of the last few remaining guardians of "COBOL's legacy"?
:rolleyes:
Definitely for practical, bottom-line reasons. Each mainframe hardware refresh cycle costs the companies that use COBOL on mainframes many millions, yet that is still vastly cheaper than the huge cost of migration to the cloud, let alone the risk of business disruption. IBM just had a bumper year selling the latest versions of its mainframe hardware, and the associated consultancy, to all the companies it has locked in to using it, which is many of the really big firms, especially the longer-established ones. It's still IBM's core business. They call it an "AI mainframe" now. Yeah, right.
Their turnover in 2025 was 67.5 billion dollars, so it's still big stuff. They are still a big firm, not as big as when they used to do $100 billion annually, but still pretty big.
"IBM Z up 67 percent, up 61 percent at constant currency." Z (ie, mainframe) hardware was their star performer last year.

I wonder how Elon's rewrite of the US Social Security system's millions of lines of COBOL into Java is going? It was going to be completed in a few months, I seem to remember him saying. It seems to have gone very quiet... 😂
 
Those geniuses did not know about the RFC that set the default date for "unknown birthdate" and claimed that there were tons of people over 120 years old collecting money. Same with elections. There is a huge problem with implied boundary conditions which are not part of any spec, because they are part of the language definition and runtime spec. I'm sure Grace Hopper would have some interesting and quite pointed things to say to these greenhorns. I would pay good money to be a fly on the wall then and there, but sadly the Admiral has steamed off to greener pastures.
 
The only real "use" of AI for COBOL can be to lower the salaries of COBOL programmers, i.e. blackmail. Like "We plan to start using AI, but we can offer you a job for $x,000".
 
The only real "use" of AI for COBOL can be to lower the salaries of COBOL programmers, i.e. blackmail. Like "We plan to start using AI, but we can offer you a job for $x,000".
Weirdly, they have had that strategy available to them for a decade in the form of outsourcing to India/China. They never really took it.

Possibly because the COBOL programmers would be more likely to just retire a few years early than take the salary cut.
 
Possibly because the COBOL programmers would be more likely to just retire a few years early than take the salary cut.
Or they are back at their old desk two months later, for twice the money, as a freelancer. That happened to a coworker of my dad's.

You can say about banks what you want, but they have a better sense of risk down the road than the run-of-the-mill beancounter.
 
Indeed. And it is a really great tool for that. Rarely its brought up because analysis, documentation and testing is seen as "boring" and doesn't really massage this current "AI Hype" on i.e LinkedIn but really is the more realistic outcome. AI is already pretty good for generating tests to ensure the newbie humans are well guided.
it is not any of those things, though. see, for example, https://www.tue.nl/en/our-universit...-summaries-suitable-for-studying-and-research

also have you actually read any of the tests these things turn out? more often than not they just test some trivial properties and print "everything was great". sometimes they just print out "everything was great" without testing anything.
 
grandpa Yes it does. And it has/had indent-sensitive semantics. (shudder)
Also, let me congratulate you on your signature, good sir. Now I have a nice stroll down memory lane.
 
it is not any of those things, though. see, for example, https://www.tue.nl/en/our-universit...-summaries-suitable-for-studying-and-research

also have you actually read any of the tests these things turn out? more often than not they just test some trivial properties and print "everything was great". sometimes they just print out "everything was great" without testing anything.
So technical documentation is quite a different thing to what you are referring to here.
Think more along the lines of a replacement for Doxygen rather than a comprehension assignment for Macbeth.
 
So technical documentation is quite a different thing to what you are referring to here.
Think more along the lines of a replacement for Doxygen rather than a comprehension assignment for Macbeth.
our experience is that everyone says "they're good for (this one thing i personally don't want to do)" and then the research into that thing shows that they are not, in fact, good at that thing, and nobody can in fact agree on the one thing they're good for. this leads us to conclude that they are not good for anything.
 
I have no reason to do any Fortran programming anymore, except that I have a copy of the Snoopy calendar program for IBM mainframe on my FreeBSD laptop. It worked for a while but recent versions of GCC Fortran and LLVM Fortran fail to understand such ancient code.

Might I consider taking on a COBOL contract at a bank for extra spending money after I retire? Certainly. I understand the money is good. Though they're working on AI maintaining ancient COBOL code; that remains to be seen.
Fortran has always been the preferred language of engineers, so it has acquired many useful features for engineering, including complex numbers and matrix processing. It also supports parallel programming using the MPI interface. It also has the KIND system, which makes it much easier to get the same results on different hardware. Rather than specifying REAL*4 or REAL*8, like we used to, you can tell it to give you a real variable with p decimal digits of precision and an exponent range of at least r. The compiler will use that to select the appropriate kind of variable on your particular machine. A minor, but handy, feature is printing numbers in engineering notation, like scientific notation except the mantissa is greater than or equal to 1 and less than 1000, and the exponent is a multiple of three (works nicely with kilo-, mega-, giga-, tera-, etc.).
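For anyone who hasn't touched Fortran since the REAL*8 days, here's a minimal sketch of both features mentioned above: the KIND system via the standard SELECTED_REAL_KIND intrinsic, and the EN (engineering) vs. ES (scientific) edit descriptors. This assumes any Fortran 90-or-later compiler (e.g. gfortran); the program name and variable are of course made up.

```fortran
program kinds_demo
  implicit none
  ! Ask for a real kind with at least 12 decimal digits of precision (p)
  ! and a decimal exponent range of at least 100 (r); the compiler picks
  ! a matching hardware type (typically IEEE double precision).
  integer, parameter :: wp = selected_real_kind(p=12, r=100)
  real(kind=wp) :: c

  c = 299792458.0_wp   ! speed of light in m/s

  ! ES = scientific notation: mantissa in [1, 10)
  ! EN = engineering notation: mantissa in [1, 1000), exponent a multiple of 3
  print '(a, es13.4)', 'scientific:  ', c   ! mantissa 2.9979, exponent +08
  print '(a, en13.4)', 'engineering: ', c   ! mantissa 299.7925, exponent +06
end program kinds_demo
```

The engineering form reads directly as "299.79 mega-": handy when the quantity naturally carries an SI prefix.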
 
What kind of online communities do hardcore fortran-77 operations research type folk hang out?
It might be tricky finding a community for FORTRAN 77, because these days most people are using Fortran 90, Fortran 2003, Fortran 2008, or Fortran 2018. FORTRAN 77 was the last Fortran to require fixed-format input, and there have been all sorts of improvements that make it more fun and useful to use.

The NASA Modeling Guru forums used to have some Fortran coders, but it got shut down due to budget constraints.

(A bit of trivia. Before Fortran 90, the language is officially spelled FORTRAN, for Fortran 90 and later, it is spelled Fortran.)
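To make the fixed-format point concrete, here is a small free-form sketch (Fortran 90 and later), with comments noting what FORTRAN 77 fixed form required instead; the program itself is just an illustrative sum.

```fortran
! Free-form source: statements may start in column 1, "!" begins a
! comment anywhere, and "&" continues a line.
! In F77 fixed form, by contrast: columns 1-5 were for statement labels,
! column 6 for a continuation mark, statements lived in columns 7-72,
! and a "C" in column 1 marked a comment line.
program free_form_demo
  implicit none
  integer :: i, total

  total = 0
  do i = 1, 10
    total = total + i
  end do
  print *, 'sum of 1..10 =', total   ! prints 55
end program free_form_demo
```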
 
True, like BLAS and LAPACK for linear algebra, and (my favorite) the Fortran Astrodynamics Tool Kit:
I still use the algorithms published in the Journal of Applied Statistics, mainly for the code quality and robustness: all the mathematical issues (convergence, handling of floating point, etc.) have been carefully thought through.

https://lib.stat.cmu.edu/apstat

I would expect there to be a lot of FORTRAN in some R packages, converted to C using f2c.

Believe it or not, there was an operating system written in FORTRAN back in the days of minicomputers:

https://en.wikipedia.org/wiki/PRIMOS
 
Yeah, understanding the existing COBOL codebase is important, and it is part of the problem. But mathematically speaking, if a language is Turing-complete, it is quite possible to translate the COBOL codebase into it. Should I mention Rust as a candidate, or will everyone laugh at that?
🤣
 
mechanical translations are always pretty fucked. check out the Project Xanadu Gold release: http://udanax.xanadu.com/gold/download/index.html

to wit:
we enhanced the system to automatically translate from a modified subset of Smalltalk, which we called XTalk, to a corresponding heavily macro-ized subset of C++ we called X++. This Smalltalk fileout came from this modified system, and will not filein to a normal Smalltalk system without a lot of work.
later on down the page, they say:
[M]ajor portions of it are the result of automatic translation from Smalltalk, this code is not maintainable. Think of these portions as compiler output (though it's not quite that bad).
 