What’s the oldest, weirdest, nastiest, or most unusual language you’ve ever coded in?

fitten

Ars Legatus Legionis
52,251
Subscriptor++
At least in Commodore Basic, you needed a loop that read through the DATA statements and POKEd the raw values into memory. You then used SYS commands to call code at specific memory addresses.

I remember seeing many Basic program listings that were basically using Basic as a boot loader for an assembly program. The program was all DATA statements plus a line of Basic that did the READ/POKE loop and a SYS.
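Something like this, from memory (a minimal untested sketch; on a C64, 49152 is the classic free block at $C000, and the four DATA bytes here are just INC $D020 / RTS, so the machine-language "program" bumps the border color):

10 FOR I=0 TO 3: READ B: POKE 49152+I,B: NEXT I
20 SYS 49152
30 DATA 238,32,208,96

The real listings were exactly this shape, just with hundreds of DATA lines.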
 
  • Like
Reactions: zeotherm
Some people consider COBOL to be a weird language because, based on Grace Hopper's design for FLOW-MATIC,
COBOL statements have an English-like syntax, which was designed to be self-documenting and highly readable. However, it is verbose and uses over 300 reserved words. In contrast with modern, succinct syntax like y = x;, COBOL has a more English-like syntax (in this case, MOVE x TO y).
The Defense Dept. supported Hopper because "business" applications in the early 1960s were being coded in 2nd-generation assembly languages by programmers who knew no more than elementary algebra and who were in short supply. I'd done about a year of that myself, so learned COBOL in 1 week flat in 1969. Problems arose when such applications endured.

Such applications endured because they embodied particular organization requirements. They needed conversion, as here,
From mid-1994 to early 1996 I was the lead programmer on a project to convert the 2100+ programs of the NYC Financial Information Services Agency's Integrated Financial Management System from IBM's OS/VS COBOL to IBM's COBOL II. The project, which I think was projected to take a bit more than one man-year, ended up taking around five man-years.

First the bare-minimum background. FISA was established in the wake of the NYC 1975 financial crisis ("Ford to City: Drop Dead!") to solve two problems: (1) There was no city-wide budget vs. actual bookkeeping system. (2) Various city agencies were paying employees for whom they had no budget lines. FISA hired a prominent consulting firm to build two systems: (1) an Integrated Financial Management System and (2) a Payroll Management System that—in addition to doing all the usual things a payroll system does—hooked into IFMS to make sure that no un-budgeted employee got paid.

Both of these systems were written using IBM's OS/VS COBOL compiler, by different contractor crews. The underlying problem was that, because the OS/VS COBOL compiler was just a reworked version of IBM's original pre-ANSI-standardization F-level COBOL-65 compiler, it could handle statements that were IBM extensions rather than part of CODASYL COBOL-65. Some of these extensions fulfilled immediate needs that ANSI satisfied later with different syntax/semantics, but some of these extensions formed a source-level debugging sub-language (COBOL was originally designed for writing batch processing applications) that undisciplined programmers couldn't resist using for non-debugging code. IBM wrote its COBOL II ANSI-85-compliant compiler from scratch using recently-evolved CS techniques, and did not re-implement the F-level extensions. ....

IBM had found it necessary/helpful to add 1965 F-level extensions because of the hurried way COBOL had been adopted. But COBOL's weird English-language syntax made it easy to add such extensions, just requiring additional reserved words.
 
Last edited:

dmsilev

Ars Praefectus
5,410
Subscriptor
LabView is the language I have used that I liked the least. Partly because the codebase I got lumped with included "virtual instruments" (LabView for "module") that required three monitors to fit fully on-screen at once, and neither IT nor my PI would pay for that many screens for a lowly PhD student.
That’s shitty, albeit fairly common, LV coding style. Subroutines, people, subroutines.

In fairness, LV is one of the few languages in which, because of the wiring metaphor, it’s possible to write code that literally looks like a plate of spaghetti. Given the color-coding (which tells you the data type flowing through each wire), probably a pasta primavera.
 
  • Like
Reactions: zeotherm
There were, in fact, a few people who could follow "spaghetti code" with lots of GOTOs. One of them was a programmer named Jerry at the NYC Transit Authority. Another was (then recently-promoted) Captain Grace Murray Hopper, of whom I asked a question about GOTO in COBOL from the audience at a session of the 1973 Fall Joint Computer Conference. Her reply was "The problem is that people don't learn enough solid geometry". Evidently she didn't realize she had a rare talent.

In 1983-1984 I was a programmer for the Computer Sciences Corp., which had built an early time-sharing system whose primary user language was Fortran 77. I did my best to write structured code, but once I had to modify somebody else's program written in classic Fortran style. It was written with lots of three-way arithmetic IF branches, which I had great difficulty following.
 
Last edited:
  • Like
Reactions: educated_foo

BitPoet

Ars Legatus Legionis
21,424
Moderator
There were, in fact, a few people who could follow "spaghetti code"
You can write shit in any language; some of them are more conducive to it than others. I've worked on many C projects where there is only one exit from a function: everything sets a particular error and does a "goto end;" which unwraps everything cleanly and in one spot. It's wonderful. I've also seen an entire preprocessor macro called u, which takes no arguments, so you'd just get u; in the middle of a function with no explanation. I asked someone where it was defined in the code and what it did and got an "it's well known" answer.
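A minimal sketch of the pattern (the function and error codes are made up for illustration):

#include <stdio.h>
#include <stdlib.h>

int process_file(const char *path)
{
    int err = 0;
    char *buf = NULL;
    FILE *fp = fopen(path, "rb");
    if (!fp) { err = -1; goto end; }

    buf = malloc(4096);
    if (!buf) { err = -2; goto end; }

    if (fread(buf, 1, 4096, fp) == 0) { err = -3; goto end; }

    /* ... do the actual work on buf ... */

end:
    free(buf);              /* free(NULL) is a no-op */
    if (fp) fclose(fp);
    return err;             /* one exit, cleanup in one spot */
}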

As mentioned earlier Perl seems to attract people who want to slam their forehead into a keyboard and get cohesive code, but it doesn't have to be written that way.

Generally, as long as a person is not amazingly intelligent and coding at the utmost limits of their ability, you tend to be fine. Keep in mind that code has two functions: one is to make a computer do a thing, and the second is to make what you're doing easily understood by the person reading it.
 

TRX302

Smack-Fu Master, in training
73
I had to learn Algol in school. We wrote our "programs" out in longhand, and the instructor graded them. There was no computer to run them on.

In the early '80s I played with some 8-bit home computers, most with their own wildly individual versions of BASIC. The weirdest was probably the one in the Sinclair ZX81.

In the '90s I did some stuff in TCL. I have recently resumed; TCL has come a long way since then, and it's a perfect fit for the problem set I'm working with. But I wasn't particularly proficient with TCL way back when, and had forgotten... basically everything. I've been a Pascal programmer for a very long time, and got used to having all sorts of high-level functions to play with. TCL is much different in how it does things. It's not really hard, it's just that I've become set in my ways. TCL is tiny, and has many interesting tricks up its sleeve. And for a language that's supposed to be next to dead, pieces of it sure get around; SQLite began life as a TCL extension, and its Tk graphics toolkit has been ported as Perl/Tk and as Tkinter under Python.

I dabbled in FORTH, once upon a time. I was enthusiastic about the concept, but the implementations were... crude. And while it probably worked well for machine control, which it was originally designed for, it was a poor fit for the real-world software I wanted to write. TCL reminds me a lot of FORTH, except you can write real programs with it.
 
You can write shit in any language; some of them are more conducive to it than others. I've worked on many C projects where there is only one exit from a function: everything sets a particular error and does a "goto end;" which unwraps everything cleanly and in one spot. It's wonderful. ....

....

Generally, as long as a person is not amazingly intelligent and coding at the utmost limits of their ability, you tend to be fine. Keep in mind that code has two functions: one is to make a computer do a thing, and the second is to make what you're doing easily understood by the person reading it.
The problem was that—up through the early 1970s—most programs were written either in assembly languages for 2nd-generation computers or in Fortran, which up through Fortran II was fundamentally a machine-independent assembly language with math-expression and array-handling capabilities grafted on. Thus many programmers, having been forced to make extensive use of GO TO by the language they were coding in, thought they were good at writing and reading "shit".

Following my experience finding an elusive bug in my RCA Phase 3 program, I read Dijkstra's 1968 Communications of the ACM letter "Go To Statement Considered Harmful" and realized that I wasn't that good with GO TO. So when I learned COBOL in 1969, my first program followed the "structured programming" guidelines @BitPoet describes as quoted above.
 
Last edited:
  • Like
Reactions: AndrewZ
Until last night I'd managed to forget my less-than-one-year 1972-73 job as a Project Control Analyst in the Finance Dept. of NYSE's Management Services Group. Besides designing a computer operations database for a proposed Finance MIS, I debugged existing Finance Basic MIS programs and developed procedures for billing computer time and developer services to NYSE and AMEX operating departments. The merger of our group into SIAC was followed by the layoff of all its employees.

The existing Finance Basic MIS programs had been written in PL/I, probably by an IBM systems engineer. What makes that language worthy IMHO of mention in this thread is, as the Wikipedia article says in its "Programmer Issues" sub-section,
Many programmers were slow to move from COBOL or Fortran due to a perceived complexity of the language and immaturity of the PL/I F compiler. Programmers were sharply divided into scientific programmers (who used Fortran) and business programmers (who used COBOL), with significant tension and even dislike between the groups. PL/I syntax borrowed from both COBOL and Fortran syntax. So instead of noticing features that would make their job easier, Fortran programmers of the time noticed COBOL syntax and had the opinion that it was a business language, while COBOL programmers noticed Fortran syntax and looked upon it as a scientific language.

Both COBOL and Fortran programmers viewed it as a "bigger" version of their own language, and both were somewhat intimidated by the language and disinclined to adopt it. Another factor was pseudo-similarities to COBOL, Fortran, and ALGOL. These were PL/I elements that looked similar to one of those languages, but worked differently in PL/I. Such frustrations left many experienced programmers with a jaundiced view of PL/I, and often an active dislike for the language.

The programs I inherited should have been written in COBOL, although IIRC the program that read computer time from IBM OS/360 log files had assembler-language aspects. I didn't have to debug those—just the developer-time-billing program. AFAICT the only reason these programs had been written in PL/I was because that language was a proud product of IBM.

IBM systems engineers' sales-oriented job was to assist the customer with technology. They certainly weren't application experts, as missing features of the developer-time-billing program showed. I had to do all my debugging and production runs during the late night or early morning, on the actual computers that were running NYSE on-line during the day.
 
Last edited:

johnny.5

Ars Scholae Palatinae
678
MUMPS. It’s the absolute nastiest 1960s garbage still in use, although it’s now “M” because that’s apparently less uncool.

Among its many wonderful properties: global persistent state. No keywords (everything is context-dependent, so “IF IF=PRINT PRINT ‘HELLO’” could be a valid statement).
Also you can abbreviate things as long as they’re unambiguous, so

I I=P P ‘HELLO’ would be valid.

Some statements last until the end of the line.

Your medical records system’s backend, and maybe your bank’s, is written in this.

It’s now sold as a database with a thin veneer of SQL on top and called InterSystems Caché. It’s ungodly expensive.

What do I win?
I came here to say this. I have a Caché developer certification from the medical record vendor you speak of, and MUMPS is the most god-awful language I've come across in the wild in my professional programming career, which is now going on 20 years. I'm sure there are plenty of others, but for me this is it. I hate it, and I can't understand why anyone would willingly choose to use it, but they do. There are others besides the big medical record vendor building commercial software around this thing. I've encountered 2 more in the last 4 months.
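For anyone who hasn't seen it, real MUMPS looks roughly like this (from memory, so treat it as a sketch; SET/IF/WRITE abbreviate to S/I/W, and ! writes a newline):

 S NAME="WORLD" I NAME="WORLD" W "HELLO ",NAME,!

All of that is one line, because in MUMPS the line is the unit of scope.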
 
  • Wow
Reactions: continuum

hanser

Ars Legatus Legionis
41,687
Subscriptor++
Some people consider COBOL to be a weird language because, based on Grace Hopper's design for FLOW-MATIC,

The Defense Dept. supported Hopper because "business" applications in the early 1960s were being coded in 2nd-generation assembly languages by programmers who knew no more than elementary algebra and who were in short supply. I'd done about a year of that myself, so learned COBOL in 1 week flat in 1969. Problems arose when such applications endured.

Such applications endured because they embodied particular organization requirements. They needed conversion, as here,


IBM had found it necessary/helpful to add 1965 F-level extensions because of the hurried way COBOL had been adopted. But COBOL's weird English-language syntax made it easy to add such extensions, just requiring additional reserved words.
The This Developer's Life podcast has an episode called "Dinosaurs" that seems relevant to this post, and to this thread in general.

 

hanser

Ars Legatus Legionis
41,687
Subscriptor++
You can write shit in any language; some of them are more conducive to it than others. I've worked on many C projects where there is only one exit from a function: everything sets a particular error and does a "goto end;" which unwraps everything cleanly and in one spot. It's wonderful. I've also seen an entire preprocessor macro called u, which takes no arguments, so you'd just get u; in the middle of a function with no explanation. I asked someone where it was defined in the code and what it did and got an "it's well known" answer.

As mentioned earlier Perl seems to attract people who want to slam their forehead into a keyboard and get cohesive code, but it doesn't have to be written that way.

Generally, as long as a person is not amazingly intelligent and coding at the utmost limits of their ability, you tend to be fine. Keep in mind that code has two functions: one is to make a computer do a thing, and the second is to make what you're doing easily understood by the person reading it.
I intensely dislike adherence to single exit as a best practice. I know why it was done, especially in languages like C with explicit resource management requirements where having multiple exit points could cause memory leaks. There are some weird people who still insist on it.

 

AndrewZ

Ars Legatus Legionis
11,390
I intensely dislike adherence to single exit as a best practice. I know why it was done, especially in languages like C with explicit resource management requirements where having multiple exit points could cause memory leaks. There are some weird people who still insist on it.

But can't you still have multiple return statements in a routine? I have to admit I never did this for style reasons.
 

hanser

Ars Legatus Legionis
41,687
Subscriptor++
I came here to say this. I have a Caché developer certification from the medical record vendor you speak of, and MUMPS is the most god-awful language I've come across in the wild in my professional programming career, which is now going on 20 years. I'm sure there are plenty of others, but for me this is it. I hate it, and I can't understand why anyone would willingly choose to use it, but they do. There are others besides the big medical record vendor building commercial software around this thing. I've encountered 2 more in the last 4 months.
AIUI, Epic uses C# for a lot of stuff now, too. But their legacy core is still MUMPS.

I guess it helps to have a cult-like culture for hiring, but they struggle to retain people beyond 3-4 years. College kids are easy to hire.
 

default1024

Smack-Fu Master, in training
34
I intensely dislike adherence to single exit as a best practice. I know why it was done, especially in languages like C with explicit resource management requirements where having multiple exit points could cause memory leaks. There are some weird people who still insist on it.

For C, a single return is a MISRA requirement. It does make things easier when you can rely on the end of the function always being reached. Not a big deal if it's a simple function, though.
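A quick sketch of what that style looks like (made-up example):

#include <stddef.h>

/* MISRA-style single exit: failures are recorded in a status
   variable and fall through to the one return at the bottom. */
static int checked_div(int num, int den, int *out)
{
    int status = 0;

    if ((den == 0) || (out == NULL)) {
        status = -1;          /* record the failure... */
    } else {
        *out = num / den;
    }

    return status;            /* ...but still exit in one place */
}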
 

default1024

Smack-Fu Master, in training
34
It's a little easier to have multiple returns in C++ compared to C. The destructors will clean up after themselves, so there is usually a lot less work to do before the return. Although you have to actually write in C++, not just do C things in a file that ends in .cpp.
It's not that it's harder in C to return in multiple places; all the underlying housekeeping is taken care of for you, so there is nothing to "do" but type return wherever you want (it has been that way in my experience for a very long time). The single-return rule is just to prevent confusion, similar to the guidance against using goto.
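To illustrate the C++ side, a minimal sketch (names made up): multiple returns are fine because the ifstream destructor closes the file on every exit path, so there's nothing to unwind by hand.

#include <fstream>
#include <optional>
#include <string>

std::optional<std::string> first_line(const std::string& path)
{
    std::ifstream in(path);
    if (!in.is_open())
        return std::nullopt;   // nothing was opened; nothing to clean up

    std::string line;
    if (!std::getline(in, line))
        return std::nullopt;   // ~ifstream closes the file here

    return line;               // ...and here, with no cleanup label
}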
/tangent
 
Last edited:

Haas Bioroid

Ars Scholae Palatinae
1,424
Subscriptor
I intensely dislike adherence to single exit as a best practice. I know why it was done, especially in languages like C with explicit resource management requirements where having multiple exit points could cause memory leaks. There are some weird people who still insist on it.


They say aviation regulations are written in blood; I say coding conventions are written in tears.
 
  • Like
Reactions: AndrewZ

AndrewZ

Ars Legatus Legionis
11,390
Ah, one-entry, one-exit. One of the tenets of the Structured Programming movement. I still get twitchy when I write guard clauses. It made for some really ugly code, normally. But if you apply modern "extract until you can't extract anymore" methodology, it wouldn't be so bad.
Reminds me of my first University programming class, structured Pascal. I was so annoyed when the edict was declared: No global variables, pass in all your parameters... The final grade was dictated by sitting down and writing a complete program in 4 hours.
 
  • Like
Reactions: zelmak
I wrote a couple of roughly 100-LOC programs in BASIC and FORTRAN on paper in the final examinations at high school. Got 200/200 in CS.
I still have many of my junior-college-level graded BASIC source printouts from 1985-ish. I also have my semester project from compilers class on a 3.5" floppy. Am I a hoarder? No, I can't possibly be a hoarder...
 

fitten

Ars Legatus Legionis
52,251
Subscriptor++
I still have many of my junior-college-level graded BASIC source printouts from 1985-ish. I also have my semester project from compilers class on a 3.5" floppy. Am I a hoarder? No, I can't possibly be a hoarder...

I don't think I have any printouts or files (on floppies) of any of the code I wrote in school. Probably a good thing ;) I wouldn't mind seeing some of it, though, just for nostalgia (and weirdly nostalgic, to hold something that I created/held back then as well... kind of like a few months ago when I found a stash of printouts of old D&D characters from ~1984 through ~1990... it was a great find but also had a bit of nostalgia in that I was literally holding papers that I had held in my kid hands back then).
 
  • Like
Reactions: hanser

Aleamapper

Ars Scholae Palatinae
1,284
Subscriptor
I still have many of my junior-college-level graded BASIC source printouts from 1985-ish. I also have my semester project from compilers class on a 3.5" floppy. Am I a hoarder? No, I can't possibly be a hoarder...
I don't have anything from university (they deleted our home drives before graduation day!) but I do have a shoebox full of AMOS Basic games I've not seen since I wrote them as a teenager. I think my old Amiga still works, too... now I'm thinking about getting a USB floppy disk reader and making ADFs or whatever from them... would a normal USB floppy reader work on Amiga disks?
 
  • Like
Reactions: zeotherm

ajk48n

Ars Centurion
218
Subscriptor
There was a database in the 90s (maybe as early as 80s) that used a visual programming language like a flowchart. Kind of fun to play with, a fucking nightmare to do anything of value with.
This isn't really the same thing, but maybe kinda the same idea.

Houdini is a program for visual effects. It's entirely node-based, and its nodes go from very low level up to high-level nodes that are wrappers for the low-level ones. It also has a well-thought-out way of viewing all the data that the nodes are passing around.

It's not explicitly a programming language, but I very much look at it as a visual programming language. Also not that obscure since it's used throughout the visual effects industry.

As for actual programming languages, the closest I have to weirdest are ExtendScript (a JavaScript-type thing to automate Adobe products) and MEL (the embedded scripting language for the 3D software Maya).
 

Hagen Stein

Ars Praetorian
567
Subscriptor
Back in school (early 1980s) we had a voluntary IT course for which I signed up. The lab was equipped with Apple ]['s, teaching language was UCSD Pascal, though we also had a brief look at Logo. As Integer BASIC was part of the OS (much like QBasic in MS-DOS), I learned that too.

As far as weirdest goes, yeah the already mentioned Tcl/Tk comes to mind. Come to think of it and if it counts as such, I also did a course (and got a qualification certificate) in CNC Heidenhain programming as part of an internship.
 
Last edited:

rain shadow

Ars Praefectus
5,444
Subscriptor++
some PL/I
My first real paying job was in PL/1.
  • It had that thing where a literal or a variable could be interpreted as a string sometimes or a number other times
  • pointers were untyped, so it was easy to point to the wrong thing and seg fault, or worse, access the wrong data
  • the accompanying library used exceptions quite often instead of just returning error codes
  • everything was dog slow so we called OS-specific functions instead of using the PL/1 library
  • and the calling convention was call-by-address
  • other than that it was not terrible
 
Last edited:

drogin

Ars Tribunus Angusticlavius
7,222
Subscriptor++
Yeah, it was fantastic. Well, for some definition of "fantastic".

At least it was easier to get access to the mainframe the PL/I code ran on. For the really old FORTRAN stuff, I had to go across to the other side of the hangar to talk to the old guy (whom I called the Wizard of Oz) that kept the VAX alive...
 
  • Haha
Reactions: zeotherm
Oldest: FORTRAN 66
Weirdest: SNOBOL 4 (Just because I could. I was on a "learn every language" kick.)
Nastiest: APL

Per Stan Kelly-Bootle:
There are three things a man must do
Before his life is done:
Write two lines in APL,
And make the buggers run.

In case you're unfamiliar with APL, here's Conway's Game of Life expressed in it:
life ← {⊃1 ⍵ ∨.∧ 3 4 = +/ +⌿ ¯1 0 1 ∘.⊖ ¯1 0 1 ⌽¨ ⊂⍵}

I can't believe I used to understand this.
 

fitten

Ars Legatus Legionis
52,251
Subscriptor++
(Just because I could. I was on a "learn every language" kick.)

Yeah, I did some of that, too, way back in the day. I stopped counting (and stopped that kick) sometime in the late 1990s. At the time, I think I had 36 languages on my resume that I had used to do non-trivial things (granted, several of those were assembly languages). These days, I only include about 6 or 7. I actually looked at SNOBOL very briefly but never did anything significant with it so it wasn't on my list.
 
  • Like
Reactions: AndrewZ

quarlie

Ars Scholae Palatinae
1,318
Subscriptor++
I occasionally (now very rarely) maintain a legacy RAD system, where the syntax, structure, IDE, and runtime deployments appear to be designed by maniacs. The prime example is that numbers can have a maximum of 38 decimal digits.

Why is that so bad? Here's a hint: the default field size is character 40...
Uniface?
 
  • Like
Reactions: Nazgutek
I just remembered a customer of C-E-I-R Inc.'s Cambridge MA branch who once used an even-weirder compiler to compile his Fortran II program—the Fortran compiler for the IBM 1401. Because the IBM 1401—designed for "commercial" programming in assembly language—was built with decimal arithmetic and no floating-point hardware (same WP article's "Architecture" section), Fortran compilers IBM built for it gave an application programmer the ability to specify FP precision.

Tom the customer worked for a company named AirTech, and he probably worked from home. Because data communication only became routine in the late 1960s, Tom used C-E-I-R's security-cleared truck driver to deliver his printouts and paid for C-E-I-R's junior Professional Services programmers to keypunch marked-up modifications to his Fortran II card deck. "Deck" in the singular, because he was developing the same application program from at least 1964 through 1968. C-E-I-R programmers (not AEC-cleared) didn't then know what the application was, but he later told us it was for simulation of a thermonuclear explosion. (When Tom left AirTech, my C-E-I-R boss hired him—only to have Tom go into a mental hospital.)

Tom had so much trouble with his AirTech program that at one point he had C-E-I-R's very-knowledgeable systems support programmers (possibly AEC-cleared) look it over. They later told us they'd found Tom had problems coding Fortran II, but Tom didn't accept that verdict. He later decided his problems were caused by inaccuracy of the IBM 7090 double-precision FP arithmetic, so I suggested compiling (probably with the Fortran IV compiler, which was upward-compatible with most Fortran II) some source on the 1401—specifying a greater FP precision than the 7090's hardware provided. Results were no better.
 
Last edited:

Mat8iou

Ars Praefectus
4,859
Subscriptor
Weirdest? = LISP, MIPS, and TCL/Tk, but those were used in my undergrad program as teaching languages.
At one point I did a fair bit of stuff in Newlisp - because I already knew LISP and it seemed to be one of the few freely available versions for the PC that was relatively compact. IIRC, most of its user interface stuff was done in TCL and it was distributed with it - I always managed to avoid that though and stuck to creating console-based stuff.
 

quanticle

Wise, Aged Ars Veteran
199
Oldest? = BASIC
Weirdest? = LISP, MIPS, and TCL/Tk, but those were used in my undergrad program as teaching languages.
Weirdest that I actually used IRL? = ColdFusion circa the Macromedia timeframe.

Depends on the Lisp variant, I suppose, but isn't Lisp just as old as Fortran, and considerably older than BASIC? Richard Stallman was programming in Lisp in the 1970s.
 

Aleamapper

Ars Scholae Palatinae
1,284
Subscriptor
The deadline for this year's IFComp is rapidly approaching, and I thought I'd have a go at making an entry, so I'm learning Inform 7. It's not particularly old, and not particularly nasty, but it's definitely unusual.

Strangely fitting for a language for writing interactive fiction, it's a natural-language DSL, and it actually works, on the whole. Unfortunately, much like the interactive fiction you create with it, the ambiguities of English and the shortcomings of parsing it mean that figuring out how to say what you want to do can involve a lot of guess-the-verb, at least when you start.

It's well worth having a play with if you're into programming languages, though (I seem to remember Eric Lippert is a big fan?).
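For a taste, here's roughly what Inform 7 source looks like (a tiny made-up example, from memory, so the exact phrasing may be off):

The Kitchen is a room. "A small, tidy kitchen."
The table is a supporter in the Kitchen. An apple is on the table.
Instead of eating the apple: say "It's wax."

Every one of those sentences is code: the first two build the world model, and the last overrides the default response to a player action.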