What would the perfect programming language look like?

iljitsch

Ars Tribunus Angusticlavius
8,474
Subscriptor++
I keep hesitating to partake in the thread What’s the oldest, weirdest, nastiest, or most unusual language you’ve ever coded in? My quick answer would be: all of them.

C, Java, JavaScript and no doubt many others suffer from the { } ; nonsense. Python shows this is not needed. Then again, Python just fights you when you want to do things that are simple in other languages, such as printing some text to the terminal without a newline. This is insane.
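(For the record, the incantation Python wants is a keyword argument on print, which is exactly the kind of thing you have to go look up:)

```python
# Printing without a trailing newline in Python: you have to know
# about print()'s "end" keyword argument (the default is "\n").
print("processing", end="")
print("...", end="")
print(" done")  # everything lands on one line: "processing... done"
```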

Strong typing can be good for avoiding mistakes, but it can also be a huge collection of hoops to jump through that don’t add anything. Implementing your own data types, such as heaps, in C is great when you need it. Higher-level languages don’t really let you, because their smarts get in the way. Then again, having to implement these in C because you only get simple arrays by default sucks.
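(Although, to be fair, sometimes the higher-level language just hands you the data structure. Python’s heapq, for instance, turns a plain list into a binary min-heap:)

```python
import heapq

# Python's stdlib heap: a plain list plus heapq functions (min-heap).
h = []
for value in [5, 1, 4, 2]:
    heapq.heappush(h, value)

# Popping always yields the smallest remaining element.
popped = [heapq.heappop(h) for _ in range(len(h))]
print(popped)  # [1, 2, 4, 5]
```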

So what would a perfect programming language look like?

I’d say: start with the best parts of C and the best parts of Python. Indentation for blocks by default, { } only for special circumstances. No more ;. Allow strong typing, but don’t require it. Allow "manual" memory management, but don’t require it.
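Python’s optional type hints are roughly what I mean by allowed-but-not-required: an external checker like mypy can enforce them, but the interpreter itself doesn’t.

```python
# Type hints are optional in Python: tools like mypy can check them,
# but the runtime itself ignores them entirely.
def double(x: int) -> int:
    return x * 2

print(double(21))    # 42, as the annotation suggests
print(double("ha"))  # "haha" -- the runtime doesn't enforce the hint
```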

I’m not familiar with Swift or Rust, but these are fairly new and highly praised, are they contenders for the ideal programming language crown? Or is there another runner up?

And please list your own requirements.
 

Ardax

Ars Legatus Legionis
19,076
Subscriptor

Oranging this because I want to follow along, but I don't have the mental bandwidth right now to think about this enough to really participate.

Right out of the gate I feel like worrying about braces and semicolons is kind of putting the cart before the horse. Not that syntax doesn't matter, it does, but... are braces really that big of a deal?

However, I'm a bigger fan of static typing than dynamic because I like having my errors show up at compile time rather than runtime. Others feel differently, but I'm not sure it's possible to design a single type system that really accommodates both from a "first principles" perspective. (Maybe it is, and people who do language design for realz know this stuff better than I do.)
 
  • Like
Reactions: lukem

fitten

Ars Legatus Legionis
52,251
Subscriptor++
There's not one, that's why we have so many. It's like the old saying goes...

standards.png
 

koala

Ars Tribunus Angusticlavius
7,579
Braces, whitespace, semicolons, those are nearly irrelevant to me.

Rust is pretty solid overall, but it has a steeper learning curve when you don't care about some of its more unique features. I completely subscribe to the meme of the "easy Rust".

Honestly, with Python and Java I already had 90% of what I need most of the time. Rust is very pleasant when you hit its good points, though. I'm even using it for shell scripting nowadays, and I prefer it to Python in some ways.

Most of the time, the ecosystem matters much more than the language. If I need something like Django's admin, I'm going to use Python, even if it's a project where I'd otherwise prefer Java. If I need to distribute binaries, Rust gains a ton of points. If it's a small thing, Python is likely going to win for me. Most of these decisions are unavoidable tradeoffs.

But I'll say it again: still, my favorite programming language in the world is sh and its relatives. It's the language that's best at doing what it was designed for. Prolog is a close second, because when you use it somewhere it's well suited for (which is quite rare), it's nearly magical (plus, I recently learned about DCGs and I love them). And SQL is the most underrated language ever.
 

fitten

Ars Legatus Legionis
52,251
Subscriptor++
And SQL is the most underrated language ever.
I've seen (and done) some pretty crazy stuff with SQL. For some things, it's extremely powerful. I've seen 50 lines of SQL (not C, Python, or other code calling libraries... just straight SQL) using nothing but built-in SQL features pull off some crazy processing. And then you get to the times when you find a hot spot, rewrite some of it, and get like a 5x speedup (or more... I once improved something from running 18 hours down to 15 minutes... but that was more a case of the original SQL not being very good to begin with).
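To give a small taste of that kind of expressiveness (a made-up toy table, driven from Python's built-in sqlite3 just so it's runnable): a running total computed entirely inside the SQL, no host-language loop in sight.

```python
import sqlite3

# Hypothetical example: a running total via a window function,
# computed entirely in SQL (requires SQLite >= 3.25).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (day INTEGER, amount INTEGER)")
con.executemany("INSERT INTO sales VALUES (?, ?)",
                [(1, 10), (2, 25), (3, 5)])

rows = con.execute("""
    SELECT day, amount,
           SUM(amount) OVER (ORDER BY day) AS running_total
    FROM sales
""").fetchall()
print(rows)  # [(1, 10, 10), (2, 25, 35), (3, 5, 40)]
```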
 

Apteris

Ars Tribunus Angusticlavius
8,938
Subscriptor
Roughly like Lisp. Clojure is reasonably close. If you need high performance, Extempore's XTLang does the trick. Type checking and such can be implemented with various plugins.
I like this answer, I was thinking of posting something similar.

But, is accessibility to the average Joe a criterion for the perfect programming language? Because as much as I like LISP, it's never going to be as understandable as an imperative language, for example.
 

Lt_Storm

Ars Praefectus
16,294
Subscriptor++
But, is accessibility to the average Joe a criterion for the perfect programming language? Because as much as I like LISP, it's never going to be as understandable as an imperative language, for example.
One of the things I like about Lisp and Smalltalk especially, though also things like Ruby and Python to a lesser degree, is how the relatively simple and flexible syntax means that you can tackle this kind of problem fairly directly. If your program is best described in imperative syntax, it's fairly simple to write something with a threading macro that looks just like imperative code. Similarly, for any other bit of syntax that would make your program easier for Joe Average to understand, the nature of the language lets you create that syntax. As such, the vast majority of a program written in LISP is typically much easier to understand than one written in an imperative language.

Of course, this does create a problem when Joe Average programmer is starting a new program from scratch and doesn't really know how to manipulate the syntax of the language to achieve what he wants. In that situation, programming in LISP is indeed less understandable than an imperative language. There is a bit of a learning curve for the process of bootstrapping the language to meet your problem.
 

koala

Ars Tribunus Angusticlavius
7,579
I don't understand why Lisp advocacy has evolved to be ineffective in most cases. (While I do understand why Haskell advocacy is ineffective.)

I normally read all the Lisp advocacy pieces that come my way. But it wasn't until my Rust poser phase, where I read Rust articles before playing seriously with the language, that macros became really attractive to me.

The Lisp advocacy I've read mostly revolves around:

  • How many things that normally are core language features are, in Lisp, stdlibby. This is very neat, but mostly the advocacy shows control structures I already have in most languages. I can see the appeal of being able to tweak those to my liking and introduce new ones. Likely I'm a Mort and my opinion is worthless, but wanting that is rare for me, and when it happens, I'm used to saying: oh well, that's life.
  • How you can DSL all the things. Neat, but the examples are rarely compelling.
  • The simple syntax, which a ton of people reject.

Whereas, when I read about Rust stuff that uses macros, I think they're very neat, but they also address pain points or bring me really nice comforts. All the formatting macros in the core Rust stdlib bring verification of the format string, which is something you hit constantly. The dbg! macro is fantastic. A ton of Rust libraries just revolve around macros and bring you new useful things (serde, xshell, clap are the ones that come to mind).

What is weird is that likely all those features exist in Lisp, but somehow I think I rarely read an advocacy piece that mentions them!

(Mostly I'm focusing on static checking- but as it has been mentioned, that can be done in Lisp too! Rarely see examples of that in Lisp advocacy either!)
 

koala

Ars Tribunus Angusticlavius
7,579
So I'm perfectly convinced that other than the syntax, which is a very personal choice*, Lisp can be great. And I think, other than the syntax, it's easy to write very readable declarative Lisp, and that you can give the average Joe the proper abstractions, as libraries, to do that.

But other than Clojure (which fixes some bits in the syntax to my eyes), I think Lisp is having much less direct positive impact on programmers nowadays. And its ideas are being more impactful as inspiration for other languages.

* Witness how Python syntax is so divisive- and ironically how I mentioned Java/Python syntax differences are irrelevant to me, yet my body rejects Lisp syntax.
 

Lt_Storm

Ars Praefectus
16,294
Subscriptor++
I don't understand why Lisp advocacy has evolved to be ineffective in most cases. (While I do understand why Haskell advocacy is ineffective.)

I normally read all the Lisp advocacy pieces that come my way. But it wasn't until my Rust poser phase, where I read Rust articles before playing seriously with the language, that macros became really attractive to me.

The Lisp advocacy I've read mostly revolves around:

  • How many things that normally are core language features are, in Lisp, stdlibby. This is very neat, but mostly the advocacy shows control structures I already have in most languages. I can see the appeal of being able to tweak those to my liking and introduce new ones. Likely I'm a Mort and my opinion is worthless, but wanting that is rare for me, and when it happens, I'm used to saying: oh well, that's life.
  • How you can DSL all the things. Neat, but the examples are rarely compelling.
  • The simple syntax, which a ton of people reject.

Whereas, when I read about Rust stuff that uses macros, I think they're very neat, but they also address pain points or bring me really nice comforts. All the formatting macros in the core Rust stdlib bring verification of the format string, which is something you hit constantly. The dbg! macro is fantastic. A ton of Rust libraries just revolve around macros and bring you new useful things (serde, xshell, clap are the ones that come to mind).

What is weird is that likely all those features exist in Lisp, but somehow I think I rarely read an advocacy piece that mentions them!

(Mostly I'm focusing on static checking- but as it has been mentioned, that can be done in Lisp too! Rarely see examples of that in Lisp advocacy either!)
So, here's the thing: when Lispers talk about how the language's syntax is mutable, they are talking about, among other things, macros. It's just that, given a language with inherently complex syntax, the role of macros as a mechanism for providing syntax is... less clear. As such, until someone bakes in Lisp for a while, they don't quite understand what mutable syntax really means from the same perspective. So the full message of how all this ends up being really, really useful for all sorts of tasks, depending on what a particular program / programmer values, isn't obvious until you know.

Given the direct link between syntax, semantics, and built-in data structures provided by Lisp, there are all sorts of amazingly powerful features which are all relatively easy to implement. This can include macros that make specific problems easier to solve correctly, making specific linters (such as a type checker) relatively easy to create, removing duplicate code, etc. It's such a flexible tool* that it's really hard to convey just how useful it is to someone unused to it. It's sort of like trying to explain how useful a smart-matter multi-tool is to someone who is an expert with traditional hand tools; those hand tools solve most of his problems quite readily.

That said, for examples, dbg is a pretty good basic obvious example. I'm not sure I have ever worked in a Lisp environment that didn't have it. Then you have macros built around resource management, such as with-open-file. Then you have threading macros such as ->, which can be used to write imperative code. There's a whole host of various macros for flow control and error handling. But, honestly, the biggest problem is that the list of uses is so incredibly common that you probably don't even notice when you use a macro in Lisp; I mean, variable assignment is typically just a macro. Once you get used to that kind of flexibility, not having it is a bit like having an arm chopped off.
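(For comparison: Python's with statement is a frozen, single-purpose version of the with-open-file idea. The closest you can get to defining your own is a context manager, not new syntax; a sketch, with names made up for illustration:)

```python
from contextlib import contextmanager

# A context manager guarantees setup/teardown around a block, like
# with-open-file does -- but unlike a macro, it can't introduce any
# new syntax; it only plugs into the fixed "with" statement.
@contextmanager
def logged(name):
    print(f"enter {name}")
    try:
        yield
    finally:
        print(f"exit {name}")

with logged("job"):
    print("working")
# prints: enter job / working / exit job
```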

I suppose the short version is: you remember every time you used a source code generator for something it was made for? Magic, isn't it? And how, not-so-occasionally, you find yourself doing something that is almost-but-not-quite what that source code generator does, so you have to do it the hard way and spend hours writing out almost the same code over and over again? Well, in Lisp, thanks to macros, that never happens, because in Lisp it is fairly easy† to write a new source code generator that does exactly what you need it to do. Better, the base syntax of Lisp is so simple and uniform that said new source code generator just works.

* Incidentally, this is why you rarely see static checking, especially as seen in C-like type systems discussed. Those type systems are typically so inflexible that they are crippling. As such, fairly few lispers want to work under one unless they are doing something where the code generation benefits are worth it, so, in that category of type checking, you get XTLang and nothing else. Of course, you will see some H-M type checkers, but, then you are basically working in Haskell.

† Sure, it takes a little knowledge to make it all work correctly; hygiene is critical. But, honestly, unlike C/C++ macros, this isn't rocket science. Even fairly junior programmers can get it right given a little guidance and know-how.

So I'm perfectly convinced that other than the syntax, which is a very personal choice*, Lisp can be great. And I think, other than the syntax, it's easy to write very readable declarative Lisp, and that you can give the average Joe the proper abstractions, as libraries, to do that.

This is where you are wrong: the simplicity and uniformity of Lisp's syntax is the secret sauce. It is what both super-charges macros and also makes them fairly easy to write. Because it is trivially parseable, it is comparatively trivial to do complex rewriting in macros, which makes the feature comparatively powerful. Give it something like Python's syntax, and suddenly the task becomes much harder. Even Ruby's syntax makes it difficult (though Ruby provides other nice tools to achieve similar-ish results). With something like Perl's or C's syntax, you get a nightmare of buggy, unreliable code out of it.
 
  • Like
Reactions: Apteris

Apteris

Ars Tribunus Angusticlavius
8,938
Subscriptor
But other than Clojure (which fixes some bits in the syntax to my eyes), I think Lisp is having much less direct positive impact on programmers nowadays. And its ideas are being more impactful as inspiration for other languages.
Yeah, the question of why the currently-popular languages are popular currently is a big one, and hard to answer.

I miss the feeling of writing LISP, though.
 

iljitsch

Ars Tribunus Angusticlavius
8,474
Subscriptor++
* Witness how Python syntax is so divisive- and ironically how I mentioned Java/Python syntax differences are irrelevant to me, yet my body rejects Lisp syntax.

I really hate that { } ; stuff in C and C-styled languages. I guess the ; sometimes does something useful, such as in a do { } while (...); but largely it’s just a waste of key presses. The { } for blocks is worse, because my brain can’t handle

Code:
this {
  style;
  }

It has to be

Code:
this
  {
    style;
  }

but that wastes a lot of space. So I’m all on board with Python’s indenting to make blocks. (Of course they go overboard, because extra indenting in the middle of a block is not allowed!)

But it seems to me that it should be easy enough to bring indenting for blocks to C, either as a compiler feature or using a preprocessor that adds the { } to make it normal C. Then you only need { } for blocks within a single line. Similarly, omit the ; at the end of a line and the compiler or preprocessor knows it should be there, except for conditionals. And then you only need them if you have multiple statements on the same line.
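As a toy sketch of that preprocessor idea (assuming one statement per line, consistent four-space indents, Python-style trailing colons marking block openers, and ignoring comments and string literals entirely):

```python
# Toy sketch: turn indentation into { } and add ; to statement lines.
# Assumes: one statement per line, 4-space indents, a trailing ":" on
# block openers, and no comments/strings containing tricky characters.
def braceify(src: str) -> str:
    out, stack = [], [0]
    for line in src.splitlines():
        if not line.strip():
            continue
        indent = len(line) - len(line.lstrip())
        # Close any blocks we have dedented out of.
        while indent < stack[-1]:
            stack.pop()
            out.append(" " * stack[-1] + "}")
        text = line.rstrip()
        if text.endswith(":"):            # block opener
            out.append(text[:-1] + " {")
            stack.append(indent + 4)
        else:                             # plain statement
            out.append(text + ";")
    while stack[-1] > 0:                  # close remaining blocks
        stack.pop()
        out.append(" " * stack[-1] + "}")
    return "\n".join(out)

print(braceify("if (x > 0):\n    y = 1\nz = 2"))
# if (x > 0) {
#     y = 1;
# }
# z = 2;
```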
 

dspariI

Smack-Fu Master, in training
33
For the most part, I like the indentation in Python, but it has one drawback: it makes it difficult to have multiline anonymous functions. There are hacky ways around it and very old (and very simple) proposals for proper support. Ultimately, Guido van Rossum just didn't like the concept of multiline anonymous functions, so they're not in the language.
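Concretely: a lambda body is limited to a single expression, so anything multi-step has to be hoisted into a named def even when it's only used once (the names here are made up for illustration):

```python
# A Python lambda can only contain a single expression...
clamp = lambda x: max(0, min(10, x))

# ...so a multi-step anonymous function has to become a named def,
# even when it's used exactly once:
def on_click(event):
    value = event["x"] * 2
    return max(0, min(10, value))

print(clamp(42), on_click({"x": 3}))  # 10 6
```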
 

fitten

Ars Legatus Legionis
52,251
Subscriptor++
Indentation style ( https://en.wikipedia.org/wiki/Indentation_style ) is a 'whatever' to me. I use whatever style the codebase (or even file... I've seen cases where each file conforms to a different style) is in. That's one reason why Python bothers me... it's rigid. I haven't found it reducing any errors, because if you don't indent far enough, it'll still look correct and still run most of the time. Personally, for C-like languages I prefer Allman.

As far as semicolons, I don't even know if it's taught anymore, but in C and C++, for example, the ; is a statement terminator. People think "every line has to have a ; on it, so it's a waste," but I think that's because they aren't told about statement separators vs. statement terminators. Not that those are earthshaking things; they are just different. Similarly, from what I've seen, in C people talk about the brackets (blocks) being part of the control statement and that's it... everything between { and } is part of the flow of control... but code blocks mean a bit more than that*.

As far as "one language", it'd have to cover everything from Average Joe to bit twiddling to being able to manipulate hardware, without any of the complexities (syntax, etc.) required for the complex cases impacting Average Joe and the tradeoffs for making the easy stuff easy for Average Joe making the complex stuff hard to get at for those who need it. Since we seem to agree here that it hasn't been done already, it must be a non-trivial task ;)


*A guy in our group didn't know that you can write a loop in C like this, once, or why you can...
Code:
    for (conditions)
        stmt,
        stmt,
        stmt,
        stmt;
not recommended for production code, of course ;)

Similarly, I've seen various people not know (or why) you can just put a code block in the middle of your other code without a control statement in front of it. They thought the { } were a part of the control statement syntax.
 
Last edited:

ShuggyCoUk

Ars Tribunus Angusticlavius
9,975
Subscriptor++
There never will be one perfect language. It's a fool's errand.

You don't have one perfect tool for anything else either - even within the same field, the tool you use often varies depending on the specific scale of the task you want to achieve.

Separating the language from the aspects of the platform/runtime it exists on is also a bad idea, as the latter can and should drive the former. (Rust would have no need for ownership semantics if it operated under GC rules, for example.)
 

koala

Ars Tribunus Angusticlavius
7,579
I said "not that important". I do prefer some languages to others. But Python/Java are very different languages, yet I find them plenty good for general purpose programming. And often, my choice between them will be more about the ecosystem, who I'll be working with, etc.

Of course, qualities of the language do have an influence on how the ecosystem evolves, how many people are knowledgeable with the language, etc. But marketing and other forces also count (would Go be as popular as it is if it wasn't Google's? would JS be as popular if WASM had existed since the beginnings?)
 

ShuggyCoUk

Ars Tribunus Angusticlavius
9,975
Subscriptor++
You're conflating importance with popularity/usage. The latter impacts the former yes, but look at the evolution[1] of languages.

Java was (and is) insanely widely used. Checked exceptions? Almost entirely defunct elsewhere.

Closures were originally designed to allow evaluation of the lambda calculus - I doubt many programmers now are even aware of SECD's existence - and I think the first language to actually include the capability, PAL, was designed entirely as a teaching aid.

What Python and Java have in common is relying on GC, tossing huge swathes of complexity from the programmer to the runtime. But it entirely precludes those languages (as is) from ever being realistic[2] for implementing an OS.

I think "what can't you do with a language" is often the most instructive way to think about them, and notably when languages try to "let you do that thing we don't let you do but with some additional syntax" they end up getting very hairy and most people don't use them that way.

COBOL was likely a respectable fraction of the entire code on the planet. IIRC you couldn't pass parameters to procedures until COBOL-74. You can have languages (overly) designed by academics but fear more the language without any of them!

1. In the loose sense of that word.
2. The intense effort and near-total failure of MS's Midori is instructive for this: when you can't allocate in an interrupt handler, it really does cause you problems. Apart from what the very people involved learned, they didn't gain much from it.
 

Lt_Storm

Ars Praefectus
16,294
Subscriptor++

ShuggyCoUk

Ars Tribunus Angusticlavius
9,975
Subscriptor++
Given LISP predates that, I strongly suspect that closures are also older than SECDs...
The history is complex, but based on the timeline here, it can't properly have been there as we know it until lexical scoping came in, and that was "experimental stuff, then Scheme". The early Lisps differed in their semantics based on whether they were compiled or interpreted :flail:
 

iljitsch

Ars Tribunus Angusticlavius
8,474
Subscriptor++
Indentation styles ( https://en.wikipedia.org/wiki/Indentation_style ) is a 'whatever' to me. I use whatever style the codebase (or even file... I've seen cases where each file conforms to a different style) is in. That's one reason why Python bothers me... it's rigid.

Too much choice is not a good thing. Then again, not enough choice (hi Apple!) is also problematic. Obviously if you are a professional software engineer and have worked on many projects, you must have learned how to deal with stuff like different indentation styles.

But that doesn’t mean there is not a problem here. Like I said, the opening { on the same line and dangling closing } just doesn’t work for me. And as the biggest programming job I’ve ever worked on was together with just one other programmer, I’ve always been able to do what makes sense to me. (I studied computer science in college but then went into networking, with some small coding projects and coding on the side on my own stuff over the years.)

As far as "one language", it'd have to cover everything from Average Joe to bit twiddling to being able to manipulate hardware, without any of the complexities (syntax, etc.) required for the complex cases impacting Average Joe and the tradeoffs for making the easy stuff easy for Average Joe making the complex stuff hard to get at for those who need it. Since we seem to agree here that it hasn't been done already, it must be a non-trivial task ;)
I think the discussion here has been "what is the perfect programming language", which is subtly different from my original question. Obviously if we could identify that perfect programming language, we’d know what it looked like.

I started programming in BASIC on a C64. Then Forth, 6502 assembly, Pascal, DOS batch files, ARexx and probably lots of stuff that I’ve completely forgotten. That stuff was just very limited. Then I learned C, a language powerful enough to build OSes. Sniffed around Java but didn’t like the smell. Then sh/bash, PHP, JavaScript and some Python. Those are all not inherently constrained. But they all have things that to me are severe shortcomings. Or worse, idiosyncrasies where they do something different apparently just for the sake of being different.

I would love to have some base syntax that is easy for everyone to understand, and then add various flavors of more complex stuff on top of that. Then at least we could all read the obvious control flow, and it’s just the narrow-purpose stuff that’s new - rather than have new languages pop up all the time that add cool new things but also rearrange the boilerplate, making everything more difficult than it needs to be.
 

Ardax

Ars Legatus Legionis
19,076
Subscriptor
Or worse, idiosyncrasies where they do something different apparently just for the sake of being different.

That's just plain old PHP brain damage. Most of the rest of the languages you mention had some kind of principles that guided their grammar and syntax along with some limitations that arose because of their expected use cases, operating environment, available computing power at the time, etc.

I'm not sure you'd want to run the JVM on a C64, for example.
 

ShuggyCoUk

Ars Tribunus Angusticlavius
9,975
Subscriptor++
Then sh/bash, PHP, JavaScript and some Python. Those are all not inherently constrained.
Yes they are.
All rely on GC (technically I think you could do sh/bash without it, but I think it would be problematic for non-trivial scripts).
GC is hugely constraining, but also very pleasant to work with if the abstraction of “I have infinite memory address space” appeals.
None of them would play well/at all in kernel level code.

These are very hard constraints on a swathe of very important (niche, but not something you can just assume will fade away) use cases
But they all have things that to me are severe shortcomings
Yes, that too wouldn’t want to disagree there.
Then again I think all of them support (to a greater or lesser degree) a REPL. Not sure that’s a meaningful thing in C
 

snotnose

Ars Tribunus Militum
2,747
Subscriptor
And as the biggest programming job I’ve ever worked on was together with just one other programmer,
This explains why you think trading in braces for whitespace isn't an issue. You've never had to deal with weird problems that trace back to Fred's weird indenting habits.
 

ShuggyCoUk

Ars Tribunus Angusticlavius
9,975
Subscriptor++
Back when I got some really miserable merges git gave up on, I found that throwing three-way diffs in C# (likely any C-style language would work) at Beyond Compare[1] would often have it magically JustWork (tm) (literally hit save on the suggested file and done).

In F# it was miserable. Now, some of this is that F# is vastly less popular, but fundamentally there's so much additional context in C-style languages (terseness isn't always good - that's why we use erasure coding!) that F# and the other "the indentation is all you have unless you completely lex the whole file" languages get screwed.

You get the same effect in intellisense (remember, almost all intellisense is done on broken code). Again, Roslyn has had incredible amounts of effort put into it compared to the FCS, but still: 'broken' code stops being obviously broken. So you can either use reasonable heuristics, with the various braces and semicolons giving great hints as you go, or say "well, HM type inference can make some horrid partially applied mess of all this, so we'll throw it at the intellisense API and get nothing useful". "AI"-style intellisense is actually likely to function way better there, but then loses out due to a lack of decent training data (an area where sheer public popularity on GitHub is a massive win now).

I've had quite a few discussions with folks who argue "terseness is more important". When I present them with the slippery slope to things like K, or to embedding many more regexes into the code base, all but one acknowledged it's a continuum and that they think F# is on the right side of the line. In many cases I found that they still opted to do everything in F#, not just the things where it added significant benefits that would offset the long-term cost of being written in a more esoteric language - that didn't age well.

1. I 💝BC sooo much
 
  • Like
Reactions: MilleniX

Hagen Stein

Ars Praetorian
567
Subscriptor
C, Java, Javascript and no doubt many others suffer from the { } ; nonsense.

Sooo much this, especially the terminating semicolon.

I do understand that in the dark ages of computing, compilers may have needed a helping hand in splitting lines/commands, and that with the monitor resolutions available at the time, lines of code spanned more than one physical line.

But we've successfully made it into the 21st century, ffs! Monitors literally span whole walls, so there's very seldom a need to split one statement across multiple lines (I also doubt the readability/maintainability of such code). And the computing power in each machine is more than enough that a compiler should be able to determine a code line's ending on its own. Here's a hint: look for line breaks!

I really would switch the whole concept of it. Instead of treating everything as a single line of code unless there's a terminating semicolon, assume a line break as the end of a coding line unless there's an explicit line continuation character present that tells the compiler that this line of code continues on the next line.

BASIC has done this for decades: the physical line ending is also the logical (code) line ending, unless the last two characters of a line are " _" (space and underscore), signaling the continuation of the logical line on the next physical line.
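Python, for what it's worth, already works this way: the physical line break terminates the statement, with an explicit backslash (or, more robustly, open brackets) to continue:

```python
# Python treats the line break as the end of the statement; to
# continue, use an explicit backslash...
total = 1 + 2 + \
        3 + 4

# ...or, more robustly, stay inside open brackets, where continuation
# is implicit and trailing whitespace can't bite you:
values = [1, 2,
          3, 4]

print(total, sum(values))  # 10 10
```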
 

iljitsch

Ars Tribunus Angusticlavius
8,474
Subscriptor++
I was under the impression that \ melds two lines together, but since end of line is rarely significant in C, I’m probably wrong about this and it’s just the preprocessor that uses/needs this.

I didn’t know about that BASIC thing. It’s probably specific to a limited number of dialects. I dipped a first toe into BASIC on the ZX Spectrum (~ known as the Timex Sinclair 2068 in North America). Yes, I was actually that kid who brought his cassette player to the corner of the store where all the computers were on display in order to save my BASIC programs, so I could return the next day to tinker some more with them. I never understood why they let us do that. Later I realized that computers showing (somewhat) cool stuff operated by transfixed teenagers sell better than ones just sitting there with a blinking cursor after boot.

Oh right, BASIC. On the C64 I got soon after, you can cram in as many statements as you want on a single line, as long as that line is no more than 80 characters (i.e. two lines on the display) and you separate those statements with :.
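(Python lands on the same rule from the other direction: the semicolon exists only as an optional separator for statements crammed onto one physical line, never as a required terminator:)

```python
# In Python the semicolon is a statement *separator*, used only when
# several statements share one physical line -- never a terminator.
a = 1; b = 2; c = a + b
print(c)  # 3
```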
 

curih

Ars Praetorian
453
Subscriptor
Instead of treating everything as a single line of code unless there's a terminating semicolon, assume a line break as the end of a coding line unless there's an explicit line continuation character present that tells the compiler that this line of code continues on the next line.
Just make sure you do it well. I'm not the only person who's lost hours of their life debugging something like TCL only to find the problem was an invisible space between the \ and the invisible newline.