Ada, Its Design, and the Language That Built the Languages

Posted by mpweiher 5 hours ago


Comments

Comment by YesThatTom2 3 hours ago

Ada was also ignored because the typical compiler cost tens of thousands of dollars. No open source or free compiler existed during the decades where popular languages could be had for free.

I think that is the biggest factor of all.

Comment by twoodfin 2 hours ago

Ada’s failure to escape its niche is overdetermined.

Given the sophistication of the language and the compiler technology of the day, there was no way Ada was going to run well on 1980s microcomputers. Intel built the iAPX 432 “mainframe on a chip” with a bunch of Ada concepts baked into the hardware for performance, and it was still as slow as a dog.

And as we now know, microcomputers later ate the world, carrying along their C and assembly legacy for the better part of two decades, until they got fast enough and compiler technology got good enough that richer languages were plausible.

Comment by shrubble 1 hour ago

The first validated compiler for Ada that ran on the IBM PC was released in 1983.

The third validated compiler ran on the Western Digital “Pascal MicroEngine” running the UCSD p-system with 64K memory. The MicroEngine executed the byte code from the p-system natively, which was an interesting approach.

I think more research is warranted by you on this subject.

Comment by twoodfin 1 hour ago

I’m not saying it wasn’t possible, I’m saying the larger ecosystem was never going to embrace a language that was as heavyweight as Ada. In 1983, most PC system software was written in assembly!

Comment by sehugg 36 minutes ago

I sometimes wonder what "Turbo Ada" would have looked like, but I think it would probably have looked like later versions of Borland Pascal. Things like generics and exceptions would have taken some of the "turbo" out of the compiler and runtime -- the code generator didn't even get a non-peephole optimizer until 32-bit Delphi; it would have been too slow.

It might be nice to have Ada's tasks driven by DOS interrupts, though. I think GNAT did this.

Comment by michaelcampbell 2 hours ago

I used it a bit at Uni and remember enjoying it, but can you say what was slow about it: compilation, runtime, or all of it?

Comment by twoodfin 2 hours ago

I’ve never directly played with Ada but my understanding is that it was very much both.

Ada includes a number of critical abstractions that require either dynamic runtime code (slow runtime) or the proverbial sufficiently smart compiler (slow compile-time).

These were for good reasons, like safety and the need to define concurrent systems within the language. But they were too heavyweight for the commodity hardware of the era.

Nowadays, languages like Go, C++, Java, Rust, … have no trouble with similar abstractions because optimizers have gotten really good (particularly with inlining) and the hardware has cycles to spare.

Comment by sidewndr46 2 hours ago

I had to take some course that was something like "Programming Language Theory". As a result I had to look at the specifications for dozens of different programming languages. I remember looking at the features of some languages and scratching my head trying to figure out how some of this would ever be practically implemented by a compiler. Later on I found out lots of stuff is just implemented by a runtime anyway, which led me to realize that those fancy language features are often better as a library.

Comment by acomjean 2 hours ago

A huge factor. I used Ada for years, and the fact that everyone I worked with did hobby projects in other languages didn't help it. And most of us liked Ada.

It had other warts: the string handling wasn't great, which was a huge problem. It was slow too, in a time when that mattered more (we had C and Ada in our code base). I remember the concurrency not using the OS's threads, so the one place we used it was a pain. HP-UX had amazing quasi-real-time extensions, so we just ran a bunch of processes.

Comment by shrubble 2 hours ago

The GNU Ada compiler was first released in 1995: https://en.wikipedia.org/wiki/GNAT

Comment by jghn 1 hour ago

GNAT has existed since at least the mid-90s, and in that time period plenty of companies used non-OSS compilers.

In that era, the largest blocker for Ada was that it was viewed as having a lot of overhead for things that weren't generally seen as useful (safety guarantees). The reputation was that it only mattered if you were working on military stuff, etc.

Comment by adrian_b 53 minutes ago

True, but at that time it was already too late. C/C++ had won.

Moreover, for a very long time GNAT was quite difficult to build, configure, and make coexist with other gcc-based compilers, far more difficult than building and configuring the toolchain for any other programming language. (I.e., you could fail to get a working environment, without any easy way to discover what went wrong, which never happened with any other programming language supported by gcc.)

I have no idea what the reason for this was, but whatever it was, it had nothing to do with any intrinsic property of the language.

I do not remember when it finally became easy to use Ada with gcc, but this might have happened only a decade ago, or even more recently.

Comment by dharmatech 37 minutes ago

The Ada compiler for OpenVMS was over $200,000 in the 1990s.

Comment by eager_learner 2 hours ago

This. Nothing can compete with free.

Comment by rhubarbtree 2 hours ago

Strange comment. GNAT?

Comment by donatj 3 hours ago

I like the article overall but the continually repeated 'Language X didn't have that until <YEAR>' is very grating after the first ten or so.

I also wish there were concrete code examples. Show me what you are talking about rather than just telling me how great it is. Put some side by side comparisons!

Comment by microtherion 2 hours ago

You could do the same in reverse as well. Many of the features listed in the first paragraph existed before in other languages, though probably not all of them in a single language. In fact, I believe the design process (sensibly) favored best practices of existing languages rather than completely new and unproven mechanisms.

So there was considerable borrowing from PASCAL, CLU, MODULA(-2), CSP. It's possible that the elaborate system for specifying machine representations of numbers was truly novel, but I'm not sure how much of a success that was.

Comment by adrian_b 1 hour ago

Ada has borrowed nothing from Modula.

There are features common to Ada and Modula, but those have been taken by both languages from Xerox Mesa.

The first version of Modula was designed with the explicit goal of making a simple small language that provided a part of the features of Xerox Mesa (including modules), after Wirth had spent a sabbatical year at Xerox.

Nowadays Modula and its descendants are better known than Mesa, because Wirth and others have written some good books about them and because Modula-2 was briefly widely available for some microcomputers. Many decades ago, I had a pair of UV-EPROM memories (i.e. for a 16-bit data bus) that contained a Modula-2 compiler for Motorola MC68000 CPUs, so I could use a computer with such a CPU for programming in Modula-2 in the same manner as many early PCs could be used with their built-in BASIC interpreters. However, after switching to an IBM PC/AT-compatible PC, I never used the language again.

However, Xerox Mesa was a much superior language and its importance in the history of programming languages is much greater than that of Modula and its derivatives.

Ada has taken a few features from Pascal, but while those features were first implemented in Pascal, they had been proposed much earlier by others, e.g. the enumerated types of Pascal and Ada had been first proposed by Hoare in 1965.

When CLU is mentioned, usually Alphard must also be mentioned, as those were 2 quasi-simultaneous projects at different universities that had the purpose of developing programming languages with abstract data types. Many features appeared first in one of those languages and were then introduced in the other after a short delay. Among the features of modern programming languages that come from CLU and Alphard are for-each loops and iterators.

Comment by mcdonje 2 hours ago

I imagine an Ada dev would find the pattern grating over the decades, so it reads like an expression of that experience.

Comment by coldcode 2 hours ago

The US Air Force intended to use Ada, but had to use JOVIAL instead because Ada took so long to be developed. Most people have never heard of JOVIAL, but it still exists in the USAF as a legacy.

I worked with JOVIAL as part of my first project as a programmer in 1981, even though we didn't even have a full JOVIAL compiler there yet (it existed elsewhere). I remember all the talk about the future being Ada, but it was only an incomplete specification at the time.

Comment by adrian_b 1 hour ago

JOVIAL had been in use within the US Air Force for more than a decade before the first initiative to design a single common military programming language, which resulted in Ada.

JOVIAL had been derived from IAL (December 1958), the predecessor of ALGOL 60. However JOVIAL was defined before the final version of ALGOL 60 (May 1960), so it did not incorporate a part of the changes that had occurred between IAL and ALGOL 60.

The timeline of Ada development has been marked by increasingly specific documents elaborated by anonymous employees of the Department of Defense, containing requirements that had to be satisfied by the competing programming language designs:

1975-04: the STRAWMAN requirements

1975-08: the WOODENMAN requirements

1976-01: the TINMAN requirements

1977-01: the IRONMAN requirements

1977-07: the IRONMAN requirements (revised)

1978-06: the STEELMAN requirements

1979-06: "Preliminary Ada Reference Manual" (after winning the competition)

Already the STRAWMAN requirements from 1975 contained some features taken from JOVIAL, which the US Air Force used and liked, so they wanted the replacement language to keep them.

However, starting with the IRONMAN requirements, some features originally taken as-is from JOVIAL were replaced by greatly improved original features. For example, the JOVIAL-style function parameter specifications were replaced by the requirement to specify the behavior of the parameters regardless of their implementation by the compiler: the programmer specifies behaviors like "in", "out" and "in out", and the compiler chooses freely how to pass the parameters, e.g. by value or by reference, depending on which method is more efficient.

This is a huge improvement over how parameters are specified in languages like C or C++ and in all their descendants. The most important defects of C++, which caused low performance for several decades and which are responsible for much of the current complexity of C++, stem from the inability of C++ to distinguish between "out" parameters and "in out" parameters. This misfeature is the reason for a lot of otherwise unnecessary machinery in C++: constructors as something different from normal functions, which cannot signal errors other than by exceptions; copy constructors distinct from assignment; the "move" semantics introduced in C++ 2011 to solve the performance problems that plagued C++ previously; etc.

Comment by sardon 40 minutes ago

I remember learning Ada at uni in the 90s and not loving it because of the syntax and it being slow to work with. I also remember the Ariane 5 rocket crash in the late 90s being blamed on a software bug, and the software being written in Ada. Now I understand that it was not a pure software issue, but still, all that safety did not prevent the major disaster that it was.

Comment by pyjarrett 35 minutes ago

Ariane 5 became one of the most reliable rockets ever made and was used to launch the JWST.

Comment by alyls 4 hours ago

The Twitter account is from April 2026:

https://xcancel.com/Iqiipi_Essays

There is no named public author. A truly amazing productivity for such a short time period, and, generously, the author does not take any credit.

Comment by Geezus_42 3 hours ago

No author because it's a bot.

Comment by slackfan 1 hour ago

There are a lot of autodidact researchers out there who existed before AI, and, AI-assisted, their research and output has sped up significantly. To call them a bot is, to put it mildly, to miss the forest for the trees.

Comment by IAmBroom 2 hours ago

Yes, that was the point.

Comment by shminge 3 hours ago

I really don't want this to be AI writing because I enjoyed it, but as other commenters have pointed out, the rate of publishing (according to the linked Twitter account) is very rapid. I'm worried that I can't tell.

Comment by randusername 1 hour ago

I try to reserve judgement, but 110 em dashes is... excessive.

I really enjoyed the essay, only checked afterwards when I started reading comments.

I hate that I'm starting to develop a media literacy immune system for blog posts of all things.

Comment by aeve890 2 hours ago

>the rate of publishing (according to the linked Twitter account) is very rapid.

I've written almost 50 blog posts in the last 3 years. All in draft, never published, mostly because of crippling impostor syndrome and fear of criticism. But every now and then I wake up full of confidence and think "this is it. Today I'll click publish, I don't give a fuck. All in". Never happens. Maybe this author was in the same boat until a month ago. I know there's a high chance it's just a bot, but I can understand if it's not, and how devastating it has to be to overcome the fear of showing your thoughts to the world and then be labeled a bot. If it's not already obvious, English is not my first language and I've used LLMs to check my grammar and improve the style. Maybe all my posts smell like ChatGPT now, and this just adds to the fear of being dismissed as slop.

Comment by twoodfin 2 hours ago

LLMs do not currently improve the style of typical HN writing. Maybe someday they will; this article is less painfully bad than those of a few months ago.

The main problem with this article is that it appears to have been basically written out of whole cloth by the LLM; there's no novel insight here about Ada beyond what you could fit in a short prompt + the Wikipedia article.

Comment by feelamee 1 hour ago

Does it really matter? If AI can produce an essay of such quality, take my respect and steal my time, please.

Comment by randusername 58 minutes ago

I think so. Who writes something and why are important context for what we do with the information. It's an issue with the lack of disclosure, not AI in general.

Most longform readers will assume an author has deep expertise and spent a lot of time organizing their thoughts, which lends their ideas some legitimacy and trust. For a small blog, an 8,000 word essay is a passion project.

But if AI is detected in the phrasing and not disclosed, it raises a lot of questions. Did AI write the whole thing, or just do light edits? Are the facts AI-generated too, and not from personal experience? What motivated someone to produce this content if they were going to automate parts of its creation; why would they value the output more than the process?

Comment by askUq 4 hours ago

From the main page of this website:

"These are not positions. They are proposals — structures through which a subject might be examined rather than verdicts about it."

The entire site is AI written.

Comment by graemep 3 hours ago

How is that evidence that the site was AI written?

Comment by twoodfin 2 hours ago

The evidence is that the article’s writing is terrible. It repeats the same rhetorical devices over and over, dressing up a series of facts in false profundity, because there’s no actual authorial insight here. It’s just “write a well-researched article that demonstrates how ahead of its time the Ada language was” + matmul.

Comment by boxed 2 hours ago

Humans are not gods of writing that will please all audiences and make no mistakes.

Comment by twoodfin 2 hours ago

Neither of those standards are what I’m talking about.

Obviously this article was highly pleasing to the HN audience, as it's currently sitting at #1. It's still garbage, because it doesn't have any interesting ideas behind it. Certainly not commensurate with its length.

Comment by quietbritishjim 2 hours ago

I think the quoted word salad is plenty of evidence.

Comment by zozbot234 2 hours ago

The combination of em dashes and inane non sequiturs in "These are not X. They're Y" style is pretty damning.

Comment by tomekw 2 hours ago

Ada is underrated. I am spending lots of my time writing tons of open source software in Ada, mostly for myself, though.

Comment by tromp 3 hours ago

> Every language that has added sum types in the past twenty years has added, with its own syntax, what Ada's designers put in the original standard.

While true, that doesn't mean that other languages' sum types originated in Ada. As [1] states,

> NPL and Hope are notable for being the first languages with call-by-pattern evaluation and algebraic data types

and a modern language like Haskell has origins in Hope (from 1980) through Miranda.

[1] https://en.wikipedia.org/wiki/Hope_(programming_language)

Comment by adrian_b 3 hours ago

The origin of all sum types is in "Definition of new data types in ALGOL x", published by John McCarthy in October 1964, who introduced the keyword UNION for such types (he proposed "union" for sum types, "cartesian" for product types, and also operator overloading for custom types).

John McCarthy, the creator of LISP, also made many major contributions to ALGOL 60 and to its successors (e.g. he introduced recursive functions in ALGOL 60, which was a major difference between ALGOL 60 and most existing languages at that time, requiring the use of a stack for the local variables, while most previous languages used only statically-allocated variables).

The "union" of McCarthy and of the languages derived from his proposal is not the "union" of the C language, which has used the McCarthy keyword, but with the behavior of FORTRAN "EQUIVALENCE".

The concept of "union" as proposed by McCarthy was first implemented in the language ALGOL 68, then, as you mention, some functional languages, like Hope and Miranda, have used it extensively, with different syntactic variations.

Comment by tialaramex 2 hours ago

Definitely, if you don't have the C-style "union" user-defined type, you should use this keyword for your sum types. Many languages don't have this feature - which is an extremely sharp blade intended only for experts - and that's fine. You don't need an Abrams tank to take the kids to school, beginners should not learn to fly in the F-35A, and the language for writing your CRUD app does not need C-style unions.

If Rust didn't have (C-style) unions then its enum should be named union instead. But it does, so they needed a different name. As we work our way through the rough edges of Rust maybe this will stick up more and annoy me, but given Rust 1.95 just finally stabilized core::range::RangeInclusive, the fix for the wonky wheel that is core::ops::RangeInclusive we're not going to get there any time soon.

Comment by mkovach 2 hours ago

I've written a few small projects in Ada, and it's a better language than it gets credit for.

Yes, it's verbose. I like verbosity; it forces clarity. Once you adjust, the code becomes easier to read, not harder. You spend less time guessing intent and more time verifying it. Or verify it, ignore what you verified, then go back and remind yourself you're an idiot when you realize the code you ignored was right. That might just be me.

In small, purpose-built applications, it's been pleasant to code with. The type system is strict but doesn't yell at you a lot. The language encourages you to be explicit about what the program is actually doing, especially when you're working close to the hardware, which is a nice feature.

It has quirks, like anything else. But most of them feel like the cost of writing better, safer code.

Ada doesn't try to be clever. It tries to be clear, even if it is as clear as mud.

Comment by theodorethomas 47 minutes ago

Reading the Steelman document is like reading a shopping list of everything that's gone into modern Fortran.

Comment by adrian_b 3 hours ago

Ada is a language that had a lot of useful features much earlier than any of the languages that are popular today, and some of those features are still missing from the languages easily available today.

In the beginning, Ada was criticized mainly for 2 reasons: it was claimed to be too complex, and it was criticized for being too verbose.

Today, the criticism about complexity seems naive, because many later languages have become much more complex than Ada, in many cases because they started as simpler languages to which extra features were added later; since the need for such features had not been anticipated during the initial language design, adding them later was difficult, increasing the complexity of the updated language.

The criticism about verbosity is correct, but it could easily have been addressed by preserving the abstract Ada syntax and just replacing many tokens with less verbose symbols. This can easily be done with a source preprocessor, but it is avoided in most places, because then the source programs have a non-standard appearance.

It would have been good if the Ada standard had been updated to specify a standardized abbreviated syntax besides the classic syntax. This would not have been unusual, because several old languages have specified abbreviated and non-abbreviated syntactic alternatives, including languages like IBM PL/I or ALGOL 68. Even the language C had a more verbose syntactic alternative (with trigraphs), which was almost never used, but nonetheless all C compilers had to support both the standard syntax and its trigraph alternative.

However, the real defect of Ada has been neither complexity nor verbosity, but expensive compilers and software tools, which have ensured its replacement by the free C/C++.

The so-called complexity of Ada has always been mitigated by the fact that, besides its reference specification document, Ada has always had a design rationale document accompanying the language specification. The rationale explained the reasons for the choices made when designing the language.

Such a rationale document would have been extremely useful for many other programming languages, which frequently include some obscure features whose purpose is not obvious, or which look like mistakes, even if sometimes there are good reasons for their existence.

When Ada was introduced, it was marketed as a language similar to Pascal. The reason is that at that time Pascal had become the language most frequently used for teaching programming in universities.

Fortunately the resemblances between Ada and Pascal are only superficial. In reality the Ada syntax and semantics are much more similar to earlier languages like ALGOL 68 and Xerox Mesa, which were languages far superior to Pascal.

The parent article mentions that Ada includes in the language specification the handling of concurrent tasks, instead of delegating such things to a system library (task = term used by IBM since 1964 for what now is normally called "thread", a term first used in 1966 in some Multics documents and popularized much later by the Mach operating system).

However, I do not believe that this is a valuable feature of Ada. You can indeed build any concurrent applications around the Ada mechanism of task "rendez-vous", but I think that this concept is a little too high-level.

It incorporates 2 lower-level actions, and for the highest efficiency it may sometimes be necessary to have access to those lower-level actions separately. This means that sometimes using a system library for implementing the communication between concurrent threads may provide higher performance than the built-in Ada concurrency primitives.

Comment by init1 3 hours ago

Verbosity is a feature, not a bug. Programming is a human activity and thus should use human language and avoid encoded forms that require decoding to understand. The use of abbreviations should be avoided, as it obfuscates the meaning and purpose of code for a reader.

Comment by adrian_b 2 hours ago

The programming community is strongly divided between those who believe that verbosity is a feature and not a bug and those who believe that verbosity is a bug and not a feature.

A reconciliation between these 2 camps appears impossible. Therefore I think that the ideal programming language should admit 2 equivalent representations, to satisfy both kinds of people.

The pro-verbose camp argues that they cannot remember many different symbols, so they prefer long texts using keywords resembling a natural language.

The anti-verbose camp, to which I belong, argues that they can remember mathematical symbols and other such symbols, and that for them it is much more important to see as much of the program on the screen as possible, to avoid the need to move back and forth through the source text.

Both camps claim that what they support is the way to make source programs easiest to read, and this must indeed be true for themselves.

So it seems that it is impossible to choose rules that can ensure the best readability for all program readers or maintainers.

My opinion is that source programs must not be stored and edited as text, but as abstract syntax trees. The program source editors and viewers should implement multiple kinds of views for the same source program, according to the taste of the user.

Comment by init1 2 hours ago

It is not that I cannot remember the symbols - I don't want to; I want the language to plainly explain itself to me. Furthermore, every language has its own set of unique symbols, so as a new reader of a language you first have to familiarize yourself with them. I remember my first few times reading Rust... It still makes my head spin. I had to keep looking up what everything did. If a plain keyword doesn't directly tell you what it's doing, at least it hints at it.

To be clear Ada specifically talks about all this in the Ada reference manual in the Introduction. It was specifically designed for readers as opposed to writers for very good reasons and it explains why. It's exactly one of the features other languages will eventually learn they need and will independently "discover" some number of years in the future.

Comment by adrian_b 8 minutes ago

I agree that the use of symbols becomes a problem when you use many programming languages and each of them uses different symbols.

This has never been solved, but it could have been, if there had been a standard for the use of symbols in programming languages and all languages had followed it.

Nevertheless, for some symbols this problem does not arise, e.g. when traditional mathematical symbols are used, which are now available in Unicode.

Many such symbols have been used for centuries and I hate their replacements that had to be chosen due to the constraints of the ASCII character set.

Some of the APL symbols are straightforward extensions of the traditional mathematical notation, so their use also makes sense.

Besides the use of mathematical symbols in expressions, classic or Iverson, the part where I most intensely want symbols, not keywords, is for the various kind of statement brackets.

I consider the use of a single kind of statement brackets as being very wrong for program readability. This was introduced in ALGOL 58 (December 1958) as the pair "begin" and "end". Other languages have followed it. CPL replaced the statement brackets with paragraph symbols (August 1963), and then the language B (the predecessor of C), having transitioned to ASCII, replaced the CPL symbols with curly braces, sometime around 1970.

A better syntax was introduced by ALGOL 68, which is frequently referred to as "fully bracketed syntax".

In such a syntax different kinds of brackets are used for distinct kinds of program structures, e.g. for blocks, for loops and for conditional structures. This kind of syntax can avoid any ambiguities and it also leads to a total number of separators, parentheses, brackets and braces that is lower than in C and similar languages, despite being "fully bracketed". (For instance in C you must write "while (condition) {statements;}" with 6 syntactic tokens, while in a fully bracketed language you would write "while condition do statements done", with only 3 syntactic tokens)

If you use a fully bracketed syntax, the number of syntactic tokens is actually the smallest that ensures a non-ambiguous grammar, but if the tokens are keywords the language can still appear as too verbose.

The verbosity can be reduced a lot if you use different kinds of brackets provided by Unicode, instead of using bracket pairs like "if"/"end if", "loop"/"end loop" or the like.

For instance, one can use curly braces for blocks, angle brackets for conditional expressions or statements, double angle brackets for switch/case, bag delimiters for loops, and so on. One could choose to use different kinds of brackets for inner blocks and for function bodies, and also different kinds of brackets for type definitions.

In my opinion, the use of many different kinds of brackets is the main feature that can reduce program verbosity in comparison with something like Ada.

Moreover, the use of many kinds of brackets is pretty much self describing, like also in HTML or XML. When you see the opening bracket, you can usually recognize what kind of pattern starts, e.g. that it is a function body, a loop, a block, a conditional structure etc., and you also know how the corresponding closing bracket will look. Thus, when you see a closing bracket of the correct shape you can know what it ends, even when you had not known previously the assignment between different kinds of brackets and different kinds of program structures.

In languages like C, it is frequently annoying when you see many closing braces and you do not know what they terminate. Your editor will find the matching brace, but that wastes precious time. You can comment the closing braces, but that becomes much more verbose than even Ada.

So for me the better solution is to use graphically-distinct brackets. Unicode provides many suitable bracket pairs. There are programming fonts, like JetBrains Mono, which provide many Unicode mathematical symbols and bracket pairs.

When I program for myself, I use such symbols and I use a text preprocessor before passing the program to a compiler.

Comment by zozbot234 1 hour ago

Rust has a complex semantics, not a complicated syntax. The syntax was explicitly chosen to be quite C/C++ like while streamlining some aspects of it (e.g. the terrible type-ascription syntax, replaced with `let name: type`).

Comment by init1 1 hour ago

In my opinion, it has both complicated and terrible syntax, which it inherited and extended from C++, and complicated semantics.

Comment by GhosT078 26 minutes ago

I agree. I've never understood or accepted the claim that Ada is verbose. It's simply clear and expressive. If there were some alternative concise syntax for "Ada" then I would not want to use it (because it would not be Ada).

This was proposed, as a joke, some years ago: https://www.adacore.com/blog/a-modern-syntax-for-ada

This is an old but good article on the topic: https://www.embedded.com/expressive-vs-permissive-languages-... Note that SPARK has changed significantly since this was written.

Comment by zozbot234 2 hours ago

Verbosity is a feature for small self-contained programs, and a bug for everything else. As long as you're using recognizable mnemonics and not just ASCII line noise or weird unreadable runes (as with APL) terseness is no obstacle at all for a good programmer.

Comment by Raphael_Amiard 3 hours ago

> Today, the criticism about complexity seems naive, because many later languages have become much more complex than Ada

I don’t think you really understand what you’re saying here. I have worked on an Ada compiler for the better part of a decade. It’s one of the most complex languages there is, up there with C++ and C#, and probably Rust.

Comment by leoc 2 hours ago

Mind you, that suggests that the sentence is at least half-true even if "much more complex" is a big overstatement, since Rust, "modern" C++ and the later evolutions of C# are all relatively recent. (What would have compared to Ada in complexity back in the day? Common Lisp, Algol 68?)

As a matter of general interest, what features or elements of Ada make it particularly hard to compile, or compile well? (And are there parts which look like they might be difficult to manage but aren't?)

Comment by feelamee 1 hour ago

What do you mean by Ada's complexity? C++, for example, is really complex because it has a lot of features that interoperate badly with one another. Is that true for the Ada language and its compilers? Or do you mean the overall complexity of the ideas included in Ada, the way the proof of the Poincaré conjecture is complex to an unprepared person?

Comment by microtherion 2 hours ago

I imagine Swift is also a very difficult language to compile.

Comment by mcc1ane 1 hour ago

Every time Ada is mentioned here, I start a quest: how to try it for free on Windows.

And every time I fail.

Comment by GhosT078 1 hour ago

I didn't think these were hard to find:

https://ada-lang.io/

https://alire.ada.dev/

Comment by adrian_b 1 hour ago

In the past you could easily use Ada or anything else from Linux under Cygwin.

Nowadays, you should be able to use anything from Linux under WSL.

In the past, using Ada was more painful, because you had to use some old version of GCC, which could clash with the modern GCC used for C/C++/Fortran etc.

However, during the last few years these problems have disappeared. If you build any current GCC version, you just choose the option of including Ada among the available languages and everything works smoothly.
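For example, a from-source build might look like this (paths and the install prefix are placeholders; `--enable-languages` is the standard GCC configure flag, and note that bootstrapping GNAT requires an existing Ada compiler on the host):

```shell
# Configure GCC with the Ada front end (GNAT) alongside C and C++.
../gcc-source/configure --prefix="$HOME/opt/gcc" --enable-languages=c,c++,ada
make -j"$(nproc)"
make install
```

After installation, `gnatmake` and the rest of the GNAT toolchain live next to the freshly built `gcc`, so there is no clash with a system compiler.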

Comment by dfox 1 hour ago

Gnat Studio just works on Windows.

Comment by GhosT078 49 minutes ago

I have been using GNAT Studio (previously GNAT Programming Studio or GPS) on Linux for the last 15 years.

Comment by timschmidt 5 hours ago

It'd be a neat trick to have a single unified language which could bridge the gap between software and hardware description languages.

Comment by adrian_b 2 hours ago

Hardware description languages, even when they have a single language specification, are divided into two distinct subsets: one used for synthesis, i.e. for hardware design, and one used for simulation, i.e. for hardware verification.

The subset required for hardware synthesis/design cannot be unified completely with a programming language, because it needs different semantics, though the syntax can be made somewhat similar, as with VHDL (derived from Ada) and Verilog (derived from C). However, the subset used for simulation/verification, outside the proper hardware blocks, can be pretty much identical to a programming language.

So in principle one could have a pair of harmonized languages, one a more or less typical programming language used for verification and a dedicated hardware description language used only for synthesis.

The current state is not too far from this, because many simulators have interfaces between HDLs and some programming languages, so you can do much verification work in something like C++, instead of SystemVerilog or VHDL. For instance, using C++ for all verification tasks is possible when using Verilator to simulate the hardware blocks.

I am not aware of any simulator that allows synthesis in VHDL coupled with test benches written in Ada, which would be a better fit for VHDL than C++ is, but it could be done.

Comment by lioeters 4 hours ago

It's an intriguing idea. Having experience with software but almost none (only hobbyist) in hardware, I imagine it'd require a strong type system and mathematical foundation. Perhaps something like Agda, a language that is a proof assistant and theorem prover, with which one can write executable programs. https://en.wikipedia.org/wiki/Agda_(programming_language)

Comment by timschmidt 3 hours ago

I wonder if an escape hatch like Rust's unsafe{} would be enough... a hardware{}. The real complexity likely lies in how to integrate the synthesis tools with the compiler and debugger. The timing model. A memory model like Rust's would certainly aid in assuring predictable behavior, but I'm not certain it would be sufficient.


Comment by ramon156 4 hours ago

off-topic, this article has almost the same theme as dawnfox/dayfox which I love. It fits nicely with my terminal on the left. Cool stuff

Comment by turtleyacht 5 hours ago

The next language ought to ensure memory-safe conditions across the network.

Comment by yvdriess 4 hours ago

AmbientTalk did this. I used it for a demo where I dragged an mp3 player's UI button to another machine, where pressing play would play the file back on the originator's speakers. Proper actor programming in the vein of E and Erlang.

https://soft.vub.ac.be/amop/

Comment by csrse 4 hours ago

Already exists since way back: https://github.com/mozart/mozart2 (for example)

Comment by riffraff 3 hours ago

I only realized now CTM is more than 20 years old. In my mind it's still a cool new book.

CTM: https://en.wikipedia.org/wiki/Concepts,_Techniques,_and_Mode...

Comment by gostsamo 5 hours ago

The article states that the language can have extensions for different domains, so that is also an option.

Comment by derleyici 5 hours ago

And the answer is… Rust.

Comment by anthk 5 hours ago

Or Algol 68, which is doing a comeback.

Comment by pjmlp 4 hours ago

Or even ESPOL and its evolution, NEWP, which never went away; they are only available to Unisys customers who treat security as a top deployment priority.

Comment by EvanAnderson 4 hours ago

I wish more people knew about the Burroughs Large Systems[0] machines. I haven't written any code for them, but I got turned on to them by a financial Customer who ran a ClearPath Series A MCP system (and later one of the NT-based ClearPath machines with the SCAMP processor on a card) back in the late 90s, and later by a fellow contractor who did ALGOL programming for Unisys in the mid-70s and early 80s. It seems like an architecture with an uncompromising attitude toward security, and an utterly parallel universe to what the rest of the industry does (except, perhaps, for the IBM AS/400, at least in the sense of being uncompromising about design ideals).

[0] https://en.wikipedia.org/wiki/Burroughs_Large_Systems

Comment by pjmlp 4 hours ago

Yes, IBM i and z/OS, are the other survivors.

Comment by bananaflag 4 hours ago

I am wondering what the Ada equivalent of affine types is. What feature solves the problem that affine types solve in Rust?
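For context, here is a minimal Rust sketch (the type and function names are invented for illustration) of what affine types enforce: a value moved into a function cannot be used again, which is how Rust statically rules out double-close and use-after-close bugs:

```rust
// Affine use: each value may be consumed at most once.
struct FileHandle {
    fd: i32,
}

// Takes ownership of `h`; the caller's binding is invalidated by the move.
fn close(h: FileHandle) -> i32 {
    h.fd // `h` is dropped here; the fd is returned only for demonstration
}

fn main() {
    let h = FileHandle { fd: 3 };
    let fd = close(h);
    assert_eq!(fd, 3);
    // close(h); // rejected at compile time: use of moved value `h`
}
```

The commented-out second call is exactly the kind of reuse the borrow checker rejects; the question is what, if anything, plays this role in Ada.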

Comment by ajxs 3 hours ago

The SPARK subset of Ada^1 has a similar kind of move semantics for pointer types^2.

1: SPARK is a formally verifiable subset of Ada: https://en.wikipedia.org/wiki/SPARK_(programming_language)

2: https://arxiv.org/pdf/1805.05576

Comment by fweimer 4 hours ago

Limited controlled types probably come closest.

https://learn.adacore.com/courses/advanced-ada/parts/resourc...

Comment by Raphael_Amiard 3 hours ago

There is none as far as affine types go; there is a parallel to be made with limited types, but they don't serve the same purpose.

The way Ada generally solves the same problem is by allowing much more in terms of what you can give a stack lifetime to, return from a function, and pass as parameters to functions.

It also has the regular "smart pointer" mechanisms that C++ and Rust have, with relatively crappy ergonomics.

Comment by mhd 3 hours ago

No mention of Algol? Or Mesa?

Comment by projektfu 3 hours ago

[dead]

Comment by spinningslate 4 hours ago

Wonderful article and a good fit with HN’s motto of “move slowly and preserve things” as opposed to Silicon Valley’s jingoistic “move fast and break things”.

It highlights the often perplexing human tendency to reinvent rather than reuse. Why do we, as a species, ignore hard-won experience and restart instead, often making mistakes that could have been avoided if we'd taken the time, or had the curiosity and humility, to learn from others? This seems particularly prevalent in software, where "standing on the feet of giants" is the default rather than the exception.

That aside, the article was thoroughly educational and enjoyable. I came away with much-deepened insight and admiration for those involved in researching, designing and building the language. Resolved to find and read the referenced “steelman” and language design rationale papers.

Comment by smitty1e 3 hours ago

> Why do we, as a species, ignore hard-won experience and instead restart?

Humanity moves from individual to society, not the reverse.

Some knowledge moves from the plural to the singular, top to bottom, but the regular existential mode is bottom-up, which point The Famous Article (TFA) makes in the context of programming languages.

Children and ideas grow from babe to adult. They do not spring full grown from the brow of Zeus other than in myth.

Comment by spinningslate 3 hours ago

Thanks, that’s helpful. My wife is a teacher and talks about knowledge being recreated, not relearned: IOW it’s new to the learner even if known by the teacher. Hadn’t put those things together before.

Comment by cgadski 4 hours ago

Does anyone understand how/why old HN accounts become mouthpieces for language models?

Comment by spinningslate 3 hours ago

Erm, well, the comment wasn’t AI generated, it was by me - a warts and all human. The sibling comments say TFA is AI generated and I’ll be the first to admit I didn’t spot that. Still found it interesting though.

Comment by projektfu 3 hours ago

That seems uncharitable.

Comment by DeathArrow 3 hours ago

It looks like OpenClaw started blogging. :D

Comment by phplovesong 2 hours ago

I would never work on the kinds of projects Ada is used for.

1. I would never work on missile tech or other "kills people" tech.

2. I would never work on (civil) aircraft tech, as I would probably burn out from the stress of messing something up and causing an airplane crash.

That said, I'm sure it's also used for things that don't kill people, or that don't carry a high stress level.

Comment by jazzypants 2 hours ago

> JavaScript's module system — introduced in 2015, thirty-two years after Ada's — provides import and export but no mechanism for a type to have a specification whose representation is hidden from importers.

What?

#1 JavaScript doesn't have formal types. What does it even mean by "representation"?

#2 You can just define a variable and not export it. You can't import a variable that isn't exported.
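Point #2 can be shown concretely. A closure is used here so the example is self-contained in one file; in a real ES module, any top-level `let` that is not exported is equally invisible to importers:

```javascript
// Module-private state: consumers get operations, never the representation.
const counter = (() => {
  let count = 0; // hidden: unreachable from outside the closure/module
  return {
    increment: () => ++count,
    value: () => count,
  };
})();

counter.increment();
console.log(counter.value());
console.log(counter.count); // undefined: the representation is not exposed
```

Whether this counts as the "hidden specification" the article meant is anyone's guess, but the hiding itself clearly works.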

There are several little LLM hallucinations like this throughout the article. It's distracting and annoying.

Edit: Look, I know that complaining about downvotes is annoying, but I find this genuinely perplexing. Could someone just explain what the hell that paragraph was supposed to mean instead of downvoting me?
