Ada Language

Recently Updated:

The current standard is Ada 2012. The latest revision adds preconditions, postconditions, type invariants, expression functions, and quantified expressions. This makes it possible to specify, directly in live code, things that would otherwise require external tools (e.g. correctness provers and documentation generators), and it allows the compiler/runtime to enforce reliability (as opposed to relying on misleading or outdated comments). The Language Reference Manual is at www.ada-auth.org
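
As a minimal sketch of three of those features (an expression function, a quantified expression, and Pre/Post contracts; type invariants are omitted), with names like Int_Array, Is_Sorted, and Search invented for illustration and the function bodies left out:

  package Contracts_Sketch is

     type Int_Array is array (Positive range <>) of Integer;

     --  An expression function whose body is a quantified expression:
     --  true exactly when A is sorted in ascending order.
     function Is_Sorted (A : Int_Array) return Boolean is
       (for all I in A'First .. A'Last - 1 => A (I) <= A (I + 1));

     --  Pre- and postconditions written as aspects; with assertion
     --  checks enabled, the runtime verifies them on every call.
     function Search (A : Int_Array; X : Integer) return Natural
       with Pre  => Is_Sorted (A),
            Post => Search'Result = 0 or else A (Search'Result) = X;

  end Contracts_Sketch;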


From the Jargon File:

A Pascal-descended language that was at one time made mandatory for Department of Defense software projects by the Pentagon. Hackers are nearly unanimous in observing that, technically, it is precisely what one might expect given that kind of endorsement by fiat; bloated, crockish, difficult to use, and overall a disastrous, multi-billion-dollar boondoggle (one common description was "The PL/I of the 1980s"). The kindest thing that has been said about it is that there is probably a good small language screaming to get out from inside its vast, elephantine bulk.

Opposite stance, from Slashdot:

It's quite apparent that the evolution of the C family of languages (C, C++, Java, C#) is converging on a language very like Ada, except unfortunately as a kludgepile rather than a clean design.

The Ada FAQ list tries to clarify some of the more common misconceptions about Ada (www.adahome.com):

Ada is an advanced, modern programming language, designed and standardized to support widely recognized software engineering principles: reliability, portability, modularity, reusability, Programming As A Human Activity, efficiency, maintainability, information hiding, abstract data types, concurrent programming, object-oriented programming, et cetera. All Ada compilers must pass a validation test.

Ada is a programming language designed to support the construction of long-lived, highly reliable software systems. The language includes facilities to define packages of related types, objects, and operations. The packages may be parameterized and the types may be extended to support the construction of libraries of reusable, adaptable software components. The operations may be implemented as subprograms using conventional sequential control structures, or as entries that include synchronization of concurrent threads of control as part of their invocation. The language treats modularity in the physical sense as well, with a facility to support separate compilation.
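
For instance, a hedged sketch of a parameterized package (the names Stacks, Element, Push, and Pop are invented here, and the package body is omitted):

  generic
     type Element is private;    --  any nonlimited type the client supplies
  package Stacks is
     type Stack is limited private;
     procedure Push (S : in out Stack; X : Element);
     procedure Pop  (S : in out Stack; X : out Element);
  private
     type Element_Array is array (1 .. 100) of Element;
     type Stack is record
        Data : Element_Array;
        Top  : Natural := 0;     --  empty by default
     end record;
  end Stacks;

  --  A client instantiates the reusable component for a particular type:
  --  package Integer_Stacks is new Stacks (Element => Integer);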

The language includes a complete facility for the support of real-time, concurrent programming. Errors can be signaled as exceptions and handled explicitly. The language also covers systems programming; this requires precise control over the representation of data and access to system-dependent properties. Finally, a predefined environment of standard packages is provided, including facilities for, among others, input-output, string manipulation, numeric elementary functions, and random number generation.
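
For example, a small self-contained sketch of explicit exception handling (the exception name Not_Found is made up):

  with Ada.Text_IO;
  procedure Exception_Sketch is
     Not_Found : exception;         --  a user-defined exception
  begin
     raise Not_Found;               --  signal the error...
  exception
     when Not_Found =>              --  ...and handle it explicitly
        Ada.Text_IO.Put_Line ("recovered from Not_Found");
     when Constraint_Error =>       --  predefined, e.g. a failed range check
        Ada.Text_IO.Put_Line ("a language-defined check failed");
  end Exception_Sketch;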


Why was Ada developed? The truth can now be told ;->


Has anyone else here actually used Ada? I wrote in Ada for two years. My experience was that its extremely consistent data-typing rules and high readability gave me the highest productivity and lowest bug rate of any environment I've used before or since. Calling Ada "bloated, crockish, difficult to use" is ignorant; it has a very small number of keywords and constructs. Ada makes TASK a part of the language, in contrast to Java exposing Threads through a library; big deal. Tasks, exceptions, and generics (a construct we only got in Java with J2SE 5) may look like language bloat to some; the interminable cavalcade of Java libraries looks a bit hefty to me. I hate worrying about counting parentheses, braces, and commas, and in Ada I never had to. I would say that in Ada, more than in any language I've ever personally used (or seen), it is possible to truly express and recognize DESIGN.

According to the Shootout results, Ada (using GNAT) has very good speed and memory usage. I hear the idea of Ada being "sluggish" and "bloated" came from the crap compilers that first started to appear (after the language was already defined). Later (someone said around 1989), much better compilers came out.


Ada was named for Ada Byron, Lady Lovelace (1815-1852), daughter of Lord Byron the poet, commonly regarded as the first computer programmer because she developed a form of code for Charles Babbage's Analytical Engine. It computed Bernoulli numbers, a passion among mathematicians of the time. The first Ada standard was Ada 83. The second, Ada 95, added mainly object orientation. Ada 2005 adds a C++-like dot notation for dispatching calls on objects, anonymous access types, improved generics, libraries including one for containers (vectors, sets, maps), and much more.


I personally don't use Ada as I mostly program in Java and C++, and Ruby at times. I've looked into Ada, and although I currently don't have a use for it, I found some features that are actually quite useful. I don't know of another language that has entries, for example, such that you can have one thread waiting for another thread, but only if it doesn't take too long or if another entry doesn't open first. -- Christophe Poucet
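
What's being described is Ada's timed entry call: a caller waits on another task's entry, but abandons the rendezvous after a deadline. A minimal sketch, with the names Worker and Start invented for illustration:

  with Ada.Text_IO;
  procedure Timed_Call_Sketch is
     task Worker is
        entry Start;              --  callers rendezvous with Worker here
     end Worker;

     task body Worker is
     begin
        delay 5.0;                --  pretend to be busy for a while
        select
           accept Start;          --  serve a caller if one is still waiting
        or
           terminate;             --  otherwise allow a clean shutdown
        end select;
     end Worker;
  begin
     select
        Worker.Start;             --  wait for the rendezvous...
     or
        delay 2.0;                --  ...but only for two seconds
        Ada.Text_IO.Put_Line ("Worker was too busy; gave up waiting");
     end select;
  end Timed_Call_Sketch;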


Henry Baker also wrote a bunch of Ada-related papers.


Is Ada overly verbose? This is Hello World in Ada:

with Text_IO;

procedure HelloWorld is
begin
   Text_IO.Put_Line("Hello world!");
end HelloWorld;

Compare with the other versions offered on Hello World In Many Programming Languages.

Hello World is a terrible metric of verbosity. Hello World is not representative of any real-world program for any real-world problem domain.

Nonsense. It's a measure of how much arbitrary crap the language makes you put up with even when it has nothing whatsoever to do with solving the damn problem. A language where Hello World is a huge program is a bondage freak's dream.

Agreed. And even Qomp Language addresses this problem of the hideous complexity required by Ada and other Algol-family languages.

I see Ada as a verbose language; however, unlike, say, Java Language, the verbosity actually benefits readability.

One opinion: use the verbose, hideous syntax for teaching, then migrate to a real language like Modern Pascal or Qomp Language, which isn't as complex as Ada and is more widespread for typical everyday programming on almost all platforms.


Verbosity and boilerplate are closely related, but too many people confuse the two. Ada may be verbose, but from what I can see above, it doesn't look like it requires much in the way of obligatory boilerplate. However, compare the above program with a Cee Language implementation of Hello World written for use in a Component Object Model environment. C is hardly considered verbose, and yet the boilerplate involved in such a program would make even Cobol Language programmers chuckle.

Moreover, languages with large boilerplate requirements tend to have a shallower slope in their problem-complexity-vs-lines-of-code graphs. [Is there a study I can peruse regarding this?] Hence, languages with large boilerplate requirements will obviously be ill-suited to simple tests like the one above, but might find niches in very tough problem domains, such as mission-critical medical and aerospace applications (which, as it turns out, is precisely where Ada finds its niche).

Ada is arguably more successful in its domain not because of its ridiculous verbosity, but because it has higher-level features, separation of concerns, safety checks, development teams focused on embedded programming, aircraft specialists and programmers working on the compilers, and so on. One could still take all of these same Ada experts, brainwash them, and have them (or another team with more sense) implement a more reasonable syntax and notation while keeping the same range checks, contract checks, safety checks, stack abilities, and separation of concerns (type declarations, arrays vs. pointers to pointers, by-reference vs. by-value or in/out parameters, modules, high-level threading abilities, sub-modules, packages).

And, invariably, what will happen is a language that looks amazingly like Ada as it exists now, or like Perl Six, which makes me want to vomit. There comes a time when words are more appropriate than symbols.

In fact, it has already been demonstrated visually, on other pages of this wiki, that a baroque, hideous language can be trimmed down to a notation that is arguably just as readable, or more so.

To "prove" implies an objective, measurable reality to which everyone agrees upon. You can't prove something visually, because all aesthetics are subjective. I reject your claim out of hand on that basis.

Ada syntax, and even Oberon-style syntax, is a borderline case (or, in some people's opinion, a clear case) of disobeying Dijkstra's recommendations:

"Then there was COBOL's creed that if your programming language was similar enough to English, even officers would be able to program." --www.cs.utexas.edu

The desire is not to program in English. However, the use of words, instead of bizarre symbology reminiscent of the glyphs found on the I-beam at the Roswell crash, is far preferable. If, as history demonstrates, libraries are preferable to language features, why do functions have names instead of symbols? What does the symbol for the keyword formerly known as struct look like?

If, as history demonstrates, libraries are preferable to language features, why do functions have names instead of symbols?: The former point (that 'history demonstrates' libraries to be preferable) is questionable (Are Design Patterns Missing Language Features? Much of the time, the reason for verbosity in a language is that the language forces you to repeatedly jump through the same hoops to solve the same problems with slightly different parameters - due, seemingly, to missing features requiring you to repeatedly apply design patterns). In any case, functions tend to have names because designing languages that supported libraries of symbols, along with appropriate order-of-operations (all without interfering with the default language symbol set), was beyond the state of the art at the time the languages in use today were actually built (the research that makes it really feasible wasn't performed until the early nineties). Plus, most 'new' language designs are derived largely from the language the designer already knows... it appeals to familiarity.

"...the vision that automatic computing should not be such a mess is obscured, over and over again, by the advent of a monstrum that is subsequently forced upon the computing community as a de facto standard (COBOL, FORTRAN, ADA, software for desktop publishing, you name it)." --Dijkstra

I suspect this quote was taken out of context, particularly with his inclusion of desktop publishing in the list.

"We have to let the symbols do the work, for that is the only known technique that scales up." --Dijkstra

Proof? I think APL demonstrably serves as a counter-proof.

"I have warned against the ...tendency to design programming languages so that they are easily readable for a semi-professional, semi-interested reader. (Symptoms of this tendency are languages the vocabulary of which includes a wild variety of English words to be used in a nearly normal sense, and some translators that even allow a steadily expanding list of synonyms and misspellings for these words. Particularly, languages designed under commercial pressure have suffered seriously from this tendency.) It looks so attractive: Everybody can understand it immediately. But giving a plausible semantic interpretation to a text which one assumes to be correct and meaningful, is one thing; writing down such a text[.....]expressing exactly what one wishes to say, may be quite a different matter!" --www.cs.utexas.edu

Every domain has its own sub-dialect of English. Legalese is a great example. Programming should be no different.

So you would agree with an assertion to the effect that future programming languages should support the construction and utilization of distinct sub-dialects for each programming domain? (To every programming domain its own sub-dialect of the common language - i.e. Extensible Programming Language.)

To the extent that it allows the language to assist good programmers, rather than cater to the needs of neophytes, yes, absolutely. However, a language should neither mandate nor even necessarily encourage the use of esoteric symbols. (Some discussion moved to Terse Language Weenies.)

Also, words are symbols. When a calculus student looks at a differential equation and sees dx/dy, he doesn't think of a fraction of infinitesimals; he sees a single concept. Though constructed of three distinct sub-symbols, dx/dy is a single concept, that is, a word. Finally, Dijkstra Isnt God. Stop pretending that he is the grand purveyor of all computer-related wisdom. It gets old.


References

I strongly recommend Ada As A Second Language by Norman Cohen ([ISBN: 0070116075], [ISBN: 978-0070116078]). It is 1133 pages, and easy to read, but not overly verbose.

For some downloadable books on Ada see:

The GNAT compiler and other tools (Ada Web Server, PolyORB, GtkAda, ...) can be found at libre.adacore.com

There is a great Ada-like scripting language, the Business shell (BUSH): www.pegasoft.ca

BUSH vs. Ada 95

Ada 95 (named after the world's first programmer) is a multithreaded, exception-handling, object-oriented programming language used for large, high-integrity projects. It is known for abundant features, high performance, strong checking, and readability.

The BUSH shell is not an Ada interpreter. BUSH uses AdaScript, a subset of the Ada 95 language with additional features specifically for interactive shell sessions. Because of its Ada 95 background, BUSH scripts are easy to create, maintain, and debug, and they can be compiled into fast executable programs using an Ada 95 compiler.


I thought this was an interesting quote:

Ada didn't have type extension and dynamic binding, and so missed the boat on the object technology revolution....

You have to understand the climate of the times. Ada was largely a reaction to languages like Fortran. They [the commissioners -- the DoD] wanted once and for all to determine everything at compile time, so you could eliminate errors like the parameters of an invocation of a subprogram not agreeing with the parameters of the declaration. Certainly not an unreasonable desire!

It seems like a reasonable design goal to me for the intended niche, even now.



See original on c2.com