[Moved from Source Code Is User Interface.] See also: Source Code
I've always thought [that "Source Code Is User Interface"], especially when reading Language Pissing Match flame wars.
What the software industry needs is usability testing to show which language features are best at reducing operator (that is, programmer) error and providing other measurable benefits. Other industries do this, so why not ours?
And then, when the results have been collected and tabulated, we can design a language based on best principles, and end up with... Smalltalk :-)
Maybe, maybe not. Usability Is Hard to implement: you have to cater to the needs of Human Beings, who are fallible, prone to various slips and errors, and so adaptable that they will learn something that is Hard To Use and then not want to give it up, because they put so much time and energy into learning it in the first place. This last is the cause of most Language Pissing Match discussions.
Usability testing would be hard to do well since it would be almost impossible to use a research protocol that didn't bias the results.
You could say that about any human factors experiments, or any "soft" science experiments; you have the same danger in hard science as well. That is why we have the scientific process, peer-reviewed publications, and repeatable experiments. Unfortunately, very little "computer science" is actually science in the strict sense of the word.
The programming community has a real cultural problem with the concept that Source Code Is User Interface, and that therefore some language features are better than others. I think this is because programmers think of themselves as a mix of artistic prima donnas and tall-foreheaded geniuses who can understand complex tools that are beyond the ken of normal folk (I know I sometimes do). Unfortunately we shoot ourselves in the foot because of it. Software has the image of being unreliable and insecure because, to use the terminology Fred Brooks used in No Silver Bullet, our languages force us to work too hard on the "accidental" difficulties of programming and so we cannot concentrate on the "essential" difficulties where we should be focussing our efforts.
I get the sense that Alan Kay and Adele Goldberg did consider this when writing Smalltalk, and that some of its features, especially the class browser, are directly intended as cognitive support for the programmer. Of course Smalltalk does have its drawbacks, but they are mostly performance related, and even where Smalltalk falls down on user interface matters (learning the class library), it's still much better than most other languages.
Python Language will eventually be everywhere because Guido van Rossum has this figured out. In the middle of a culture that placed a premium on power and flexibility, he focused on legibility and ease of use. I've been using IDLE for the past month or so, and it's surprising how much of Smalltalk has been appropriated and adapted to a more Unixish language. I wonder if Python wasn't designed with Worse Is Better in mind.
Experiments in language usability have been done. One example is: www.parmita.com . The languages in question are two hardware description languages. VHDL is similar (in appearance and philosophy) to the Ada Language; Verilog is similar to C. Experienced users of each language were asked to tackle a fairly simple design problem using their favoured language. Whilst the results can be debated, the experiment does at least provide a stake in the ground around which language usability experiments can be discussed. --Dave Whipp
Very interesting, especially the bit about how the various language vendors were all over him trying to spin the results of the test one way or the other.
Does anyone besides me think that the concept of owning a language is just a tad bit strange...
This isn't going to be a very scientific discussion, but what language features are more usable than their equivalent in other languages?
I'll have a stab. Feel free to add your own. Maybe we can actually perform some experiments to prove/disprove these assertions?
Assignment
The assignment operator should not look like the equality operator. E.g. the ":=" of Pascal, Modula, Ada and Smalltalk is less error prone than the "=" of C, C++ and Java. (I believe that there have been studies that show this, but I have been unable to track down references).
''This was a problem in C but is pretty much eliminated in Java, since assignments aren't expressions in Java.'' [AFAIK, assignment IS an expression in Java, but because type checking is stronger in Java (i.e. "if (x = y)" is a type error for everything except boolean x and y), most "if (x = y)" errors are caught by the type-checker.]
Perhaps the best would be = for comparison and <- for assignment.
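For what it's worth, Python takes both precautions: assignment is a statement rather than an expression, so the classic slip is rejected outright, and when assignment expressions were eventually added (the "walrus" operator, Python 3.8 and later) they were given the visually distinct ":=". A minimal sketch:

x = 0                # assignment is a statement; it yields no value
# if x = 0:          # SyntaxError -- Python rejects the classic C slip outright
if x == 0:           # comparison must be spelled ==
    print("x is zero")
while (line := input("> ")) != "quit":   # 3.8+ assignment expression: a distinct symbol, not a reused one
    print(line)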
Function or Method Calls
Smalltalk's keyword-based method syntax is more readable than the traditional way of passing tuples of arguments to a function. This is especially true when deciphering someone else's code.
Highly debatable. I've tried languages like Smalltalk, Ml Language, and Forth Language (way back when), but gave up largely because the syntax has never seemed natural for me - things run together due to the lack of grouping symbols. On the other hand, the Lisp Language goes too far to the other extreme. A moderate amount of grouping symbols seems best.
''On the other hand, keywords like 'and' and 'or' seem better than symbols.''
In Python, you can do this:
obj.f(x=12, y=24)
In Objective Caml, you can do this:
obj#f ~x:12 ~y:24
In Visual Basic, you can do this:
ThisDocument.SaveAs "foo.doc", EmbedTrueTypeFonts := True
I.e., a combination of Smalltalk-like and C-like. IMHO, this is very readable.
Especially since a lot of Visual Basic for Applications classes have methods with lots and lots of optional arguments. Without named arguments, the above example would have to be written like this:
ThisDocument.SaveAs "foo.doc",,,,,,,True
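Python's keyword arguments buy the same readability. Sketching a hypothetical save_as function with roughly the shape of Word's SaveAs (the parameter names and defaults here are invented for illustration):

def save_as(filename, file_format=None, password="", write_password="",
            read_only=False, add_to_recent=False, embed_truetype_fonts=False):
    ...   # save the document

save_as("foo.doc", embed_truetype_fonts=True)   # name the one argument you care about; skip the rest

No counting of commas required, and the call site documents itself.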
Blocks
Defining blocks by indentation, as is done in Miranda, Haskell and Python, is less error prone than defining blocks once for the compiler with braces or begin/end keywords and again for the human reader with indentation. Moreover, Haskell optionally still allows {} and ;, in order to facilitate writing automatic code generators (things like Yacc/Lex). See Indentation Equals Grouping.
Alternatively: Use different delimiters for different kinds of blocks, so that if the programmer writes "begin foo ... [50 lines of code deleted] ... end bar", the interpreter or compiler will catch it.
But of course having 50 lines of code between "begin foo" and "end bar" in the first place is very bad style. Verbose block-open/block-close constructs penalize writing lots of small functions and so make multi-page functions the easier path.
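To make the claim concrete, here is a fragment where the grouping the compiler sees and the grouping the reader sees cannot disagree, because they are the same indentation (plain Python, nothing assumed beyond the language itself):

prices = [3, -1, 4]
total = 0
count = 0
for price in prices:
    if price > 0:
        total += price   # the extra indent puts this inside the if
    count += 1           # the dedent puts this back inside the for only

With braces, the compiler reads the braces and the human reads the indentation; when the two drift apart, the human loses.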
Nouns and Verbs
(moved from Adjectives And Adverbs:) All natural human languages distinguish nouns and verbs. A procedure or operator that both returns a value and has side effects is acting like a noun and a verb simultaneously, and therefore is apt to appear in buggy code. Think of these classic C mistakes:
if (x = 0) { doStuff(); }            /* assigns 0 to x, then tests the result: doStuff() can never run */
#define MAX(x,y) ( x > y ? x : y )   /* evaluates an argument twice (MAX(a++, b)) and leaves the arguments unparenthesized (MAX(a & b, c)) */
/* folks with more experience than I at making C mistakes can add to this */
Is this a language feature, or a use of the language? Which languages explicitly separate nouns and verbs? And how do they stop the user making a noun appear to be a verb, and vice versa? (And isn't MAX an adjective?)
No widely-used languages explicitly separate nouns and verbs (except, perhaps, for the purely functional languages, in which verbs are excluded entirely). I think this is part of the problem. For example, suppose the syntax of C treated x = 0 as a pure verb (that is, it returns no value) and treated x == 0 as a pure noun (that is, it can have no side effect). Then if (x = 0) would be a compile-time error.
What about Perl? $scalar_noun, @plural_noun, &verb
Every language distinguishes between variables and procedures; that's not what I'm talking about.
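One discipline that approximates the noun/verb split is Command Query Separation: every routine is either a pure query (returns a value, no side effects) or a pure command (side effects, no return value). A sketch of the idea in Python, with an invented Stack class; contrast the built-in list.pop(), which mutates and returns at once:

class Stack:
    def __init__(self):
        self._items = []

    def top(self):           # noun: answers a question, changes nothing
        return self._items[-1]

    def pop(self):           # verb: changes state, answers nothing
        del self._items[-1]

    def push(self, item):    # verb
        self._items.append(item)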
Depth Recursion
The mechanism in the human brain that parses sentences has a very shallow stack. Consider the following sentences (the third from Marvin Minsky):
The mouse died.
The mouse that the cat chased died.
The mouse that the cat that the dog chased bit died.
The dog chased a cat, who bit a mouse, who died.
Isn't that third one similar to Forth Language? (Or some constructs in German Language, or Greek; see The Miracle Of And.)

I was thinking of the sort of hard-to-read Lisp programs that gave Lisp a bad name, but now that you mention it...

For those confusing Lisp programs, you just need to learn to read them. Most of the standard complex-looking recursive functions are easy if you think of them in terms of: what is the result of one "loop" through the function, what changes between "loops", and how does it decide to stop? The Little Schemer is a handy tutorial on this.
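To illustrate those three questions on a textbook recursive function (Python here, but the reading strategy is language-neutral):

def length(xs):
    if xs == []:               # how does it decide to stop? on the empty list
        return 0
    return 1 + length(xs[1:])  # one "loop" contributes 1; what changes is that the list shrinks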
Remember the Forth Language bumper sticker: "I Forth (Heart)"? Or the one that said "Forth (Heart) if honk"? (Postfix, naturally.)
And what preposition-retentive critic can quickly parse: "What did you bring the book I don't like to be read to out of up for?"
All this reminds me of the little-known but excellent fact that the following is a perfectly grammatical English sentence: "Oysters oysters oysters split split split". I shan't explain how to parse it just yet, so as to give any curious readers the chance to work it out. I've read code that feels rather like that sentence... :-)
Rosenfelder's Advanced Language Construction quotes Anne de Roeck in response to the idea of a limit on the size of the mental stack for interior recursion: "But don't you find that sentences that people you know produce are easier to understand?"
I think that the focus on individual language features is misplaced: A language is "better" than another if it makes it easier to solve the problems you're trying to solve.
So, the "better or worse" rating of a tool or language depends heavily on the problem you chose to solve with it.
If you're producing lots of reports, consider using a report generating tool. It will probably be much more effective at its task than C/C++, COBOL, Visual Basic, etc, etc, etc... But the report generating tool will be useless for business screens, server-side business logic, or device drivers.
It's been my experience that the most powerful tools, the ones that give you the greatest leverage -- and hence the highest productivity -- have a more limited scope. That is, they're less "general purpose." So, you trade development efficiency/productivity for a limited scope of solutions you can deliver. -- Jeff Grigg
Agreed, but this page is about the User Interface of a language, and that is made up of these individual syntactic features. That is, the page is not discussing what kewl features we would like to have in a language, but how one should best represent features so that humans can better understand them and understand how they are combined.
But the user interface of a language may actually include a real "user interface" -- like Visual Basic's GUI for designing forms. (And there are similar tools for a number of other languages.) I usually find dragging and dropping an easier way to design business forms than, for example, putting lots of X,Y coordinates into tables in C/C++ code.
Thus, a language, including its development environment, can help or hinder your efforts, depending on the kind of support it gives. -- Jeff Grigg
Personally, I find knowing a good Graphical User Interface framework and typing code to be more efficient than using a GUI builder, especially when you have to tie the application logic to the GUI. However, visual tools cover only a tiny part of programming, and one that is not the major area for real bugs. When it comes down to it, GUI programming (at least the kind you can do with a mouse and dialog boxes) is not much more than instantiating objects, setting properties of those objects, and putting objects inside other objects.
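A Tkinter fragment makes the point: nothing below is more than instantiating objects, setting their properties, and nesting them inside one another, which is exactly the part a form designer automates:

import tkinter as tk

root = tk.Tk()                            # instantiate the window
frame = tk.Frame(root, padx=10, pady=10)  # instantiate, set properties, nest inside root
label = tk.Label(frame, text="Name:")     # nest inside frame
entry = tk.Entry(frame, width=30)
frame.pack()
label.pack(side="left")
entry.pack(side="left")
root.mainloop()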
Getting the logic of a program correct is the hard part and understanding existing logic is even harder. Those tasks are helped or hindered by how well or badly a language makes that logic comprehensible to a human being.
Our industry has the habit of not addressing this issue, but instead inventing new technologies (GUI builders, Wizards, visual programming, CASE tools) and promising that they will make programming easier. However, when it comes down to it, complex logic and dynamic relationships have to be programmed in a textual language. The simple features of a language do have an impact on the quality of software and the ease of its development.
That's not to say that there may not be more fundamental improvements we can make to languages, such as Aspect Oriented Programming. However, most of us are still using third generation languages based on the C syntax, the awkward features of which have been known for almost 30 years!
It is important to distinguish between experienced and newbie users. Different people find different interfaces better. In GUIs, menus are good for new users and for rarely used features; power users will [nearly] always prefer keyboard shortcuts. In a language, symbols are good for regular users, but very cryptic for new users and rarely used features. A good example is perl: the symbols in $scalar, @plural, etc. are intimidating for new users but after a couple of hours seem perfectly natural. I recently had a reasonably experienced C programmer ask me what "^" means -- he'd never used it and guessed it was "raise to power"! Symbols are good for power users, words for others.
This is exactly what was found with COBOL. It was designed to be easy to read by non-programmers, and so used thousands of verbose english keywords ("ADD ITEM-PRICE TO SALES-TOTAL"). However, studies found that programmers worked better with languages that used mathematical symbols ("sales_total := sales_total + item_price").
I think it is good to use a Model View Controller architecture for languages! If you can have several ways of viewing and manipulating the source code, then it can be more natural. For example, you might choose between the Business Object Notation and the Eiffel Language; or between state diagrams and state tables; or between ADFDs and ASL (in the Shlaer Mellor Method). Languages that allow you to choose between concise and verbose representations are nice too.
This is partly a development environment issue, but also a core language issue. If a language is defined largely by its syntax, then it is difficult to provide 1:1 mappings into (or out of) other representations. Even something as complex as a subset of the UML would have difficulty providing a 1:1 mapping of C++.
I like Dave's idea of having multiple views on the language, although I can see that it would be very difficult to do well with most of the existing languages. Tools like Together or Rational Rose do cover some of the problem domain in that they give a better presentation of structure than is possible to acquire from reading the source code and building a diagram in your head, and Together does support some of the switching back and forth but neither addresses the ease of creating correct logical structures using the language.
Some tools that do try to address that are Net Rexx and Jpython, which both layer a simplified syntax over the JVM. Any high-level language provides a more convenient interface to a lower-level language, all the way down to assembly being a convenient interface to machine language.
As for the whole GUI bit earlier in the thread:
GUIs work for graphics, but they are clumsy and inefficient for modelling logic. Just think of how few graphical programming environments have succeeded in the market versus how many have been tried. (Note: the only graphical programming environments I'd consider to be effective are Macromedia Director, and an environment for CNC machine paths a friend showed me; narrowly defined applications both.) -- Larry Price

For gate-level "logic" design, when given a choice between schematics (GUI) or a netlist (text), most hardware engineers would choose the schematics. -- dw
... and Macromedia Director uses a scripting language, Lingo, for specifying animation and logic that is more complex than can be described with its GUI (i.e., most of it).
I think the GUIs may be a bit of a Red Herring. You can get multiple non-GUI views. As an example, consider Skill (IL) vs Lisp. Skill uses Lisp semantics for a scripting language within a family of CAD tools. However, it has a C-like front-end syntax that uses Infix Notation for various common expressions. For example, instead of writing "(setq c (+ a b))" you can write "c=a+b". Lisp experts (power users) might not appreciate the advantage that this gives to new users.
I first used Skill as an intern, about 10 years ago. At the time I knew Pascal and C (plus assembler). The learning curve for the new language was very easy -- basically just a few semicolon rules. As I progressed, I learned to appreciate the underlying power of the Lisp engine. I was soon able to think in Lisp. But for those first few days, whilst my brain didn't know how to scan Lisp code, the C look really helped. I think that the Smalltalk community might learn a lesson from this. --dw
An interesting and under-researched point going beyond Language Usability is the Human Factors involved in security holes. Why do developers make the same mistakes in their Web applications over and over again (see e.g. www.owasp.org )? Why do developers love languages that allow them to implement buffer overflow bugs in the most efficient manner?
-- Sven Tuerpe
There has been a little work in the language usability area. The Psychology Of Programming people do some, and have developed Cognitive Dimensions, a framework to analyse the usability of visual notations. Also, there's Alice (at www.alice.org ), where they developed an API for 3D programming using many, many usability studies (>100 studies). They were so successful that primary school kids can write interactive 3D apps using their tool.
-- Tim Wright
See original on c2.com