Language Pissing Match

"My language can beat up your language"

Choose your language. Fight to the death.

See also:

(Many topics on dynamic vs. static typing)


Is anyone else sick and tired of seeing this term everywhere, especially on this Wiki? It seems like "Language Pissing Match" gets used to describe almost any comparison of programming languages, regardless of the tone the discussion takes or the facts presented in it.

I think it's rather humorous, actually.

It's like calling someone an anti-Semite or comparing somebody to Hitler; a tool used to stifle serious debate and discussion.

Some of the pages linked to above contain very interesting discussions and novel ideas, yet some smartass still condemned them all as "pissing matches." Apparently making an analogy to urination proves how mature you have become and how superior you are to everyone else.

The truth is that some languages are better at certain tasks than others, and some are just plain better period. If you don't believe that, then try writing desktop applications in straight C or anything at all in COBOL.

The state of programming languages will never advance unless we are able to objectively compare existing languages with one another and identify and isolate their failings. If you are somebody who feels the need to chime in during one of these discussions and accuse everyone from atop your high horse of being involved in a "pissing match," please consider keeping it to yourself. -- Steve

I agree that the term should perhaps be scrapped for something more presentable. It is generally meant to convey a sense of never-ending emotional debates among proponents of various languages or tools. But, I don't think objectivity will ever play a big part in solving or preventing such battles, because Most Holy Wars Tied To Psychology. That is just the nature of the beast. -- top


Don't worry. It'll be great. We'll Start From Scratch in Smalltalk rather than COBOL.

...if the apps matter that much, maybe Eiffel would be a better bet.

Serious question: Why would Eiffel be better than Smalltalk?

...depends on what "better" means. The comment about Eiffel comes from the fact that Y2K problems at least partly stem from undocumented assumptions that change in the medium to long term (decades). Smalltalk excels by being flexible and relatively easy to refactor, but it doesn't focus on type checking and Design By Contract; Eiffel does. Y2K is about implicit assumptions that change; Design By Contract addresses this directly, and Eiffel supports Design By Contract from the ground up.
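For what it's worth, here's roughly what making such an assumption explicit looks like. Eiffel would state the contract in a require clause; this is just a hand-rolled sketch in Java syntax (class and names invented for illustration), since Java has no native Design By Contract support:

```java
// Hypothetical sketch: approximating an Eiffel-style precondition by hand.
// Eiffel would write "require year >= 1000" in the routine's contract;
// here we check explicitly and fail fast when the assumption breaks.
public class ExpiryDate {
    private final int year;

    public ExpiryDate(int year) {
        // Contract: callers must pass a full four-digit year.
        // A two-digit-year assumption hidden here is exactly the kind of
        // implicit assumption that caused Y2K trouble.
        if (year < 1000) {
            throw new IllegalArgumentException(
                "expected a four-digit year, got: " + year);
        }
        this.year = year;
    }

    public int year() { return year; }
}
```

The point being that the contract documents the assumption at the boundary, instead of leaving it buried in the representation.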

I don't think that Design By Contract would have helped with Y2K. To write your contracts you have to think there will be a problem. Nobody thought there would be a problem; they would have made explicit assumptions rather than implicit ones. Yeah, DBC would be very good at showing you where your software fails... after the Y2K date rollover.

I can't answer why one would use Eiffel. I'd use Smalltalk because it's more productive, for me, than C++ or Java. And yes, I've used all three. --Ron Jeffries

Bertrand Meyer's account of who's using (his version of) Eiffel can be found at: eiffel.com

Dunno whether Eiffel is theoretical or practical. But isn't Chrysler paying people with a Smalltalk program? Aren't there very large and significant Smalltalk trading programs running on Wall Street? Doesn't TI have a semiconductor fab running in Smalltalk? Aren't those fairly practical?

Does the distinction between whether a language is theoretical or practical really matter? (In theory, no. In practice, yes. -- SH) So long as the language is usable for real projects, does it matter where it comes from? (After all, Smalltalk came from Xerox Parc - not really a "practical" environment when compared with the origin of C++.)

There's definitely a trade-off between short-term productivity, functionality, and the sort of effort that has to be made with assertions and Design By Contract. At one extreme you prove every little thing, and at the other extreme you speculatively hack. You have to decide where to pitch your development according to the forces at play: time to market, longevity of the application, complexity of the app domain, criticality of the application... There's no harm with C++ or Java if you make an informed choice. (Written with no intention of stirring up a Language Pissing Match ... honest.)

Ah, you're all full of baloney. The future belongs to Perl, which has everything all them other languages has and a whole lot more. You can always write it faster, better and freer in Perl - no tradeoffs, no compromises, no bulldust. -- Peter Merel

Perl is a compromise.

That's because Larry Wall is a linguist first and scientist second (er, third maybe). As with natural language, his intention was to roll all the best words (in Unix) into one hairball (he would say snowball). I love it; and it almost worked.

But missing values are not zero, Larry!

The road to hell is paved with melting snowballs. -- Larry Wall (quoted by Paul Taney)

The future did belong to Perl, but PHP came along and stole it away ;)


I couldn't just let this one fly, sorry. Php Language is okay to get the job done, but it's far from "complete" as an OO language. Now that the new Zend Engine is out, there's a little hope. Version 2 has support for namespaces (inner classes, actually), private members, cloning, exceptions ... a bunch of good stuff all in all. But it's only available in PHP v4.3, which hasn't been released as I write this. Also, if you want to compare it to Perl, you probably don't want to compare CPAN to PEAR, unless you're a masochist. PHP has its place under the sun, under a rock :) -- Robin Millette

Sorry, old boy (or gal), but PHP is here to stay. It is supported all over the world and doesn't even need a shebang line. Therefore, one script can supply multiple hosts - just like Javascript. Oh, well.


I'm curious where Ada95 fits in the above Smalltalk/Eiffel comparison. I know it's much more ... conservative ... than C++. -- Wayne Carson


"why didn't Smalltalk ever take off in the mainstream of OO languages?" I don't know. When you realize OO itself was more or less invented in 1967 and didn't seem to take off until the 1990s, the specific failure of Smalltalk isn't so surprising. You might also ask why Lisp didn't take off. This is a very conservative industry. I expect Smalltalk demanded too much power for the time. Having been rejected, it never got reconsidered.

C++ took quite a while to catch on, too. On the PC, the only compilers were two-step things that generated C, which had to be compiled in turn. I personally date its success from Zortech's single-step compiler, whose sales shocked everyone (much as Borland's Pascal did earlier). It was a while before Borland caught up with a compiler of their own. Microsoft were years behind that, and their compiler has lagged in features for most of its history - this shows up in the design of Microsoft's class libraries. Nowadays Microsoft is almost synonymous with C++, which is ironic given how slow they were to embrace it.

"Why did someone have to invent a Java?" C++ succeeded partly because it was close to C. Java learned that lesson. In my opinion it offers no improvement over other contemporary languages, especially Sather and Eiffel, but the hordes of C/C++ programmers would not accept anything that looked so different. So Java was necessary to move the C/C++ world onto garbage collection and so forth.

The Java Byte Code is another matter. Arguably the Internet needs an efficient, secure, portable bytecode. The timing was just right to exploit web browsers and anti-Microsoft feeling. The big shame about the bytecode is that it is so tied to Java (although [www.adahome.com Ada can be compiled to Java bytecode]); now it has crowded out other, more general virtual machines.

If you're asking about Java specifically in the context of Smalltalk, then the main difference is the static type checking. A big section of the industry believes static, manifest types are desirable for software engineering.

This has been my experience also. Any theories as to When Is Manifest Typing Considered A Good Thing?


If so many people think static, manifest types are so important, then why is Visual Basic so popular?

[I've heard it said that VB was the real competition for Smalltalk, prior to Java. -- Paul Chisholm]

You can code VB programs using (mostly) static, manifest types.

Set "Option Explicit" in every form, code module, and class module.

Avoid using the "Variant" data type.


I just have to say that I am totally entranced with Smalltalk right now, despite never having programmed in it. I've worked in C++ for years and years and years, but through conversations here and elsewhere the ramifications of garbage collection and late binding have finally sunk in for me. In C++ we go through a lot of grief to make subclassing the same as subtyping. In Smalltalk it seems that they do not have to be tied at all. I really look forward to learning it. It just seems that the typed languages are, for lack of a better word, "diluted" OO.

Side note on the Language Pissing Match and Y2K: I remember reading that one reason why so many financial folk are into Smalltalk these days is because the integers do not overflow. Crystal clear. One less way to lose money. Also, it occurs to me that in languages as dynamic as Smalltalk, you are less inclined to paint yourself into a box with the fixed-length problem: "Oops, I only allocated enough space for two digit years." Further, as Ron Jeffries has pointed out, if objects know how to save themselves, then that decision is localized.
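For the curious, here's the overflow point in miniature. Smalltalk integers transparently grow into arbitrary-precision objects; fixed-width machine integers silently wrap. A small illustrative sketch using Java's int and java.math.BigInteger (the latter behaving roughly like Smalltalk's large integers):

```java
import java.math.BigInteger;

public class OverflowDemo {
    public static void main(String[] args) {
        // Fixed-width ints silently wrap: 2^31 - 1 plus one goes negative.
        int fixed = Integer.MAX_VALUE;
        System.out.println(fixed + 1);               // -2147483648

        // BigInteger, like Smalltalk's large integers, just grows.
        BigInteger big = BigInteger.valueOf(Integer.MAX_VALUE);
        System.out.println(big.add(BigInteger.ONE)); // 2147483648
    }
}
```

One less way to lose money, indeed: no account balance quietly going negative at 2^31 pennies.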

I think I'll have to get one of the Definitive Smalltalk Books.


Smalltalk is more dynamic. This shows up in its more powerful reflection mechanisms (which few programmers use, or need to use) and in its more reactive programming environment (though Java ones are improving). It has a more mature (and complicated) class library. It is easy to create new control structures with blocks. The Smalltalk environment is more customizable and open (not entirely good) and Smalltalk programs are easier to change. Last I heard, the best Smalltalk VMs were still faster than the best Java VMs, though that will certainly change before long if it hasn't already.
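To make the blocks point concrete: in Smalltalk, whileTrue: is just a message sent to a block, so defining a new control structure is ordinary code. Here's a rough, hypothetical approximation using modern Java lambdas (names invented for illustration; Smalltalk itself needs no such scaffolding):

```java
import java.util.function.BooleanSupplier;

public class BlockDemo {
    // Approximates Smalltalk's "[cond] whileTrue: [body]": both the
    // condition and the body are passed in as blocks (lambdas).
    static void whileTrue(BooleanSupplier condition, Runnable body) {
        while (condition.getAsBoolean()) {
            body.run();
        }
    }

    public static void main(String[] args) {
        int[] count = {0};  // one-element array as a mutable cell,
                            // since Java lambdas capture only finals
        whileTrue(() -> count[0] < 3, () -> count[0]++);
        System.out.println(count[0]);  // prints 3
    }
}
```

The difference is that Smalltalk gives you this for every control structure, built in, with lighter syntax.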

Java looks more like languages people have used before. This means that a C programmer can start playing around and get a Java program working the first day. Smalltalk has a steeper initial learning curve. Java libraries are designed for multithreading, and distributed programming and the web have been part of Java from the beginning.

The biggest difference is that Java has had fabulous marketing, leading to mindshare. There are more books, more web pages, more consultants who claim to know Java, more CEOs who have heard of Java and are willing to let their employees learn it, more companies starting Java projects, and more companies trying to supply the Java market. Java proves that engineers who want to change the world should understand marketing. -- Ralph Johnson

Doesn't Microsoft prove the same thing? -- Brett Neumeier


Two comments: one quick, one longer. The first is about

Java looks more like languages people have used before.

This is, to me, the biggest advantage Java has over Smalltalk. And it's an absolutely huge advantage. For example, I have gone to talks by Bertrand Meyer. And I have thought "Hmmmm. Interesting." I've even, during the Q&A parts, harangued him on minor points of the Eiffel language. But, despite the fact that I view Eiffel as a well-designed language, I have never actually written code in it. And I probably never will.

I don't ever ever ever want to spend time learning syntax again. Learning syntax is a waste of time, a waste of effort, a waste of valuable brain cells that could otherwise be Watching Gilligans Island. Far too much programmer time is already spent absorbing and reabsorbing the same things over and over and over again.

The dream is, of course, to separate syntax from semantics. Change the programming language semantics, but keep the syntax everyone knows. Which is, roughly, what the Meta Object Protocol does, and what the designers of Java tried to do.

(Side note: I have downloaded VW3.0 [it's free] because the idea of a Refactoring Browser may just be one of those things that will convince me to learn a new syntax.)

The second comment is about late-binding. It's not entirely obvious to me that Java and Smalltalk have the same notion of "late binding."

Question moved to Smalltalk Late Binding.


Another major reason that Java took off (exploded, actually) while Smalltalk never seemed to make much of a ripple: availability of development tools. When I decided to learn Smalltalk ten years ago, there were only two environments available for the Mac (Smalltalk/V from Digi Talk, and Smalltalk 80 from Parc Place). Other platforms had only one option; none were all that cheap for an individual not working on a corporate budget. This was all due to Parc Place's licensing restrictions. Even today, there are not all that many environments available, although I have seen a free version of Smalltalk/V for Windows.

Java, on the other hand, was free. Anyone could download a fully functional compiler and debugger from Javasoft and start coding. There are quite a few decent IDEs priced under $200, and Javasoft has been licensing many, many companies to make Java tools.

This is much the same reason that Eiffel and Objective-C never went anywhere. They were never really given much of a chance in the market because of licensing restrictions.

$200? Kawa, an IDE certainly good enough for learning Java, is shareware, and registration is $50, $25 for students (www.tek-tools.com). What's more, you don't even have to buy any book to get started - several tutorials can be downloaded for free. -- Falk Bruegmann

Will the choice of language for enterprise applications be affected by whether the IDE is $25 or $250? Will the decision which language a student fools around with be affected by how much compiler, IDE and tutorials cost him? Remember, Java probably was downloaded by tens of thousands of individuals before it was used for the first enterprise projects! -- Falk Bruegmann

I'm not sure I see the connection between your first two sentences (which I take as rhetorical) and your last. I do believe that student costs are significant (most students don't have a lot of money to spend on software). -- Russell Gold

Exactly. The ease of availability increases the number of developers who know the language (learning on their own), which increases the market awareness, which increases enterprise adoption. Smalltalk is a hacker's dream language, and had it been free, would probably have taken off. -- Russell Gold


Hmmm, what would it take for Smalltalk to be Internet-enabled? The capability to generate Java bytecode? As this is available now, is there more to it?


The original article that prompted the article mentioned above answers that question.

On the other hand, its author, Jeff Sutherland, apparently a Smalltalk supporter, says (regarding competition from Java): "I estimate that the Smalltalk community has about one year to respond to this problem." And he said this in June 1996!


Here's a cynical action plan for Smalltalkers who want to gain ground:

Write a web-browser in Smalltalk which has a "talklet" tag - which loads and runs classes from a URL connection. (A talklet should have some well-known file extension, say .tlk) Call it "Hot Talk". The browser can double as an IDE for Hot Talk classes.

Write plugins for Netscape and Internet Explorer which will handle .tlk files. The plugins are mostly the Smalltalk virtual machine with glue code. (This idea from Bertrand Meyer's approach to plugging in Eiffel bytecode.)

Investigate some kind of "compatibility bridge" with Java bytecodes. (Could use Fall Back On Reflection.)

(Something would need to be done to approximate the Java Security Manager class - I don't know enough Smalltalk to decide if this could be done effectively.)

As for giving it away, why not use Squeak - it's already out there...


I don't think Smalltalk will ever die, and some of its features live on in Java. The question Smalltalkers seem to want answered is, "Why do C++ and VB (and now Java) dominate, instead of Smalltalk?"

IMHO, Smalltalk had no unsolvable problems, but had many issues that needed to be solved simultaneously:

Free/cheap versions? Digitalk had $100 implementations for years. GNU Smalltalk has been around for a while. Squeak has been available since at least October 1997. IBM's Visual Age has had a "demo" version.

Source control? Built-in, little, or none (not even "backup to floppy" unless you really knew what you were doing); best in the industry available (a bargain for corporations, too expensive for hackers).

Ability to create standalone executables? Available, but only for the expensive commercial implementations.

Lack of a standard? How badly has that hindered Turbo Pascal, and its successor, Delphi?

Hunger for resources? It used to hurt that Smalltalk required four whole megabytes to run well. It's hardly an issue today.

What may have also hurt is Smalltalk's very limited exposure to university students (Ralph Johnson's efforts notwithstanding). AT&T gave C++ to schools for almost nothing (like C and Unix earlier), professors taught classes with it, and graduates tried to apply it to the jobs they found.

I enjoyed Smalltalk the one time my job allowed its use. I doubt I'll ever have another commercial opportunity to work with it. -- Paul Chisholm

The only time I've run across Smalltalk in use in a commercial product setting was at Cybertek (Dallas, TX) who used it for their Life Insurance admin system's GUI.


I am always frustrated by debates about programming languages. Everyone sees their language as ultimately more powerful, eloquent, expressive or generally better than the others.

Is Smalltalk a better language than Java? Is Eiffel the best language around (perhaps we should all be doing DBC)? Is Python better than Perl because it is simpler and easier to grasp? I dunno. Most of the surviving programming languages have features that fit some application domains better than others.

Some languages just feel right. If you grew up loving Pascal/Modula[123]/C/C++/Java, will Smalltalk ever feel right? If you cut your teeth on Smalltalk or Lisp, can you ever get used to C-like languages? If you can speak Perl fluently and eloquently, will (and should) you be won over by Python? And, if Java or C++ gets the job done, and (this is important) solves the user's problem, then isn't it the appropriate language to use?

With every product/project I approach, I look into my toolbox of languages (C, C++, Perl, Python, Java and perhaps one day Eiffel or Squeak) and see which one fits best. Once I choose that language, I become fanatical about it until the next project. Then again, maybe I am just a bit weird.

While I agree that having a full barnyard of languages is important to ensure that a wide variety of tools are available, it is not sufficient that any given language merely "solves the user's problem." Really, the goal is to maximize the user's return on investment, whatever that investment may be.

It is not as efficient to write a word processor in assembler/C as it is to write it in Smalltalk, and it is not as efficient to write a blit operation in Smalltalk as it is in assembler/C, although you could do either task in both.

Some languages are better suited to a large set of problem classes than others; these are generic programming languages like C, C++, Java, Lisp and Smalltalk. Other languages are better at specific problems like Perl for text parsing (although Perl is becoming more and more generic every day) or Java Script for DHTML (although I've managed to hack Java Script into a real object-oriented language, it just isn't useful).

An old-timer programmer once said to me that he doesn't have the time or patience to learn new languages every week. My reaction was that, fine, don't learn the language, but if I do I can provide a higher return on investment than he could. Thus, I'll stand a better chance of getting the job. Never toss out a tool. -- Sunir Shah


"Why would anyone even use Eiffel or Smalltalk? Interesting theoretical languages, but practical in real world applications?"

"I have been in the Software industry since 1985 and have never encountered a commercial use of Smalltalk by anybody until coming here and hearing about the Chrysler project."

Smalltalk may have originated in a research lab as a theoretical approach to teaching about programming, but by the time Xerox spun off Parc Place in 1987, Smalltalk was already in use in "real world applications". One of the earliest ones with which I'm personally familiar was The Analyst from Xerox Special Information Systems. This product - something like Microsoft Office and the Windows desktop, with hypertext capabilities, implemented in Smalltalk-80 way back in 1987 - was in fairly wide use within certain US Government circles in the late 80s and early 90s. Texas Instruments' Control Works debuted in Knowledge Systems Corporation's booth at OOPSLA/ECOOP '90 in Ottawa, Canada. Perhaps Ward Cunningham could provide insight into Wall Street's use of Smalltalk (surely there must be a Wiki Page On Smalltalk Projects - each Gemstone Project is probably also a Smalltalk project).

At Martin Marietta in the early 1990s we built a discrete-event-simulation-based process analysis tool (before Business Process Reengineering and Work Flow were even buzzwords) with Smalltalk, Gem Stone, The Analyst, and some class libraries from Knowledge Systems Corporation. We used this tool to study launch vehicle manufacturing and launch operations processes for the Titan IV launch vehicle (as well as other processes), always trying to find ways to increase throughput and reduce cost or time-in-process per unit. This work is described in the proceedings of the Society for Computer Simulation's 1992 Object-Oriented Simulation Conference.

From interviews, brief engagements, and knowledge of vendors' customers I'm aware of other industry uses (chemical, manufacturing, finance, transportation, telecommunications, utilities, pharmaceuticals), but I'll stick to uses that I can describe best from personal experience.

From personal experience again, I know that a very large western US cellular company deployed a customer service application with a Smalltalk user interface into multiple call centers housing a couple hundred customer service representatives (I'm also aware that a very large, market-leading package-shipping company was working on a customer service app in Smalltalk, but don't know the details). And I've worked with an east-coast RBOC that uses Smalltalk in a billing application.

I'm also familiar with four commercial CASE tools and one product configurator written in Smalltalk. OOATool from Coad International, circa 1991 (contemporary with Peter Coad's Object-Oriented Analysis book), was written in SmalltalkV from Digi Talk. Ascent Logic Corporation, founded in 1987 and still in business today (www.alc.com), made its business on a Smalltalk-80 implementation of an upper-CASE / systems engineering tool called RDD-100, purchased by large government and commercial projects in the US and abroad. Ascent's competition, Vi Tech Corporation, has a similar tool also written in Smalltalk and Gem Stone. And I think there was a company vending a requirements management tool with a SmalltalkV user interface that was acquired by Rational. The product configurator, Classys, was sold by a company called Antalys that sold itself to Baan.

IMHO these examples are typical of the types of applications at which Smalltalk excels: i) multi-tier business applications (customer service, billing) in which Smalltalk is used at minimum to implement a modern GUI; and ii) essentially shrink-wrap applications like Analyst, RDD-100, and Classys - although of tremendous complexity in the domain model or the UI (diagram editors, etc.) - that need to run on multiple platforms to maximize market penetration.

Both types of application development effort benefit from an integrated development environment unmatched by anything in Java space, and from unparalleled application evolvability due not only to the development environment but also due precisely to Smalltalk's approach to typing and binding, and syntax (specifically in the exception-handling area). For an experienced Smalltalker these benefits translate to a productivity level that to date has been unattainable in Java. FWIW, I've been programming in Smalltalk since 1988 and in Java since 1997, and have worked on multi-tier applications, and applications with Web browser user interfaces, in both technologies.

With respect to the latter type of application, everyone is always making such a big deal about Java being platform-independent. So what? Parc Place Smalltalk was cross-platform across Windows, MacOS, and several flavors of Unix for years and years. How much of Java's market penetration is due to platforms other than these?

"Why did someone have to invent a Java?"

I hope nobody is suggesting that Java was invented specifically because "Smalltalk didn't ever take off in the mainstream of OO languages." My understanding is that the original Oak language was invented to run microprocessors in toasters, etc. - not Smalltalk's target market. IMHO Java's success is a function of Sun's marketing (as Ralph Johnson observed) as much as any inherent technological superiority, and I concur with the above assertion that the market they reached was the vast market of folks who wouldn't have much cognitive dissonance with Java's syntax.

I also concur with the assertions that as a presentation-layer technology, Smalltalk lost market share because of its price differences, footprint differences, and publicity differences in comparison with the competing technologies: yes, Visual Basic, but also HTML and Java. However I'm not sure I'd agree with the assertion that Java is any easier to learn or use than Smalltalk. The ease-of-learning challenge in both cases comes more from gaining familiarity with class libraries, and good object-oriented design, than from gaining familiarity with syntax (and IMHO Smalltalk's is simpler and more uniform). The ease-of-use challenge in both cases comes from development environment support - and IMHO Java generally trails Smalltalk in this area, with the possible exception of Visual Age for Java, which has its own host of issues (isn't VAJ actually written in Smalltalk, forming another "commercial use"?)

Late in the game the Smalltalk vendors began emphasizing Smalltalk's viability as a server-side technology (products like Visual Wave come to mind), but Java has much more standardization to offer in application server space than the Smalltalk community ever did, with things like the Servlet spec, the EJB component model, and a more viable selection of ORB vendors. I don't think the absence of a Smalltalk VM plug-in for Web browsers is a major factor - how many serious, "real-world" applications have sufficiently overcome the inherent issues to use applets for the presentation layer?

The marginalization of Smalltalk in the mainstream market has been a gloomy thing to watch for old Smalltalkers, given everything Smalltalk has contributed to the world (go rent Pirates Of Silicon Valley sometime). Market dynamics caused the switch of horses several years ago. But there are worse alternatives than Java. Java is enough like Smalltalk that much of the design experience transfers, and I actually like the Separation Of Interface From Implementation. Java promises a level of standardization unachieved by the dialect-fractured Smalltalk community which, coupled with the ongoing amount of Java development and energy around Java, could help enable the Component Based Development approach described nicely by Clemens Szyperski in Component Software. My main gripe is the difference in productivity level.

And I don't think this discussion should be cast as a frustrating or regretted Language Pissing Match - what we're really chewing on here are the forces that drive the dynamics in the market of software development technology.


The comments people are making about Smalltalk's failure to compete effectively with Java remind me of a 1993 essay on Lisp's failure to compete with C.

If you don't have time to read this whole essay, at least check out the "Worse Is Better" section: www.ai.mit.edu


While I see a number of excellent arguments, I think there is a Human Factors angle that I haven't yet seen addressed.

Background: Almost four years ago, after programming in C and a little C++ for 10 years, I ran into a Smalltalker who, upset about the client's mandate to use Java instead of Smalltalk, decided to port the Visual Works framework to Java. So while I haven't written a line of Smalltalk, I have a little familiarity with how the framework was built (without the benefit of Smalltalk tools! Arg!)

I learned a lot, especially about OO, but about eight months later I ran off screaming and hollering. Why?

It was the first time I'd ever seen tons of little methods, each one to two lines long. Smalltalkers love to maximize reuse and flexibility, which is great, but at what price? People with good memories might not think twice about this, but I have a horrendous short term memory, and I found myself spending most of my time spinning my wheels searching for the right method, instead of getting work done. Even with tools, I suspect there would be a daunting learning curve.

But back to the point I'm trying to make: I suspect becoming familiar and productive within a Smalltalk environment requires not only time and effort, but also an innate intellectual capability that developers across the board may or may not have. This means you bifurcate your development community into those who can and those who can't, and as soon as you do that you're dead, you'll never hit critical mass because there will always be someone left out in the cold. And technologies that don't hit critical mass are forever doomed to defend their turf against those that do.


Thanks for an interesting new perspective, Eileen. But what implications might this have for refactoring? Does this mean that programs with longer (therefore fewer) methods are better, because it is easier to remember where their logic is? How to define better?

"you'll never hit critical mass because there will always be someone left out in the cold"

To achieve critical mass, IMHO, you don't have to avoid leaving someone out in the cold - you just have to attract *enough* people into the warmth. I don't think that Smalltalk failed to achieve critical mass because Smalltalkers like Composed Method. I think it failed to achieve critical mass for other reasons having to do with business decisions and market dynamics.


Hi, Randy, sorry it took a while; I was trying to keep this short but it's just not working.

"But what implications might this have for refactoring?"

Fear not, I'm not one of those Blob goddesses. :) I love refactoring, and I laughed when Martin Fowler's book came out because I'd already been doing a lot of it on other people's code, often from say 800 lines down to 150 or 100 lines. It's the only method I know of to help non-OO people see the light. I also agree with much of what I read in Composed Method; I think my methods tend to run 5-40 lines, with a healthy number of one-liners nowadays (it took a couple of years). I cringe when I see long methods nested 7 levels deep.

When I think back and try to pinpoint what caused me to feel overwhelmed at the time, I suspect it was the layers upon layers of delegation. I remember at one point griping: "Pass the buck, pass the buck, pass the buck... Who's doing the work???" And again, while it may not bother a lot of people, I found it distressing because, having a bad short term memory, I'd finally get to where something was being done and by then I'd totally forgotten what I was trying to do!

"How to define better?"

We all know that Blob methods don't work. Unmaintainable; no flexibility or reuse. At the other extreme, I don't think tons of one-line methods with layers upon layers of delegation work either. Loss of comprehension, prohibitive learning curve for a non-trivial number of developers (unfortunately, Frank Sauer disagrees. He's also the Smalltalker who sent several of us screaming and hollering the first time around). Between those two is a bewildering spectrum of options, complicated by the fact that we all have our personal preferences and can't always distinguish between a right/wrong argument and an opinion. :) So what is better? I'll recap what I think is fairly accepted at this point:

Decouple logically unrelated functionality.

Design and refactor such that when you have to change something, you have to change it only in one place.

Try to avoid more than about 7-8 levels of indirection.

I read somewhere that people can generally handle a train of thought that includes up to 7 or 8 items, but beyond that they start dropping things. You can nest more, but you do so at the risk of losing comprehension.
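To make the indirection point concrete, here is a small hypothetical Python sketch (the class names are mine, purely illustrative): each layer merely forwards the call, so a reader has to chase several hops before finding where anything actually happens. Three hops is already annoying; seven or eight is where most readers start dropping the thread.

```python
# "Pass the buck" style: layers that delegate without adding behavior.
class ReportFacade:
    def __init__(self, service):
        self.service = service

    def total(self, items):
        return self.service.total(items)        # hop 1: pass the buck

class ReportService:
    def __init__(self, calculator):
        self.calculator = calculator

    def total(self, items):
        return self.calculator.total(items)     # hop 2: pass the buck

class Calculator:
    def total(self, items):
        return sum(items)                       # hop 3: the work, finally

facade = ReportFacade(ReportService(Calculator()))
print(facade.total([1, 2, 3]))                  # the reader traverses 3 classes for a sum()
```

Each layer here may be justified individually (a facade for the API, a service for policy), but stacked past the 7-8 level mark, the chain itself becomes the comprehension problem the guideline warns about.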

"you just have to attract *enough* people into the warmth."

I fully agree. But what is *enough*? I think that's where Smalltalkers like Frank may have a problem. They're inherently an intelligent bunch with good short term memories and an ability to think outside of the box. They don't always see the limitations others have in getting to where they are. That's what happened on this particular project. What I saw was phenomenal - a small island of sanity in a room full of people running around like chickens with their heads cut off. Management would run up to Frank and say "Help! Another change in the requirements! How long is it going to take?" and Frank would say anywhere from 5 minutes to two days, and 90% of the time it was 5 minutes. He spoiled the client, and the system was successfully deployed on thousands of desktops. But, I also remember a time when Frank was scratching his head because a method was firing 10 times instead of once. When he finally traced a convoluted path and realized his events were propagating back, I realized, this is great but, this client will have to keep him on call because there's always a chance that some bug will crop up that only he can fix. And sure enough, two years later, the client decided to switch to VB, because even with Frank training the client's IT staff, I don't think he was able to get them to a point where they could maintain it. So was the project a success or failure? It opened my eyes to better programming practices, so in that respect it was a success because he managed to get one convert. As to the others who left - you're not solving the problem of getting more non-OO people on the bandwagon, you're merely shuffling the problem around, and sooner or later you're refactoring their code again. The operation was successful but the patient died.

Let's change gears and look at cars as an example. What makes a successful car? One that accommodates a variety of people. That's difficult. You have people who are tall, short, fat, skinny, wear contacts, are handicapped, have long legs, short arms, short legs, long arms, etc. If you don't wear contacts, you wouldn't be sensitive to the fact that badly designed airflow systems will dry contact wearers' eyes out more quickly. But a good car manufacturer will make sure it's test driven by as broad a spectrum of people as possible. Similarly, if you want to turn this into a Smalltalk world, you have to be sensitive to where it operates counter to natural human nature, and account for it. But Smalltalk by its very nature has been used by a fairly small and homogeneous group of people, so they may not be as aware of human factor issues as, for example, I am. Frank gripes that this is the "world according to Eileen". I suspect if you talk to a lot of people who tried to learn Smalltalk and gave up in frustration, you may hear a similar theme.

"I think it failed to achieve critical mass for other reasons having to do with business decisions and market dynamics."

From what I understand of Smalltalk's history, I agree that the reasons you mention are the main reasons why Smalltalk didn't achieve critical mass. As well as the fact that it was well before its time. BUT, as with all things in our wonderfully complex world, when events occur and we ask why, we often can't pinpoint it to a single or even a couple of reasons, but rather a sometimes large number of different reasons with complex interactions. And while I agreed with much that was said thus far, I feel that even in a perfect world - if Smalltalk had been taught in our universities, marketed differently, etc., there is a chance this still wouldn't be a Smalltalk world today. Why?

Back to your statement "you just have to attract *enough* people into the warmth." Let's turn that around and ask what happens when you attract "not enough" people into the warmth. That means a non-trivial number of people are left out, and I don't just mean cold numbers and percentages. I'm talking about the increasing risk that one of those people is going to say "hmph! What is this nonsense? I don't understand it!" and he goes out and writes something like Java.

Just one last thing before this poor page explodes. Rational and Together/J allow you to take code and reverse engineer it into a model. I wish sometimes that there were a way to take a fully refactored (and sometimes hard to comprehend) OO system and de-factor it into a more procedural style in some browser so that you could more easily see what's going on. Refactoring is wonderful but I feel it solves only half the problem. That's for another day.


When I think back and try to pinpoint what caused me to feel overwhelmed at the time, I suspect it was the layers upon layers of delegation. I remember at one point griping: "Pass the buck, pass the buck, pass the buck... Who's doing the work???"

That's exactly what some of my coworkers said about Java/Swing and some of my code. The cure is, IMHO, to accept that you don't need to know. You need to rely on the layers. And of course this has to be possible. If the abstraction does break down there's only one tool to help you find out what's really happening: a decent debugger.

I've found that when I refactor coworkers' code, much of the "work" ends up being done by collections ... hash lookups instead of switch statements, iteration over collections instead of gobs of ifs, etc.
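A minimal Python sketch of that kind of refactoring (the shape/area example is mine, not from the original discussion): a chain of ifs selecting behavior by name collapses into a dictionary lookup, and the collection does the dispatching.

```python
# Before: a gob of ifs selecting a formula by name.
def area_before(shape, x):
    if shape == "square":
        return x * x
    elif shape == "circle":
        return 3.14159 * x * x
    elif shape == "triangle":
        return x * x / 2
    else:
        raise ValueError(shape)

# After: a hash lookup does the work; adding a shape is one entry, not one branch.
AREA = {
    "square":   lambda x: x * x,
    "circle":   lambda x: 3.14159 * x * x,
    "triangle": lambda x: x * x / 2,
}

def area_after(shape, x):
    try:
        return AREA[shape](x)
    except KeyError:
        raise ValueError(shape)
```

The two versions behave identically; the difference is that the table version separates the dispatch mechanism from the data, which is exactly the "decouple logically unrelated functionality" guideline above in miniature.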

Then there is the guy I share a cubicle with. He learned RPG back in 1968 and has been using it almost exclusively ever since.


I guess I've never really understood why people get so passionate about discussing the syntax idiosyncrasies of languages and seem so uninterested in program structure. It's like arguing whether English, Russian, or Japanese is best suited for writing novels. I tend to find good programming relies a lot more on good structure than picking the right language. -- Wayne Mack

I think that is because, in small scale, choice of language has an immediate impact on how long it takes to solve certain kinds of tasks. Consider Ward's example for the Linear Shuffle in Icon Language:

every !deck :=: ?deck

...and contrast to the complexity of the same task expressed in most other languages.

But it's not an implementation of a linear shuffle, as is clearly explained on the Linear Shuffle page. It's beautiful, but it fails to meet the task!
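For contrast, here is a correct linear shuffle (Fisher-Yates) sketched in Python; this is my illustration, not code from the Linear Shuffle page. The key difference from the Icon one-liner is that each position is swapped with a position drawn from the *remaining* slots, not from the whole deck, which is what makes the result uniformly random.

```python
import random

def linear_shuffle(deck):
    """Fisher-Yates: one linear pass, uniform over all permutations."""
    deck = list(deck)
    for i in range(len(deck) - 1):
        # j is drawn from [i, n), not [0, n) -- swapping with the whole
        # deck (as the Icon one-liner does) biases the result.
        j = random.randrange(i, len(deck))
        deck[i], deck[j] = deck[j], deck[i]
    return deck
```

Still only a handful of lines, but noticeably less poetic than `every !deck :=: ?deck`, which is rather the point of the comparison.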

However, we must add two important observations: first, what holds true at small scales should not be extrapolated to larger scales. Thus, language choice might be a less important factor in most programming projects, because as overall project size increases, the effects of architecture become more important relative to the effects of language. Second, it's easy to overlook this argument of scale, and to mistakenly extrapolate the importance of language choice in the small to larger scales; because we tend to think of language choice as important, and thus relevant to doing our jobs well, we tend to have highly polarized opinions as to the 'intrinsic value of language X', when most languages will let you do most everything mostly well.

And despite the changes in English over the decades, with more and more concepts being encapsulated in smaller and smaller terms, novels are still about the same size they were a century and a half ago. The scale of a novel is qualitatively different from the scale of the language it's written in.

The language question is still important in the large scale, because some languages don't scale well.

That depends on how you scale it. Some languages might not be very good at making single giant EXE's, but may work just fine if split up into events and tasks under an Event Driven Programming framework. Related: System Size Metrics


There is intrinsic value in being able to write

 fun = anInstance.method   # python
 fun(3, "guido")

rather than

 import java.lang.reflect.*;
 Class aClass = anInstance.getClass();
 Method fun = aClass.getMethod("method", new Class[] {Integer.class, String.class});   // arf
 fun.invoke(anInstance, new Object[] {new Integer(3), "billjoy"});


Carson Reynolds lists sites for further comparisons at lambda-the-ultimate.org



See original on c2.com