The Art of Interactive Design by Chris Crawford. ISBN:1886411840
Interestingly enough... I was at Powell's Books the other day and picked up a copy of Chris Crawford's The Art of Interactive Design. I'm only through the first couple of chapters so far, but much of what Crawford has to say reflects some of your statements on the topic. (The book contains some interesting anachronisms, especially since its publication date is 2003--it's apparent that parts of it were written several years before that.) The book starts off, for example, by stating that usability engineers ought to throw it as far as they can, lest they be thoroughly annoyed by the heresy (to their field) that is to follow...
It's damned good for interaction designers.
Finally, a recommendation: since my little collection doesn't contain gems, I've been asking around for one for a few years, and getting only suggestions for things that are literally Off Topic but indirectly On Topic if you squint hard enough, plus comments like yours about About Face as the most directly On Topic book. A few minutes after you said the above on Sunday, I ordered it from Amazon on the strength of your recommendation; thanks. (P.S. Some random stuff in SIGCHI has been good, but I've never seen the cream skimmed and collected, although some of it is referenced here and there by people like Jef Raskin.) -- Doug Merritt
Some errors in the book:
colour depth isn't what's important; it's colour contrast. Colour contrast is a mix of colour depth and monitor contrast ratios (see the sketch after this list for one way to put a number on it). LCD contrast ratios absolutely suck, CRTs are much better, and the best ratios are only available in plasma screens.
contrary to what's implied, there does in fact exist an input device beyond the keyboard and mouse: the voice recognition card / microphone
monitors can be arrayed cheaply: just buy 3 of them and put them side by side
the keyboard CANNOT be used arbitrarily for non-text input; you can only use Mouse Keys, and only for overriding ergonomic reasons
It's actually quite annoying how the mouse, a continuous input device, is broken down into discrete units in order to force a comparison with the keyboard. The essential difference between the two is that one is continuous where the other is discrete; they are incomparable.
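Picking up the colour-contrast point from the first item in that list: one common way to put a number on it is the relative-luminance ratio used by WCAG. The sketch below is not from the book; it computes that ratio for two sRGB pixel values and deliberately ignores the panel's own black level, which is where the LCD-versus-CRT difference actually lives.

  # Not from the book: the WCAG contrast ratio between two sRGB colours.
  # This captures the contrast carried by the pixel values alone; the panel's
  # own contrast ratio (black level vs white level) then scales what the eye
  # actually receives.
  def _linear(channel):
      c = channel / 255.0
      return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

  def relative_luminance(rgb):
      r, g, b = (_linear(c) for c in rgb)
      return 0.2126 * r + 0.7152 * g + 0.0722 * b

  def contrast_ratio(fg, bg):
      lighter = max(relative_luminance(fg), relative_luminance(bg))
      darker = min(relative_luminance(fg), relative_luminance(bg))
      return (lighter + 0.05) / (darker + 0.05)

  print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))        # 21.0 -- black on white
  print(round(contrast_ratio((119, 119, 119), (136, 136, 136)), 1))  # two low-contrast greys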
I don't get the continuous-versus-discrete point. Surely any analogue device must be sampled to be used as input to a digital computer? Therefore, the mouse must already be in discrete units.
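For what it's worth, that is exactly what the hardware does: it reports motion as integer counts at some polling rate, so the "continuous" device is already discrete by the time software sees it. The sketch below is only an illustration; the polling rate, resolution, and motion curve are all made-up numbers.

  # Illustration only: a smooth hand motion, sampled at an assumed polling
  # rate and quantised to integer counts -- which is all the computer ever sees.
  import math

  POLL_HZ = 125          # assumed USB polling rate
  COUNTS_PER_UNIT = 400  # assumed sensor resolution

  def hand_position(t):
      # A pretend-continuous horizontal position at time t, in arbitrary units.
      return 5.0 * math.sin(t)

  def sampled_deltas(duration=0.1):
      prev, deltas = hand_position(0.0), []
      for i in range(1, int(duration * POLL_HZ) + 1):
          cur = hand_position(i / POLL_HZ)
          deltas.append(round((cur - prev) * COUNTS_PER_UNIT))  # integer report
          prev = cur
      return deltas

  print(sampled_deltas())   # a short list of plain integers, nothing continuous left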
Regarding monitor arrays, is it possible they meant "seamlessly arrayed edge to edge"?
Unfortunately not, since he proceeds to describe how tooltips, balloon help and context menus add pixels to the screen. IOW, his use of pixels and screen size is loose enough that it should allow setting monitors side by side, but he overlooks that possibility because it's not commonly done. The same goes for voice recognition: it's not common enough to merit his recognition.
(And actually, Chris Crawford uses multiple monitors to increase screen real estate, so ...)
It's either that or the book is really old, because despite a few references to Java, his outlook is very procedural. Also, in the Thinking chapter, he starts by making the asinine observation that all computer thinking reduces down to basic logic operations, but neglects to mention that neural networks reduce down to excitations and inhibitions. That's a pattern that recurs in Chris Crawford's writing: a really solid understanding of the practical coupled with a weak understanding of the theoretical. Or a better way to put it might be that he understands everything about the hardcore CS aspects and is very weak when he tries to relate it to the world in general.
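To make that aside concrete: a neural unit's primitive operation is a weighted sum of excitatory and inhibitory inputs compared against a threshold, not a Boolean gate. The toy neuron below is a generic illustration of that point, nothing from the book; the weights and threshold are made up.

  # A single artificial neuron: positive weights excite, negative weights
  # inhibit, and the output is whether the net stimulation clears a threshold.
  # The primitive here is weighted summation, not a logic operation.
  def neuron(inputs, weights, threshold):
      net = sum(x * w for x, w in zip(inputs, weights))
      return 1 if net >= threshold else 0

  # Made-up example: two excitatory inputs and one inhibitory input.
  print(neuron([1, 1, 0], [0.6, 0.5, -1.0], threshold=1.0))  # fires: 1
  print(neuron([1, 1, 1], [0.6, 0.5, -1.0], threshold=1.0))  # inhibited: 0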
One criticism I have of the book is that it is frequently completely unscientific. This is a criticism that Crawford would likely strongly reject, as he makes it clear that he considers interactive design to be art and not science (and that it should always remain so)--a dichotomy which I think is unfortunate when used in a prescriptive rather than descriptive fashion. The book is full of personal anecdotes, rambling analogies (though amusing ones) and such, which Crawford employs in an attempt to prove his points. Except that this sort of evidence doesn't prove anything--it serves to highlight his opinions, but doesn't help to elevate his claims above opinion.
The other criticism that I have--though I suspect RK agrees with Crawford--is his occasional habit of accusing the programming community--as a whole--of what I would consider to be misconduct. He has lots of advice for budding Interaction Designers on how to detect the "lies" that programmers (allegedly) tell when they don't wish (usually on account of "laziness") to implement the interaction designer's designs (this is a topic that RK has discussed a bit here). In my experience, programmers like interesting challenges and are more than happy to take on complex projects; but oftentimes the complex design (even if correct from an interaction design point of view) takes longer to code--and informing the project stakeholders of that is our professional duty. Where the problem occurs--and where programmers do often chafe against their employers--is when the PHB (be he an interaction designer or a garden-variety manager) wants the more complex design but doesn't want to lengthen the schedule, add staff, or remove other capabilities. Instead, the expectation seems to be that the programmers should simply work smarter or work harder--and that refusing to accept additional burdens without any additional consideration is laziness.
Even the word processor example he gave in that chapter demonstrates this pattern. It's a particularly noxious example of design which Alan Cooper would never have stood for, ironically enough. What's particularly disgusting about it is that word processors epitomize two things which Crawford rightfully condemns. First, word processors are built around a static object (a "document" with no history) and not an interactive process (a history of document editions and changes). Second, word processors have been transposed wholesale from other media (typewriters, paper), which he condemns wholesale in the case of game design. The fundamental unit of text isn't the line, it's the paragraph; only paper imposes lines and pages. See Referential Editor for more. But to get back to the point, Crawford doesn't seem to see the correspondence between word processors and paper because, I speculate, he doesn't understand the subject of paper. This is the generous explanation since the alternative is that he doesn't understand electronic media.
I get the feeling that Crawford isn't just a grey-haired consultant, that he does have a deep understanding of the subject matter, but that he's been overly focused and it's resulted in his becoming isolated from other subjects. -- RK
Crawford's been working on this thing called the Erasmatron for years--and he's been doing so out somewhere in the mountains of eastern Oregon. Apparently, he has some rather tenuous relations with the rest of the interactive fiction community. Which isn't necessarily a bad thing--but I think you may be correct that he is a bit isolated from other professionals in his field.
I meant something other than that, though he probably has isolated himself from OOP. I meant that he's isolated himself from the subject matter of other fields. Take for instance his explanation of the mouse as a pointer. That ties the mouse to language, or at least a part of language. Voice recognition can be used for issuing commands so with voice + mouse, you've got a whole language. One wonders where the keyboard fits in, but that's relatively unimportant. What's important is that language is a very popular topic in CS, so much so that we can consider it hardcore CS. Now compare this with my own analysis of the mouse vs keyboard as continuous vs discrete. That's from math. Apparently, math is far enough out from mainstream CS, and Crawford's area of interest, that he doesn't reach for it intuitively.
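To make the "voice + mouse = whole language" point concrete, here's a purely hypothetical sketch (nothing in it comes from Crawford): the voice channel supplies verbs, the mouse channel resolves the pronoun, and only together do they form a complete sentence.

  # Hypothetical sketch: a spoken verb plus a pointed-at object makes a sentence.
  # The recogniser and hit-testing are stubbed out; only the grammar matters.
  def interpret(spoken, object_under_cursor):
      words = spoken.lower().split()
      # Bind the deictic word to whatever the pointer is currently over.
      bound = [object_under_cursor if w == "this" else w for w in words]
      return " ".join(bound)

  # Speaking "delete this" while pointing at a file:
  print(interpret("delete this", "chapter3.txt"))   # -> delete chapter3.txt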
In the chapter on Listening, Crawford calls the mouse a pointing device. Well, everyone does this, but he goes on to explain what it means. In natural language, there are many words (e.g., pronouns) used to point to things. So in this very sentence, which follows the previous sentence, I am saying that. In contrast, the keyboard is used to describe things. This is a very insightful observation and I'm still trying to figure out its implications.
One of the differences between pointing using language and pointing using a mouse is that the former has a much larger vocabulary for pointing. Mice, and other members of the rodent family, typically have only two words for pointing: this (select) and those (drag-select). One wonders whether there are other words that would be useful, especially since those can be reduced to 'from this to this' and so isn't a meaningfully distinct word. As it turns out, you can augment the mouse's vocabulary by splitting this into (look) this way and (touch) this thing. I invented this mouse quasimode on the basis that it was associated with the Hand; now it looks like I have a theoretical justification for it. Ironically, what I really wanted as a mouse quasimode (help inquiry showing possible actions on an object) can't be justified this way.
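A rough sketch of that vocabulary as an event handler follows; the drag threshold, the function name, and the use of a held modifier key to stand in for the quasimode are all assumptions of mine for illustration, not a description of any actual implementation.

  # Hypothetical handler distinguishing the mouse's "words":
  #   "this"          -- plain click (select)
  #   "those"         -- drag, i.e. 'from this to this' (drag-select)
  #   "look this way" -- the quasimode: the same gesture with a modifier held
  DRAG_THRESHOLD = 4   # pixels; assumed value

  def classify_gesture(press, release, quasimode_held=False):
      dx = release[0] - press[0]
      dy = release[1] - press[1]
      dragged = dx * dx + dy * dy > DRAG_THRESHOLD ** 2
      if quasimode_held:
          return "look this way" if dragged else "touch this thing"
      return "those (from this to this)" if dragged else "this (select)"

  print(classify_gesture((10, 10), (11, 10)))                       # this (select)
  print(classify_gesture((10, 10), (80, 40)))                       # those (from this to this)
  print(classify_gesture((10, 10), (80, 40), quasimode_held=True))  # look this way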
See original on c2.com