Brian Marick's software-design evolution, inspired by ecological and embodied cognition. post github
Our first design principle is “Favor direct control links from perception to action.” The more usual jargon is “direct perception”, but I (Brian Marick) didn't find that helpful as I tried to understand. “Direct control link” is from Ron McClamrock, who describes how flies launch themselves into the air thusly: […] podcast (7:28)
Here are the guidelines or principles or heuristics I (Brian Marick) will be using for early prototypes.
1. The app and the user (hereafter: “Brian”) are considered two independent (asynchronous) animals interacting via an Environment.
Figure 4. The system under consideration in developmental studies; no part can be considered independent of any other part. (Evidencing New Psych Forms)
For the sample app, the material environment is a Document (in the broad sense).
two-sides
Idea that the "physical" and the "mental" are two sides of a single reality.
“Two-sides” is a terminus technicus of sociological systems theory, coined by Luhmann.
Someone wrote in Two sides of wiki about the seemingly opposite purpose of Wikipedia and Fedwiki: …
~
2. The “app-animal” is divided into three systems. The perceptual system observes the environment, looking for new affordances. When one is seen, control is handed to the “control” system, which – typically – instructs the motor system, which changes the environment. This sequence is a direct control link, which is what I’ll be focusing on in early prototypes.
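One pass of that direct control link can be sketched in ordinary Python (the prototypes themselves use Erlang; every name below – `perceive`, `control`, `motor_trim`, the toy paragraph dictionary – is hypothetical, not from Marick's code):

```python
# A minimal sketch of principle 2: perceive -> control -> motor, one pass.
# The "environment" is a toy dict of numbered paragraphs.

def perceive(env):
    """Perceptual system: look for an affordance (here, trailing whitespace)."""
    for pid, text in env.items():
        if text != text.rstrip():
            return ("trailing-whitespace", pid)
    return None

def control(affordance, env):
    """Control system: react to the affordance by instructing the motor system."""
    kind, pid = affordance
    if kind == "trailing-whitespace":
        motor_trim(env, pid)

def motor_trim(env, pid):
    """Motor system: change the environment."""
    env[pid] = env[pid].rstrip()

env = {1: "A paragraph.   ", 2: "Another."}
affordance = perceive(env)
if affordance:
    control(affordance, env)
```

The point of the sketch is the one-way flow: perception hands off to control, control hands off to motor, and nothing flows back except through the changed environment.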
The conditioned ›triproduction‹ of the meaning system (Luhmann).
~
3. All modules are structured as a soup of actors that, ideally, communicate asynchronously. (Exceptions will come down to human weakness in managing complexity.) Prototypes will use Erlang processes. The app proper will use Swift actors, but they’ll be used as if they were as lightweight as Erlang processes. “Processes” is the term I’ll use going forward.
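A rough Python stand-in for the "soup of actors" idea, with a thread and a queue playing the role of an Erlang process and its mailbox (the `Process` class and its message shapes are my own illustration, not the prototype's API):

```python
# Sketch of principle 3: processes that communicate only by asynchronous
# message sends. A thread + queue approximates an Erlang process + mailbox.
import queue
import threading

class Process:
    def __init__(self, handler):
        self.mailbox = queue.Queue()
        self.handler = handler
        self.thread = threading.Thread(target=self._loop, daemon=True)
        self.thread.start()

    def send(self, message):
        """Asynchronous: the sender never blocks waiting for a reply."""
        self.mailbox.put(message)

    def _loop(self):
        while True:
            message = self.mailbox.get()
            if message == "stop":
                return
            self.handler(message)

received = []
watcher = Process(received.append)
watcher.send({"saw": "paragraph 1 changed"})
watcher.send("stop")
watcher.thread.join(timeout=1)
```

Real Erlang processes are far cheaper than OS threads, which is exactly why the principle says Swift actors will be used "as if they were as lightweight as Erlang processes".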
4. Perceptual processes will be indexical; they will be looking at something. Like, for example, a Paragraph. I will tend to use spatial metaphors for the objects of attention. For example, I think of a script as a series of paragraphs or groups of paragraphs, laid out in a linear fashion.
5. Perceptual processes are created by control processes that are, metaphorically, saying “I want you to pay attention to that and look for your affordance there.” When the affordance is seen, it will start a control process specific to that affordance and then exit. (The app-animal may want to watch for repeated affordances, but it will do that by having a control module recreate an indexical perceptual process.)
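Principles 4 and 5 together can be sketched as a one-shot, indexical watcher: it is bound to one particular paragraph, and when its affordance appears it fires once and exits (all names here are hypothetical; I'm using a closure where the prototype would spawn a process):

```python
# Sketch of principles 4-5: an indexical, one-shot perceptual process.
# make_watcher returns a watcher bound to a single paragraph; on seeing
# its affordance it starts the affordance-specific reaction and exits.

def make_watcher(paragraph_id, start_control):
    def watch(env):
        text = env[paragraph_id]
        if text.endswith("?"):                 # the affordance it looks for
            start_control("question", paragraph_id)
            return "exited"                    # one-shot: done after firing
        return "still-watching"
    return watch

fired = []
watcher = make_watcher(2, lambda kind, pid: fired.append((kind, pid)))

env = {1: "Plain text.", 2: "Plain text."}
status_before = watcher(env)       # no affordance yet
env[2] = "Is this a question?"
status_after = watcher(env)        # affordance seen: fire and exit
```

Watching for a *repeated* affordance would then mean the control side recreating a fresh watcher, exactly as the parenthetical above describes.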
6. Perceptual processes may maintain state when an affordance requires observing a sequence of changes in the environment. As with all processes, the state will be minimal.
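When the affordance is a *sequence* of changes, the watcher needs some state, but only the minimum. A sketch (hypothetical names; "three consecutive edits to one paragraph" is an invented example affordance):

```python
# Sketch of principle 6: a perceptual process observing a sequence of
# changes keeps minimal state -- here, just the last text seen and a count.

def make_sequence_watcher(paragraph_id, needed=3):
    state = {"count": 0, "last": None}
    def observe(env):
        text = env[paragraph_id]
        if text != state["last"]:
            state["last"] = text
            state["count"] += 1
        return state["count"] >= needed    # affordance: a burst of edits
    return observe

watcher = make_sequence_watcher(1, needed=3)
env = {1: "a"}
r1 = watcher(env)      # first sighting counts as change 1
env[1] = "ab"
r2 = watcher(env)
env[1] = "abc"
r3 = watcher(env)      # third distinct text: affordance seen
```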
7. Perceptual processes will “get bored” over time and go away. I will prefer that to explicitly shutting them down. (There will be a layering of perceptual processes that will enable both this and indexicality, but I haven’t figured that out yet.)
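"Getting bored" might look like this in a Python sketch: the watcher counts fruitless observations and retires itself, so nothing ever has to shut it down explicitly (the `patience` knob and the "TODO" affordance are invented for illustration):

```python
# Sketch of principle 7: a watcher that "gets bored" after a fixed number
# of observations that found nothing, instead of being shut down.

def make_bored_watcher(paragraph_id, patience=3):
    fruitless = {"count": 0}
    def observe(env):
        if "TODO" in env.get(paragraph_id, ""):
            fruitless["count"] = 0             # interest renewed
            return "affordance"
        fruitless["count"] += 1
        return "bored" if fruitless["count"] >= patience else "watching"
    return observe

watcher = make_bored_watcher(1, patience=2)
env = {1: "nothing interesting here"}
r1 = watcher(env)    # still watching
r2 = watcher(env)    # bored: the watcher would now exit on its own
```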
8. Control processes will be ephemeral. They will react to an affordance by (1) changing what the perceptual system is attending to, and/or (2) instructing the motor system to change the environment. For convenience, I’m going to think of the control process as telling the motor system “create a new affordance – that is, an opportunity for action – for the human.” (I’m not sure about this.)
“The world is its own best model.” – Rodney Brooks
9. Because control processes are ephemeral, any information needed to respond to a later affordance must be stored in the environment. It can be retrieved in two ways. (1) A perceptual process may be fired up to “keep track of it”. (2) When needed, a perceptual process will be started to scan the environment for the information. (In general, reacquiring information will be favored over keeping track of it.)
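The favored option – reacquiring by scanning – is almost trivially simple, which is much of its appeal. A sketch (invented names and data):

```python
# Sketch of principle 9, option 2: rather than any process remembering
# where something is, scan the environment for it when it's needed.

def find_paragraphs_containing(env, needle):
    """Reacquire information by scanning; nothing was kept track of."""
    return [pid for pid, text in sorted(env.items()) if needle in text]

env = {1: "Setup.", 2: "The hero enters.", 3: "The hero exits."}
hero_paragraphs = find_paragraphs_containing(env, "hero")
```

This is the Brooks slogan in miniature: the document is its own best model, so there is no second copy of the information to fall out of sync.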
10. Ephemeral control processes also imply that plans are handled the way the Pengi episode explained podcast: each step will leave an affordance in the environment that will prompt the next step.
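That Pengi-style "plan with no planner" can be sketched as independent steps chained only through markers left in the environment (the step names and marker strings below are invented):

```python
# Sketch of principle 10: no stored plan. Each step finishes by leaving an
# affordance in the environment; a later, independent control process sees
# the marker and runs the next step.

def step_draft(env):
    env["draft"] = "rough text"
    env["affordance"] = "ready-to-revise"        # prompt for the next step

def step_revise(env):
    if env.get("affordance") == "ready-to-revise":
        env["draft"] = env["draft"].capitalize() + "."
        env["affordance"] = "ready-to-publish"   # prompt for the step after

env = {}
step_draft(env)
step_revise(env)    # triggered by the marker, not by a remembered plan
```

If the app crashed between the two steps, the marker would still be sitting in the environment, so a fresh process could pick the "plan" up where it left off.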
> When it comes to the perceptual system, I’ll be borrowing the idea of focused (or indexical) attention from last episode’s Pengi system. For example, when I create a new note, certain processes will spring into life and watch the text I type into it. When I’m moving around in the script proper, the particular paragraph the cursor is in will be watched carefully. (I’m going to make the cursor position part of the document; that is, part of the environment. It’s some of the movement the app-animal will be watching.) podcast
11. For the most part, I want the app-animal to detect affordances by observing the environment. For example, it should notice when one paragraph is split in two, with two blank lines between the halves. But I'll probably first implement a key chord that means "I'm starting my split-a-paragraph editing thing now", which will both split the paragraph and send the affordance to the paragraph's watcher. That seems a better first step. And recognizing some affordances may be too hard, so I'll settle for having to remember to signal my intent to the app-animal.
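Both halves of that idea – detecting the split from the text alone, and the key-chord fallback that signals intent explicitly – can be sketched side by side (all names invented; a real implementation would watch keystrokes, not call functions):

```python
# Sketch of principle 11. detect_split observes the environment directly;
# on_key_chord is the explicit fallback: the human signals intent, the
# paragraph is split, and the watcher is told about the affordance.

def detect_split(text):
    """True when a paragraph has been split by two blank lines."""
    return "\n\n\n" in text          # two blank lines between the halves

def on_key_chord(env, pid, watchers):
    """Fallback: split the paragraph and notify its watcher ourselves."""
    first, _, second = env[pid].partition("\n\n\n")
    env[pid] = first
    env[max(env) + 1] = second
    watchers.get(pid, lambda e: None)(env)

env = {1: "First half\n\n\nSecond half"}
seen = []
watchers = {1: lambda e: seen.append("split-seen")}

split_detected = detect_split(env[1])    # the observational route
on_key_chord(env, 1, watchers)           # the signal-my-intent route
```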
Parts of the content reproduced here are © 2024 Brian Marick; see "Coding, New Hampshire Style" page for details.