More Excited than Proud

I've been coding in a style that I have on occasion described as "cheesy" for what I leave out.

Page with Nodule

To smooth workflows in wiki I often write little scripts that go into a box on a wiki page. To use the script you find the page, then press a button in the box. Here is an example.

This script enriches a search result so that it can be used to plan a meeting.

We are finding them useful and easy to write. They are called "nodules" like the little bumps on the roots of a tree that fix nitrogen. Small but important.

Nodules wake up when a page is viewed. They look for some input pages and report what they find. If this looks right, the author can push the button and see something happen. In this case 39 pages are retrieved and summarized in a useful way. Promise.all makes this go quickly enough for up to a few hundred pages.
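The fan-out behind that speed can be sketched with Promise.all over a mocked fetch. Here `fakeFetch` is a hypothetical stand-in for the real `fetch(`//${site}/${slug}.json`)` call; the page names are made up:

```javascript
// Hypothetical stand-in for fetching one wiki page as JSON.
// A real nodule would call fetch(`//${site}/${slug}.json`).
const fakeFetch = slug =>
  new Promise(resolve =>
    setTimeout(() => resolve({title: slug, story: []}), 10))

// Fetch every page concurrently. Total wall-clock time is
// roughly one round trip, not one round trip per page.
async function fetchAll(slugs) {
  return Promise.all(slugs.map(slug => fakeFetch(slug)))
}

fetchAll(['welcome-visitors', 'recent-changes', 'about-us'])
  .then(pages => console.log(`${pages.length} pages retrieved`))
```

Because Promise.all starts every request before waiting on any of them, a few hundred pages finish in about the time of the slowest single fetch.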

There are several things that make nodules simple. Often the workflow is already in place; we're just providing helpers. The navigation is built into wiki, as is the how-to documentation. We've also short-cut a lot of coding practices, like writing error messages. If the nodule says "0 items", it is up to the user to figure out why. If it doesn't say anything at all, it means you are not even close to providing what the nodule needs to run.

We are also inclined to leave out lots of HTML boilerplate. This could be a mistake in some cases. For example, we sometimes find we need a charset declaration when we use Unicode. Here is the code for this nodule. I can't say I am proud of it but I find writing these exciting in a new way. github

Blocks

We prefer straight-line code that suggests composition as a series of interlocking blocks, shown here in JavaScript shortened into pseudocode.

Lines 1-2 — HTML Initial presentation.

<button onclick=details(event)>details</button>
<div id=result></div>

Lines 6-7 — Retrieve and characterize input.

let page = await fetch(`/search...`)
result = `${length} items in search`

Lines 10-13 — Extract list of pages from input.

let refs = page.story.filter(type == 'reference')
let rows = refs.map(item =>
  let {site,slug,id,title} = item
  return {site,slug,id,title}
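Filled in over sample data, this extraction step runs as ordinary JavaScript. The story items below are made up for illustration; only the four fields a later fetch needs are kept:

```javascript
// A wiki page's story is an array of typed items; only the
// 'reference' items point at other pages.
const page = {
  story: [
    {type: 'paragraph', text: 'Some prose.'},
    {type: 'reference', site: 'fed.wiki', slug: 'welcome-visitors',
     id: 'a1', title: 'Welcome Visitors'},
    {type: 'reference', site: 'fed.wiki', slug: 'recent-changes',
     id: 'b2', title: 'Recent Changes'}
  ]
}

// Keep only references, then only the fields needed later.
const refs = page.story.filter(item => item.type == 'reference')
const rows = refs.map(item => {
  let {site, slug, id, title} = item
  return {site, slug, id, title}
})

console.log(rows.length)  // → 2
```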

Lines 15-17 — Retrieve all pages of interest.

let hits = await Promise.all(rows
  .map(row => fetch(`//${site}/${slug}.json`)
  .then(res => {result += '.'; return json()})

Lines 18-22 — Begin composition of result page.

let story = [
  {type:'paragraph', text:`Items marked ...`}
]
for (let i = 0; i<rows.length; i++) {
  let {site,slug,id,title} = rows[i]

Lines 23-31 — Text analysis to recover details.

let text = hits[i].story
  .filter(text.includes('ToConsider'))
  .map(...)
  .join(...)
  .split(...)
  .filter(...)
  ...
story.push({type:'reference',site,title,id,text})
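One way the elided pipeline might be filled in, over a made-up page: the 'ToConsider' marker is from the post, but the sample text and the particular map/join/split choices here are illustrative guesses, not the nodule's actual code:

```javascript
// A fetched page whose paragraphs may mention 'ToConsider'.
const hit = {
  story: [
    {type: 'paragraph', text: 'ToConsider budget for Q3. Also staffing.'},
    {type: 'paragraph', text: 'Unrelated prose.'},
    {type: 'paragraph', text: 'ToConsider venue options.'}
  ]
}

// Keep paragraphs that carry the marker, join them into one
// string, split back into sentences, and keep the marked ones.
const text = hit.story
  .filter(item => (item.text || '').includes('ToConsider'))
  .map(item => item.text)
  .join(' ')
  .split('. ')
  .filter(sentence => sentence.includes('ToConsider'))
  .join('. ')

console.log(text)
```

The same filter-map-join-split shape shows up whenever details must be recovered from free text rather than from structured fields.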

Line 33 — Open page of results, indexed to input.

open({title:`Considered ${date}`, story})