We consider compiler implementation as concrete evidence for how to categorize generative structures such as pattern languages and causal loop diagrams (CLDs).
In compilers, the syntax (lexer) typically provides local rules to recognize tokens but remembers nothing about what it has already decided. The grammar is matched against those tokens, where one token of lookahead plus a memory of what has been seen so far (a parse stack) is enough information to recognize which grammar rule applies. Together, the syntax and grammar determine which of all possible inputs are allowed in the language.
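To make that division of labor concrete, here is a minimal sketch in Python (all names are mine, not taken from any particular compiler): a recursive-descent parser that looks at exactly one token at a time and uses the call stack as its parse stack, so each nested paren adds one stack frame.

    # Hypothetical sketch: one-token lookahead plus a parse stack.
    # The Python call stack plays the role of the parse stack here.
    class Parser:
        def __init__(self, tokens):
            self.tokens = tokens      # e.g. ['5', '*', '(', '2', '+', '3', ')']
            self.pos = 0              # index of the single lookahead token

        def peek(self):
            return self.tokens[self.pos] if self.pos < len(self.tokens) else None

        def eat(self, expected=None):
            tok = self.peek()
            if expected is not None and tok != expected:
                raise SyntaxError(f"expected {expected!r}, got {tok!r}")
            self.pos += 1
            return tok

        def expr(self):               # Expr -> Term ('+' Term)*
            value = self.term()
            while self.peek() == '+':
                self.eat('+')
                value += self.term()
            return value

        def term(self):               # Term -> Factor ('*' Factor)*
            value = self.factor()
            while self.peek() == '*':
                self.eat('*')
                value *= self.factor()
            return value

        def factor(self):             # Factor -> NUMBER | '(' Expr ')'
            if self.peek() == '(':
                self.eat('(')         # each '(' deepens the parse stack
                value = self.expr()
                self.eat(')')         # the matching ')' pops back out
                return value
            return int(self.eat())

    print(Parser(['5', '*', '(', '2', '+', '3', ')']).expr())  # prints 25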
Consider an arithmetic expression such as 5*(2+3).
Syntax rules would identify numbers vs operators, and can do so in fixed space no matter how long the input is.
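A sketch of such syntax rules (Python again, names hypothetical): each token is decided by a local pattern, and nothing about earlier tokens is remembered once a token has been emitted, so the state needed is constant.

    import re

    # Hypothetical sketch: token recognition with fixed-size state.
    # A single regular expression classifies numbers vs operators.
    TOKEN_RE = re.compile(r"\d+|[+*()]")

    def tokenize(text):
        pos = 0
        while pos < len(text):
            match = TOKEN_RE.match(text, pos)
            if match is None:
                raise SyntaxError(f"bad character at {pos}: {text[pos]!r}")
            yield match.group()
            pos = match.end()

    print(list(tokenize("5*(2+3)")))  # ['5', '*', '(', '2', '+', '3', ')']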
The grammar rules would say that open and closed parens must match, which the parser checks using its stack, and that stack grows in proportion to the depth of paren nesting.
The expressions 5*((2+3)) and 5*(((2+3))) are equivalent, but each takes proportionally more stack to establish this. We can limit the size of the stack by saying that parens can only be nested 10 deep.
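A sketch of that grammar-side check (Python, hypothetical): an explicit stack whose peak size equals the nesting depth, with the proposed cap of 10 enforced as the stack grows.

    # Hypothetical sketch: the stack grows one entry per unmatched '('
    # and shrinks on each ')', so its peak size equals the nesting depth.
    MAX_DEPTH = 10   # the proposed limit on paren nesting

    def parens_balanced(tokens, max_depth=MAX_DEPTH):
        stack = []
        for tok in tokens:
            if tok == '(':
                stack.append(tok)
                if len(stack) > max_depth:
                    raise SyntaxError(f"parens nested deeper than {max_depth}")
            elif tok == ')':
                if not stack:
                    return False   # a ')' with no matching '('
                stack.pop()
        return not stack           # True only if every '(' was closed

    print(parens_balanced("5*((2+3))"))   # True  (depth 2)
    print(parens_balanced("5*((2+3)"))    # False (one '(' left open)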
We can ask: how many allowed ways are there to parenthesize this expression? This is like asking how big this part of the language is, and the answer is quite big. Consider the many ways to add parens without violating our limit of 10:
    5*(2+3)
    (5)*(2+3)
    ((5))*(2+3)
    ((5))*((2+3))
    ((5))*((2)+3)
    ((5))*(2+(3))
    (5*(2+3))
    ((5)*(2+3))
Here we haven't yet exhausted the possibilities even going only 2 deep in parens. We can estimate how many ways there are to parenthesize this expression by noting that there are 3 numbers plus 2 operations, making 5 places at which to add up to 10 extra paren pairs. If each of the 10 pairs independently picks one of the 5 places, that comes out to roughly 5^10, which is close to 10 million.
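The arithmetic behind that rough estimate, as a quick check:

    # Rough estimate from the text: 3 numbers + 2 operations give 5 places,
    # and each of up to 10 extra paren pairs picks one of those places.
    places = 3 + 2
    extra_pairs = 10
    print(places ** extra_pairs)   # 9765625, i.e. close to 10 million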
We started by asking what kinds of things can be called a language versus a syntax or a grammar. I've made a counting argument based on how quickly the choices grow, or equivalently how the memory required of a compiler grows. The same kind of argument can be made about CLDs, which seem most similar to a grammar, but which could similarly be said to describe systems that quickly reach billions of states.