Turing Machine

The key to Noam Chomsky’s original insight into the structure of language can probably be characterized as recognizing that natural language syntax can be modeled as a Turing Machine. In abstract form, a Turing machine can be understood as a set of rules for writing, erasing, and rewriting strings of characters, in which the rules themselves are also encoded as character strings that can be treated the same way. Rendering an operation in these terms makes it possible to automate any finite, determinate process (from robotic behavior to mathematical calculation), which is why it contributed to the design of modern computers. Because of the power of this methodology, the insight was not only valuable for developing a formalism for modeling syntax; it also became a driving force in the development of the cognitive sciences.
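The "rules for writing, erasing and rewriting strings" view can be made concrete with a small simulator. This is a hypothetical sketch, not anything from the page above: the rule table, tape alphabet, and the binary-increment example are all invented for illustration, and the rules are plain data that could themselves be serialized as a character string.

```python
# A minimal one-tape Turing machine sketch (hypothetical example).
# rules maps (state, symbol) -> (new_state, write_symbol, move),
# where move is -1 (left) or +1 (right). The machine halts when no
# rule matches the current (state, symbol) pair.

def run_turing_machine(rules, tape, state, blank="_", max_steps=1000):
    cells = dict(enumerate(tape))        # sparse tape, blank elsewhere
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in rules:
            break                        # no applicable rule: halt
        state, write, move = rules[(state, symbol)]
        cells[head] = write              # rewrite the current cell
        head += move                     # move the head one cell
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# Example program: increment a binary number. The head starts on the
# leftmost digit, scans right to the end, then carries back leftward.
rules = {
    ("right", "0"): ("right", "0", +1),  # scan right over the digits
    ("right", "1"): ("right", "1", +1),
    ("right", "_"): ("carry", "_", -1),  # past the end: start carrying
    ("carry", "1"): ("carry", "0", -1),  # 1 + carry -> 0, keep carrying
    ("carry", "0"): ("done",  "1", -1),  # 0 + carry -> 1, finished
    ("carry", "_"): ("done",  "1", -1),  # overflow: new leading 1
}

print(run_turing_machine(rules, "1011", "right"))  # -> 1100
```

Since the rule table is ordinary data, one machine could read another machine's rules from its tape, which is the germ of the universal machine and of the stored-program computer.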


Probably the most critical factor contributing to the structure of natural languages, in addition to semiotic constraints, is the need to communicate symbolically in real time.

Artificial Neural Networks perform arbitrary non-linear mappings between (potentially vague or misleading) inputs and outputs, and are typically used for pattern recognition and association. The computation is performed by a number of "neurons", which were inspired by, but rarely bear much resemblance to, biological neurons.
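The non-linear mapping idea can be sketched with a single artificial "neuron": a weighted sum of inputs passed through a non-linear activation. The example below is hypothetical and hand-weighted for illustration, not taken from any particular network; it shows a two-layer net computing XOR, a mapping no single linear unit can represent.

```python
import math

# A minimal artificial neuron (hypothetical sketch): weighted sum of
# inputs plus a bias, squashed by a sigmoid non-linearity.
def neuron(inputs, weights, bias):
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# Two hidden units feed one output unit. Weights are chosen by hand;
# in practice they would be learned from examples.
def xor_net(a, b):
    h1 = neuron([a, b], [20, 20], -10)    # roughly: a OR b
    h2 = neuron([a, b], [-20, -20], 30)   # roughly: NOT (a AND b)
    return neuron([h1, h2], [20, 20], -30)

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, round(xor_net(a, b)))     # 0 0 0 / 0 1 1 / 1 0 1 / 1 1 0
```

The non-linearity is the essential ingredient: composing purely linear units, however many, still yields a linear mapping.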

Welcome Visitors stated circa 1996 that the Wiki pages hosted by Ward Cunningham are part of the Portland Pattern Repository and contain "an incomplete and casually written history of programming ideas". I'm suggesting that this page be used to highlight areas in which any Wiki reader feels that the current informally written history is either incomplete or unbalanced.