The capacity of a system to monitor and control its behavior using information communicated from the environment is called self-regulation. The majority of self-regulating systems operate on the basis of negative feedback: they compare the current state of the variable or variables they are monitoring with the desired state and make appropriate adjustments when a divergence occurs. The major characteristic of self-regulating systems is that control is implicit. A variable going out of control triggers the regulatory mechanisms that bring it back under control.
The household thermostat is such a device. There is a setting, and the purpose of the regulatory activity is to maintain it: the thermostat signals the furnace to go "on" when its sensors tell it that the temperature has fallen below the desired level and "off" when that level has been reached. A modern device might also signal the air conditioning to go "on" when the desired temperature is exceeded. The limits of self-regulation are evident in the thermostat example. The heating system can control itself only to the extent that the necessary information loops are designed into its sensors. It does not have the capacity to step outside the loop and, for example, contract for more insulation to be installed. Although self-regulatory devices may be quite simple, they may also be complex, and they may respond to probabilistic as well as deterministic feedback information. A device may be geared to respond when it records the presence of conditions under which error often occurs, making an adjustment before error actually happens. Or it may be designed with enough flexibility to recover from disturbances that were not anticipated by its designer as well as from those which were; if it does so, we refer to it as ultrastable. Even very complex organizations such as families may act as if they were simple feedback-operated devices when, for instance, children in a restaurant are shushed when they begin to talk too loudly, rather than family outings being planned to take the children's normal behavior into account.

# SOURCE
Early work by Wiener, Ashby, McCulloch and others investigating the properties of internally controlled systems.
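The negative-feedback loop of the thermostat example can be sketched in code. This is a minimal, hypothetical model: the `Thermostat` class, setpoint, and deadband values are illustrative assumptions, not a description of any actual device. Note how control is implicit — only a divergence beyond the tolerated band triggers any action.

```python
class Thermostat:
    """Compares the sensed temperature with a desired setting and
    switches the furnace or air conditioning to reduce the divergence."""

    def __init__(self, setpoint, deadband=1.0):
        self.setpoint = setpoint    # desired state of the monitored variable
        self.deadband = deadband    # divergence tolerated before acting
        self.furnace_on = False
        self.ac_on = False

    def regulate(self, sensed_temp):
        """One pass of the feedback loop."""
        error = sensed_temp - self.setpoint
        if error < -self.deadband:        # too cold: signal the furnace "on"
            self.furnace_on, self.ac_on = True, False
        elif error > self.deadband:       # too hot: signal the air conditioning "on"
            self.furnace_on, self.ac_on = False, True
        else:                             # within tolerance: nothing is triggered
            self.furnace_on = self.ac_on = False
        return self.furnace_on, self.ac_on

t = Thermostat(setpoint=20.0)
print(t.regulate(17.5))   # (True, False): furnace on
print(t.regulate(23.0))   # (False, True): air conditioning on
print(t.regulate(20.4))   # (False, False): within the deadband, no action
```

The limits noted above show up directly in the sketch: the device can only ever respond through the two loops designed into it; nothing in the code could "contract for more insulation."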
# EXAMPLES
• the Watt steam governor
• self-correcting missiles
• a sprinkler system
• a manager operating on attenuated information

# NON-EXAMPLES
• a farmer who reorganizes his crops on the basis of a proposed increase in the water available for irrigation
• a marketplace operating under a time lag which postpones feedback until too late for adjustment
• a system operating with a high degree of chance

# PROBABLE ERROR
• using a self-regulatory mechanism to perform a more complex regulatory task than it was designed to perform
• not taking advantage of existing self-regulatory capacities, and overriding them

# SEE
Self-organizing system; Ultrastability; Requisite variety