Entropy

is a measure of the disorder or decay of order

Work, such as that performed by thermal energy, causes a change that in turn depends on a distinction between one state and another. In an analogous process, communicating a message involves the selection of a particular message and its differentiation from other possible messages.

[Embedded video: "Entropy and Before and After" (YouTube: XWuUdJo9ubM)]

Entropy measures the decay of the differentiation which permits work to be done or messages articulated. Because this progress is unidirectional, the most probable state is one in which differentiation has been relaxed. Thus, "entropy is a measure of a system's inexorable tendency to move from a less to a more probable state." Entropy can be described in general or precise terms depending on the area of application.

An organization requires new infusions of effort to maintain its vitality and to offset its tendency to run down. An animal needs a continuous supply of food to maintain itself; without it, it can no longer maintain its order and it moves toward maximum entropy and, ultimately, death. The perception of disorder in a system varies according to the observer's definition of the system and its purposes. For example, in one organization order may be perceived in terms of hierarchy, while in another it will be seen in terms of a network of small self-organizing units. As a measure of disorder in a system, the conception of entropy and the ways it is measured differ somewhat depending on the system under consideration.
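A minimal numeric sketch of this tendency (not from the original entry; the two-compartment box and the particle count are illustrative assumptions): counting the microstates of N particles split between the two halves of a box shows that the undifferentiated, evenly mixed macrostate is by far the most probable one.

```python
# Sketch: why the macrostate with "differentiation relaxed" is most probable.
# For N particles that may sit in the left or right half of a box, the number
# of microstates with k particles on the left is the binomial coefficient C(N, k).
from math import comb

N = 20
multiplicities = {k: comb(N, k) for k in range(N + 1)}

# Fully differentiated states (all particles on one side) have 1 microstate each;
# the evenly mixed state has the largest number of microstates by far.
print(multiplicities[0], multiplicities[N])          # 1 1
print(max(multiplicities, key=multiplicities.get))   # 10 (the even split)
print(multiplicities[N // 2])                        # 184756
```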

In mathematics, entropy is expressed as the logarithm of the probability of a certain state. In thermodynamics, entropy is the subject of the Second Law, which says that heat in a system will tend to even out until none is available for doing work. If work is to continue, heat must be added to the system to offset the effect of entropy.
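In symbols (standard textbook forms of these statements; the formulas themselves are not spelled out in the original entry):

```latex
% Statistical definition (Boltzmann): entropy as the logarithm of the
% probability, i.e. the multiplicity W of a state, with Boltzmann's constant k_B.
S = k_B \ln W

% Second Law: in an isolated system the entropy cannot decrease, so a
% temperature difference usable for work must be maintained from outside.
\Delta S \geq 0
```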

In statistical mechanics, entropy is a measure of disorder in the arrangement of atoms. The state of maximum disorder, and the most probable state, is one of random distribution; entropy increases as the degree of randomness increases. In cybernetics and information theory, the amount of information corresponds to the degree of order in the system and the entropy to the amount of disorder. Entropy and information are thus seen as opposing concepts, with entropy defined as information with a negative sign. A system maintains its order through the addition of information. In some contexts, entropy may be seen as a measure of uncertainty. In information theory, the entropy of a given source may be calculated and compared with the maximum entropy the source could have; the ratio of the two is the source's relative entropy, and the redundancy of the message is figured at one minus the relative entropy (a minimal calculation sketch is given at the end of this entry).

# SOURCE

For thermodynamics, see Clausius; for statistical mechanics, see Boltzmann and Gibbs; for cybernetics and information theory, see:

Wiener, N. (1961). Cybernetics; or, Control and communication in the animal and the machine. New York: M.I.T. Press.

Shannon, C. E., & Weaver, W. (1964). The mathematical theory of communication. Urbana: University of Illinois Press.

# EXAMPLES

• on the way from the delicatessen to your desk, the hot coffee gets lukewarm and your ice cream begins to melt. Before you ordered them, electricity supplied the energy to keep the coffee hot in the urn and the ice cream cold in the freezer
• holiday traffic is backed up for miles waiting for a backlog at the bridge to be cleared. The once orderly movement of cars is transformed into a collection of Frisbee games, picnics, people sitting on fenders, etc.
• a puppy overturns the garbage and messes up the yard
• an untended garden becomes overgrown
• a sports team's plays become sloppy if they have not had enough practice

# NON-EXAMPLES

• the members of a community organize themselves to maintain essential services after a flood has cut them off from normal sources of supply
• a message is transmitted with a given degree of accuracy
• a system grows and becomes more specialized
• an organization maintains and enhances its identity through participative planning sessions and training

# PROBABLE ERROR

• viewing the progress of decay and disorder in a system as an aberration, neglecting to consider the allocation of resources needed to maintain system order

# SEE

Information; Self-organization; Autopoiesis; Uncertainty
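The relative entropy and redundancy calculation described in the information-theory paragraph above can be sketched in a few lines (a minimal illustration; the four-symbol source and its probabilities are made-up assumptions, not from the entry):

```python
# Sketch: entropy, relative entropy, and redundancy of a simple source,
# following the definitions above (cf. Shannon & Weaver).
from math import log2

def shannon_entropy(probs):
    """H = -sum(p * log2 p), in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical memoryless source with four symbols of unequal probability.
probs = [0.5, 0.25, 0.125, 0.125]

H = shannon_entropy(probs)           # 1.75 bits/symbol
H_max = log2(len(probs))             # 2.0 bits/symbol (all symbols equally likely)
relative_entropy = H / H_max         # 0.875
redundancy = 1 - relative_entropy    # 0.125

print(f"H = {H:.3f} bits, relative entropy = {relative_entropy:.3f}, "
      f"redundancy = {redundancy:.3f}")
```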