Side Effect

A Side Effect occurs when a function causes a non-local change to something other than its inputs and declared output. A benign example would be, say, logging. But according to the proponents of Functional Programming, because side effects are not explicit, they lead to many bugs; for instance, we all know how fragile programs are when they rely on particular sequences of function calls to manipulate state. Aspect Oriented Programming, by contrast, embraces side effects that "crosscut" system concerns (again, logging may be a good example).
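
For illustration, a minimal sketch in Python (the function names and the log file are made up for the example):

 # A pure function: its only result is the declared return value.
 def add(a, b):
     return a + b

 # The same computation with a side effect: it also appends to a log file,
 # a non-local change that does not appear in the function's signature.
 def add_and_log(a, b):
     with open("audit.log", "a") as log:   # hidden dependency on the file system
         log.write("add(%s, %s)\n" % (a, b))
     return a + b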

One example of this is noted in the Mythical Man Month (pg. 63 original edition, pg. 65 anniversary edition). Fred Brooks called one type "curios" -- side effects of supposedly invalid operations. They ended up being treated as part of the machine's definition when that system was emulated.

There is some pressure to use the more apt terminology 'Intentional Effect' or 'Programming Effect' to describe results intended by the programmer (such as logging or maintenance of data state). The meaning of 'Side Effect' is then limited to effects that are unintentional and undesirable. This would better match the term's use by the general public (e.g. with regard to drugs).


Heavy usage of databases could be considered relying on side effects. This is because databases tend to model real-world nouns one-to-one. If you change the attribute for a person's weight, you are updating the one-to-one model to reflect the person's "state". This is a very effective technique and tends to be emulated to some extent in most non-trivial applications (see Database Not More Global Than Classes). Even FP will violate side-effect purity at certain points in practice to serve this idea; it is hard to model the real world if everything is always local. --top
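
A minimal sketch of the contrast, using an in-memory dictionary as a stand-in for a real database table (all names are illustrative):

 # A tiny in-memory stand-in for a database table of people.
 people = {42: {"name": "Alice", "weight_kg": 68.0}}

 # Relying on side effects: update the shared model in place, so every
 # reader of 'people' sees the person's new state.
 def set_weight(person_id, new_weight):
     people[person_id]["weight_kg"] = new_weight   # non-local change

 # A side-effect-free alternative: build and return a new table,
 # leaving the old value untouched.
 def with_weight(table, person_id, new_weight):
     row = dict(table[person_id], weight_kg=new_weight)
     return {**table, person_id: row}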

Yes, database tables are side effects galore. They are like variables which store different states. This is why they came up with the term Rel Var, since a database table is really like a variable that changes over time. FP will violate side effect purity through Functional Loop Holes. I've always found the whole functional dogma to be extremely strange... because the whole point of using computers is to have all sorts of side effects. If Functional programmers could get their act together and actually market functional programming to mean something useful, more people would try it out. Telling people that state is bad and that side effects are bad is like telling a person riding a bicycle to stop pedaling because pedaling causes you to actually go places and do things... it's the WRONG way to market a product - a bike is supposed to get you places, so you should pedal it. A computer is for storing data, modifying data, retrieving data, writing data - it's full of wonderful side effects. FP purity envy is worse than Discipline Envy.
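
A small sketch of the Rel Var view, assuming we model the relation as an immutable value and let the named variable be the only thing that changes:

 # A relvar is essentially a named variable whose value is a whole relation;
 # "updating the table" rebinds the variable to a new relation value.
 employees = frozenset({("alice", 68.0), ("bob", 80.0)})   # relation value at time t0

 def update_weight(relation, name, new_weight):
     # pure: builds a new relation value instead of mutating the old one
     return frozenset((n, new_weight if n == name else w) for n, w in relation)

 employees = update_weight(employees, "alice", 70.0)   # the rebinding is the "state change"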


Spyware and other things that get installed while surfing the internet with an insecure browser are Side Effects. Limiting side effects, or making them transparent to the user, should help solve a number of security issues.

Example 1: This program wishes to install components X, Y, and Z that will run every time you start your computer. Do you wish to allow this?

Example 2: This program wishes to email the contents of "bookmarks.html" to spambot@xyz.ru. Do you wish to allow this?

Example 3: This program wishes to perform (this perfectly legitimate operation). Do you wish to allow this?

Problem: Frequency of legitimate operations >> frequency of illegitimate operations, which makes this hard. E.g. even if 99.9% of all ops were classified correctly, and only 0.01% of all ops are illegitimate, one would be queried for about 10x as many false positives as real positives. This will cause great frustration (akin to Windows Vista) and ingrain in many users the habit of allowing operations without reading them (as has been observed in the field).
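
The arithmetic behind that estimate, spelled out with the illustrative numbers from the paragraph above:

 # Rough numbers from the paragraph above (illustrative, not measured):
 total_ops    = 1_000_000
 illegitimate = total_ops * 0.0001            # 0.01% of ops are illegitimate
 legitimate   = total_ops - illegitimate
 error_rate   = 0.001                         # 99.9% of ops classified correctly

 false_positives = legitimate * error_rate            # legitimate ops wrongly flagged
 true_positives  = illegitimate * (1 - error_rate)    # illegitimate ops correctly flagged

 print(false_positives / true_positives)              # ~10: ten bogus prompts per real one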

What if we were to allow some kind of "run this carefully" option, to try things out?

You mean to audit software? That can help, though it will still encounter the same problems seen here. Also, you'll need to be sure that the spyware can't detect the mode and "play nice while the cops are watching". Some of these problems are handled in the Capability Security Model, in any case - the typical behavior there is to simply reject ALL operations for which capabilities don't exist, but with a Power Box mechanism, one could call back to the user or an appropriate user-agent when permissions are required, in order to tweak security policy at runtime.
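
A rough sketch of that callback idea (the class and capability names here are hypothetical, not any particular library's API):

 # Hypothetical sketch: a program only holds the capabilities it was granted;
 # anything else is denied by default, or routed through a "power box" that
 # asks the user (or user-agent) at runtime.
 class PowerBox:
     def __init__(self, granted):
         self.granted = set(granted)

     def request(self, capability, ask_user):
         if capability in self.granted:
             return True
         if ask_user("Program requests '%s'. Allow?" % capability):
             self.granted.add(capability)      # security policy tweaked at runtime
             return True
         return False                          # default: reject

 box = PowerBox(granted={"read:bookmarks.html"})
 box.request("send:email", ask_user=lambda question: False)   # denied unless the user says yes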



Events that may look like Side Effects at first might be caused intentionally with technology such as Integrated Circuit Transducers.

Category Jargon

See original on c2.com