Lazy Evaluation

Waiting until the last possible moment to evaluate an expression, especially for the purpose of optimizing an algorithm that may not use the value of the expression. Contrast this with Strict Evaluation.

Lazy Evaluation comes in handy when an expression is expensive or impossible to evaluate and may not need to be evaluated at all. It is also useful for recursively defining infinite data structures. Since each level of recursion is evaluated only as it is needed, data is only generated as it is consumed and the evaluation of the data structure can terminate when the consumption is completed.
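For example, here is a minimal sketch of such a structure in Scheme, using the standard delay and force forms; the names integers-from and stream-take are made up for illustration:

 (define (integers-from n)                          ; conceptually infinite stream
   (cons n (delay (integers-from (+ n 1)))))        ; the tail is a delayed expression

 (define (stream-take s count)                      ; forces only as much as is consumed
   (if (= count 0)
       '()
       (cons (car s)
             (stream-take (force (cdr s)) (- count 1)))))

 (stream-take (integers-from 1) 5)                  ; => (1 2 3 4 5)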

The notion of Lazy Evaluation can be extended so that the value of an expression is used interchangeably with the expression itself. This extension of Lazy Evaluation, combined with sharing the value of the evaluated expression, is used to implement Call By Need Semantics. Call By Need Semantics essentially means that an expression is evaluated at most once, and only if its value is actually needed. All further uses of the expression are then exchanged directly for the calculated value. In some cases this can go beyond incremental optimization and actually reduce the computational complexity of an algorithm.
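As a small sketch of Call By Need in Scheme: force evaluates a delayed expression the first time it is asked for and caches the result, so the side effect below happens only once. (The particular promise p is just an illustration.)

 (define p
   (delay (begin (display "evaluating...")
                 (newline)
                 (* 6 7))))

 (force p)   ; prints "evaluating..." and yields 42
 (force p)   ; yields 42 again; nothing is printed, the cached value is reused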

Lazy Evaluation requires the data used in the calculation to be available and meaningful at the time of evaluation. In pure Functional Programming Languages this condition is guaranteed because state changes aren't allowed. Procedural languages, however, present some challenges along with greater opportunities. Since these languages support changes in state we must ensure that the timing of Lazy Evaluation falls within the window of opportunity for the expression to be evaluated correctly. If we try to evaluate too soon the data may not be available. If we evaluate too late the data may no longer be valid. Lazy Evaluation gives us the opportunity to delay calculations until the data for them becomes available. Indeed, it also gives us the ability to instigate state-changing operations but delay their execution until an appropriate time.
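A sketch of that timing window in Scheme (the variable names are illustrative only). The delayed expression reads mutable state, so its result depends on when it is forced:

 (define data 'not-yet-available)

 (define report
   (delay (list 'computed-from data)))   ; evaluation deferred

 ; Forcing report here would capture 'not-yet-available (too early).
 (set! data 42)                          ; the data becomes valid
 (force report)                          ; => (computed-from 42), evaluated inside the window
 ; Once forced, the memoized value is fixed; a later (set! data ...) no longer affects it.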

Lazy Evaluation And Transaction Semantics can be combined to complement each other: Transaction Semantics can be used to isolate expressions from tentative changes in state and Lazy Evaluation can be used as a mechanism to delay those same changes until it is time to commit the transaction.
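As a rough sketch (not a real transaction system), the pending state changes can be represented as thunks that are executed only when the "transaction" commits; everything here is made up for illustration:

 (define pending '())                     ; deferred state changes, stored as thunks

 (define (defer! thunk)
   (set! pending (cons thunk pending)))

 (define (commit!)
   (for-each (lambda (thunk) (thunk))     ; run the delayed changes, oldest first
             (reverse pending))
   (set! pending '()))

 (define balance 100)
 (defer! (lambda () (set! balance (- balance 30))))
 ; balance is still 100 here: the change is pending, not yet applied
 (commit!)
 ; balance is now 70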


Many discussions of lazy evaluation concentrate on performance issues and on Lazy Evaluation Overhead. However, performance is of little importance in lazy evaluation. The most important thing about lazy evaluation is that it provides new means of solving programming tasks: for example, parts of a program can easily communicate with the help of potentially-infinite data structures. Lazy Evaluation supports Horizontal Execution Of Vertical Constructs.


Lazy Evaluation speeds up creation/initialization at the cost of some things being slower later. It also saves memory by not creating things until they are needed, with no real cost later in the program. People usually focus on performance when speaking of lazy evaluation, but the memory savings are important as well.


Couldn't it be said that logical AND (&&) and OR (||) are the simplest examples of Lazy Evaluation? -- Patrick Parker

Yes, quite right. These are non-strict operators, i.e., they do not evaluate all their arguments right away but may leave some unevaluated where possible. Non-strictness is very much related to Lazy Evaluation, and *every* language has to have a non-strict operator/function, even if it is only built in. For example, a conditional statement (if) has to be non-strict about the block it is guarding. -- Thomas Kuehne

Well, non-strict operators can certainly be emulated in a strict language. Scheme Language is a strict language using Call By Value reduction, yet it can emulate *if* as follows by nesting its arguments inside thunks (zero-arity functions).

(define (true x y) x)

(define (false x y) y)

(define-syntax my-if                     ; Defines syntactic abbreviation
  (syntax-rules (lambda)
    [(my-if c exp1 exp2)
     ((c (lambda () exp1)                ; Choose a thunk and evaluate it (extra ())
         (lambda () exp2)))]))

(my-if true (print "evaluating true branch") (print "evaluating false branch"))

gives "evaluating true branch"

(my-if false (print "evaluating true branch") (print "evaluating false branch"))

gives "evaluating false branch".

If you object to the Syntactic Sugar provided by Define Syntax, you can get the same effect by simply writing it out. For example, the first my-if above expands to

((true (lambda () (print "evaluating true branch"))
       (lambda () (print "evaluating false branch"))))

True selects its first argument, which is a thunk (a function of no arguments), and the extra pair of parentheses then calls this function, causing the evaluation of its body. No primitive lazy operator is needed, unless we count lambda (which perhaps we should?).


I would argue (as have others) that lazy evaluation and Normal Order Evaluation are two different things; the difference is alluded to above. In lazy evaluation, evaluation of the argument is deferred until it is needed, at which point the argument is evaluated and its result saved (memoized). Further uses of the argument in the function use the computed value. The C/C++ operators ||, &&, and ? : are all examples of lazy evaluation. (Unless some newbie C/C++ programmer is daft enough to overload && or ||, in which case the overloaded versions evaluate both operands strictly, losing the short-circuit behavior; which is why the && and || operators should NEVER be overloaded in C++.)

In other words, each argument is evaluated at most once, possibly not at all.

Normal Order Evaluation, on the other hand, re-evaluates the expression each time it is used. Think of C macros, Call By Name in languages which support it, the semantics of looping control structures, etc. Normal-order evaluation can take much longer than applicative-order evaluation, and can cause side effects to happen more than once. (Which is why, of course, expressions with side effects generally ought not to be given as arguments to macros in C/C++.)

If the argument is invariant and has no side effects, the only difference between the two is performance. Indeed, in a purely functional language, lazy evaluation can be viewed as an optimization of normal-order evaluation. With side effects present, or expressions which can return a different value when re-evaluated, the two have different behavior; normal-order evaluation in particular has a bad reputation in procedural languages due to the difficulty of reasoning about such programs without Referential Transparency.
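The distinction can be sketched in Scheme, where a bare thunk models Call By Name (re-evaluated on every use) and a promise models Call By Need (evaluated at most once, then memoized); noisy is just an illustrative procedure with a visible side effect:

 (define (noisy)
   (display "side effect!") (newline)
   42)

 (define by-name (lambda () (noisy)))     ; call by name: re-evaluated on each use
 (define by-need (delay (noisy)))         ; call by need: evaluated once, memoized

 (+ (by-name) (by-name))                  ; prints "side effect!" twice, => 84
 (+ (force by-need) (force by-need))      ; prints "side effect!" once,  => 84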

It should also be noted that strict-order evaluation (as well as lazy evaluation) can be achieved in a language which supports normal-order evaluation via explicit memoing. The opposite isn't true; it requires passing in thunks, functions, or objects which can be called/messaged in order to defer/repeat the evaluation.

Yes, but you can do a Continuation Passing Style transformation in a lazily evaluated language to force strict evaluation; and you can lift all expressions into subfunctions in a strict language to get lazy / normal-order evaluation.

By the way, common examples of normal-order evaluation are typesetting systems such as TeX. In these systems, having a side-effect happen more than once is usually the desired result.


See Functoids In Cpp for a way of implementing this in Cee Plus Plus.


See original on c2.com