by Santiago Escobar
A challenging problem in modern programming languages is the discovery of sound and complete evaluation strategies which are ``optimal'' w.r.t. some efficiency criterion (typically the number of evaluation steps and the avoidance of infinite, failing or redundant derivations) and which are easily implementable.
For orthogonal term rewriting systems (TRSs), Huet and Lévy's needed rewriting is optimal [HL91]. However, automatic detection of needed redexes is difficult (or even impossible), so Huet and Lévy introduced the notion of ``strong sequentiality'', which provides a formal basis for the mechanization of sequential, normalizing rewriting computations [HL91]. In [HL91], they defined the (computable) notion of strongly needed redexes and showed that, for the (decidable) class of strongly sequential orthogonal TRSs, the repeated reduction of strongly needed redexes is normalizing. When strongly sequential TRSs are restricted to constructor systems (CSs), we obtain the class of inductively sequential CSs (see [HLM98]). Intuitively, a CS is inductively sequential if there exists some branching selection structure in its rules. Sekar and Ramakrishnan introduced parallel needed reduction [SR93] as the extension of Huet and Lévy's needed rewriting to non-inductively sequential CSs (namely, almost orthogonal CSs).
A sound and complete rewrite strategy for the class of inductively sequential CSs is outermost-needed rewriting [Antoy92]; its extension to narrowing is called outermost-needed narrowing [AEH00]. Its optimality properties, together with the fact that inductively sequential CSs are a subclass of strongly sequential programs, explain why outermost-needed narrowing has become useful in functional logic programming as the functional logic counterpart of Huet and Lévy's strongly needed reduction. Weakly outermost-needed rewriting [AEH97] is defined for non-inductively sequential CSs; its extension to narrowing, weakly outermost-needed narrowing [AEH97], is regarded as the functional logic counterpart of Sekar and Ramakrishnan's parallel needed reduction.
Unfortunately, weakly outermost-needed rewriting and narrowing do not work appropriately on non-inductively sequential CSs.
Consider Berry's program [SR93], where T and F are constructor symbols and X is a variable:

B(T,F,X) = T
B(F,X,T) = T
B(X,T,F) = T

This CS is not inductively sequential since there is no branching selection structure in the rules. Nevertheless, some terms can still be reduced sequentially, where ``reduced sequentially'' means reducing only positions which are unavoidable (or ``needed'') when attempting to obtain a normal form [HL91]. For instance, the term B(B(T,F,T),B(F,T,T),F) has a unique ``optimal'' rewrite sequence which achieves its associated normal form T:

B(B(T,F,T),B(F,T,T),F) -> B(B(T,F,T),T,F) -> T

However, weakly outermost-needed rewriting is not optimal since, besides the previous optimal sequence, the sequence

B(B(T,F,T),B(F,T,T),F) -> B(T,B(F,T,T),F) -> B(T,T,F) -> T

is also obtained. The reason is that weakly outermost-needed rewriting partitions the CS into the inductively sequential subsets R' = {B(X,T,F) = T} and R'' = {B(T,F,X) = T, B(F,X,T) = T} in such a way that the first step of the former (optimal) sequence is obtained w.r.t. subset R', whereas the first step of the latter (useless) sequence is obtained w.r.t. subset R''.
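The step counts of the two sequences can be checked with a small script. The following Python sketch is an illustrative encoding of Berry's three rules (the term representation and all names are assumptions for this example, not an implementation of the strategies under discussion):

```python
# Terms are 'T', 'F', or ('B', a, b, c); we count the rewrite steps of the
# two sequences for B(B(T,F,T),B(F,T,T),F).

def step(t):
    """Rewrite t at the root with Berry's rules, or return None."""
    if isinstance(t, tuple) and t[0] == 'B':
        a, b, c = t[1], t[2], t[3]
        if (a, b) == ('T', 'F') or (a, c) == ('F', 'T') or (b, c) == ('T', 'F'):
            return 'T'
    return None

def reduce_args_then_root(t, positions):
    """Rewrite the arguments at the given positions (1..3) to T, then rewrite
    the root; return the total number of rewrite steps."""
    args = list(t)
    steps = 0
    for p in positions:
        args[p] = step(args[p])
        steps += 1
    assert step(tuple(args)) == 'T'
    return steps + 1

berry = ('B', ('B', 'T', 'F', 'T'), ('B', 'F', 'T', 'T'), 'F')
optimal = reduce_args_then_root(berry, [2])      # only the needed redex
wasteful = reduce_args_then_root(berry, [1, 2])  # argument 1 is not needed
print(optimal, wasteful)
```

Only argument 2 is needed: reducing it enables the rule B(X,T,F) = T at the root, so the optimal sequence takes 2 steps, while the sequence that also reduces argument 1 takes 3.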
Note that the problem also occurs in weakly outermost-needed narrowing. For instance, the term B(X,B(F,T,T),F) has the optimal narrowing sequence

B(X,B(F,T,T),F) ->_{id} B(X,T,F) ->_{id} T

whereas the non-optimal narrowing sequence

B(X,B(F,T,T),F) ->_{{X:T}} B(T,T,F) ->_{id} T

is also obtained.
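The point about substitutions can also be checked mechanically. The following Python sketch (an illustrative encoding, not an actual narrowing implementation) computes one-step root narrowings of Berry's rules and shows that both steps of the optimal sequence for B(X,B(F,T,T),F) bind no goal variables:

```python
# Terms: 'T', 'F', goal variable names like 'X', or ('B', a, b, c).
RULES = [  # (lhs pattern, rhs); '_' stands for the rule's own variable
    (('B', 'T', 'F', '_'), 'T'),
    (('B', 'F', '_', 'T'), 'T'),
    (('B', '_', 'T', 'F'), 'T'),
]

def unify(pat, term, subst):
    """Unify one rule-argument pattern with a goal subterm; goal variables
    (strings other than 'T'/'F') may be bound by the step."""
    if pat == '_':
        return subst                    # rule variable matches anything
    if isinstance(term, str) and term not in ('T', 'F'):
        if term in subst and subst[term] != pat:
            return None
        return {**subst, term: pat}     # bind the goal variable
    return subst if pat == term else None

def narrow_root(term):
    """All one-step root narrowings: list of (substitution, result)."""
    results = []
    for lhs, rhs in RULES:
        subst = {}
        for pat, arg in zip(lhs[1:], term[1:]):
            subst = unify(pat, arg, subst)
            if subst is None:
                break
        if subst is not None:
            results.append((subst, rhs))
    return results

# Optimal sequence: narrow the needed subterm B(F,T,T) at position 2 (no
# bindings), then the root via B(X,T,F) = T (again no bindings).
step1 = narrow_root(('B', 'F', 'T', 'T'))
step2 = narrow_root(('B', 'X', 'T', 'F'))
print(step1, step2)
```

Both calls return a single step with the empty (identity) substitution, matching the optimal sequence above; the non-optimal sequence instead instantiates X to T.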
As Sekar and Ramakrishnan argued in [SR93], strongly sequential systems have been widely used in programming languages and violations are not frequent. Nevertheless, strong sequentiality is a cumbersome restriction, and relaxing it is quite useful in some contexts. It would be convenient for programmers if this restriction could be relaxed while optimal evaluation is preserved for the strongly sequential parts of a program. The availability of optimal and effective evaluation strategies without such a restriction could encourage programmers to formulate non-inductively sequential programs, which could still be executed in an optimal way.
Consider the problem of coding the rather simple inequality

x + y + z + w <= h

for natural numbers x, y, z, w, and h. A first (and naive) approach could be to define an inductively sequential program:

solve(X,Y,Z,W,H) = (((X + Y) + Z) + W) <= H
0 <= N = True
s(M) <= 0 = False
s(M) <= s(N) = M <= N
0 + N = N
s(M) + N = s(M + N)

Consider the (auxiliary) factorial function n!. The program is efficient for the term t_{1}=solve(1,Y,0,10!,0), since False is returned in 5 rewrite steps. However, it is not very efficient for the terms t_{2}=solve(10!,0,1,0+1,0) or t_{3}=solve(10!,0+1,0+1,W,1), since several rewrite steps must be performed on 10! before realizing that both terms rewrite to False.
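The 5-step count for t_{1} can be reproduced with a lazy-evaluation sketch. In the following Python code, the thunk encoding of Peano numerals and all names are illustrative assumptions (not the actual implementation of the strategies discussed); a single raising thunk stands in for both 10! and the unbound variable Y, to verify that neither is ever demanded:

```python
steps = 0  # number of head rewrite steps performed so far

def force(t):
    """Force a thunk (a zero-argument function) to its head constructor."""
    while callable(t):
        t = t()
    return t

Z = 'Z'                                # the numeral 0
def S(t): return ('S', t)              # successor with a possibly lazy argument

def add(m, n):                         # 0 + N = N ; s(M) + N = s(M + N)
    global steps
    m = force(m)
    steps += 1
    return n if m == Z else S(lambda: add(m[1], n))

def leq(m, n):                         # 0 <= N = True ; s(M) <= 0 = False ;
    global steps                       # s(M) <= s(N) = M <= N
    m, n = force(m), force(n)
    steps += 1
    if m == Z:
        return True
    if n == Z:
        return False
    return leq(m[1], n[1])

def solve(x, y, z, w, h):              # solve(X,Y,Z,W,H) = (((X+Y)+Z)+W) <= H
    global steps
    steps += 1
    return leq(add(add(add(x, y), z), w), h)

def untouched():                       # stands in for 10! (and for Y):
    raise RuntimeError("forced")       # it must never be evaluated

# t1 = solve(1,Y,0,10!,0): False in 5 rewrite steps, 10! and Y untouched
result = solve(S(Z), untouched, Z, untouched, Z)
print(result, steps)
```

Each + step only exposes one s(...) constructor, so the final s(...) <= 0 comparison fires after exactly 5 steps without forcing the expensive argument.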
When a more effective program is desired for some input terms, a common solution is to add some assertions by hand to the program while preserving the determinism of computations.
For instance, if the terms t_{1}, t_{2}, and t_{3} are among the input terms of interest, we could obtain the following program:

solve(X,s(Y),s(Z),W,s(H)) = solve(X,Y,s(Z),W,H)
solve(X,0,s(Z),W,s(H)) = solve(X,0,Z,W,H)
solve(s(X),Y,0,W,s(H)) = solve(X,Y,0,W,H)
solve(0,s(Y),0,W,s(H)) = solve(0,Y,0,W,H)
solve(0,0,0,0,H) = True
solve(s(X),Y,0,W,0) = False
solve(X,0,s(Z),W,0) = False
solve(0,s(Y),Z,0,0) = False
solve(X,s(Y),s(Z),s(W),0) = False

Note that this program is not inductively sequential but orthogonal, and some terms can still be reduced (or narrowed) sequentially. For instance, consider the previous term t_{3} for rewriting:

solve(10!,0+s(0),0+s(0),W,s(0)) -> solve(10!,0+s(0),s(0),W,s(0)) -> solve(10!,s(0),s(0),W,s(0)) -> solve(10!,0,s(0),W,0) -> False

or the term t_{4}=solve(X,Y,0+2,W,1) for narrowing:

solve(X,Y,0+s(s(0)),W,s(0)) ->_{id} solve(X,Y,s(s(0)),W,s(0))
  | ->_{{Y:0}} solve(X,0,s(0),W,0) ->_{id} False
  | ->_{{Y:s(Y')}} solve(X,Y',s(s(0)),W,0)
      | ->_{{Y':0}} False
      | ->_{{Y':s(Y''),W:0,X:0}} False
      | ->_{{Y':s(Y''),W:s(W')}} False

Indeed, note that even if we add the rule solve(X,s(Y),0,W,0) = False, which makes the program almost orthogonal, t_{3} is still sequentially rewritten and t_{4} sequentially narrowed.
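The rewriting sequence for t_{3} can be simulated in Python. The sketch below is a partial, sequential rendition of the refined program, covering only the rules needed when the third argument heads to s(_); the encoding (keeping + as an unevaluated constructor, an object that raises if inspected standing in for 10!) and all names are illustrative assumptions, not the program's real implementation:

```python
Z = 'Z'
def S(t): return ('S', t)
def plus(m, n): return ('+', m, n)     # deferred addition, evaluated on demand

def head(t):
    """Head-evaluate: 0 + N = N ; s(M) + N = s(M + N)."""
    while isinstance(t, tuple) and t[0] == '+':
        m = head(t[1])
        t = t[2] if m == Z else S(plus(m[1], t[2]))
    return t

def solve(x, y, z, w, h):
    """Decide t3-like goals without ever demanding x (e.g. 10!)."""
    z, h, y = head(z), head(h), head(y)
    assert z != Z, "this sketch only covers the s(_) branch for argument 3"
    if h != Z:
        if y != Z:  # solve(X,s(Y),s(Z),W,s(H)) = solve(X,Y,s(Z),W,H)
            return solve(x, y[1], z, w, h[1])
        # solve(X,0,s(Z),W,s(H)) = solve(X,0,Z,W,H)
        return solve(x, Z, z[1], w, h[1])
    if y == Z:      # solve(X,0,s(Z),W,0) = False
        return False
    return False    # h = 0, y = s(_), z = s(_): the remaining rules give False

class Untouchable:                     # stands in for 10!: explode if inspected
    def __getitem__(self, i):
        raise RuntimeError("10! was demanded")

# t3 = solve(10!, 0+1, 0+1, W, 1) rewrites to False without touching 10!
t3 = solve(Untouchable(), plus(Z, S(Z)), plus(Z, S(Z)), 'W', S(Z))
print(t3)
```

The run follows the displayed sequence: head-evaluate arguments 3 and 2, apply the first rule once, and conclude False by solve(X,0,s(Z),W,0) = False, never inspecting the first argument.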
Modern (multiparadigm) programming languages apply computational strategies based on some notion of demandness of a position in a term by a rule (see [AL02] for a survey discussing this topic). Programs in these languages are commonly modeled by left-linear CSs, and computational strategies take advantage of this constructor condition (see [AFJV97,MR92]). Furthermore, the semantics of these programs normally promotes the computation of values (or constructor head-normal forms) rather than arbitrary head-normal forms. For instance, (weakly) outermost-needed rewriting and narrowing exploit the constructor condition, though some refinement is still possible, as shown by the following example.
Consider the following TRS from [AG01], defining the symbol %, which encodes the division function between natural numbers:

0 % s(N) = 0
s(M) % s(N) = s( (M - N) % s(N) )
M - 0 = M
s(M) - s(N) = M - N

Consider the term t=10! % 0, which is a (non-constructor) head-normal form. Outermost-needed rewriting forces the reduction of the first argument and evaluates 10!, which is useless. The reason is that outermost-needed rewriting uses a data structure called a definitional tree, which encodes the branching selection structure existing in the rules without testing whether the rules associated to each branch could ever match the term. A similar problem occurs when narrowing the term X % 0, since variable X is instantiated to 0 or s; however, neither instantiation is really necessary.
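The missing test can be sketched in a few lines. The following Python code (the representation and function names are illustrative assumptions, not the refinement's actual implementation) checks, before demanding any argument of 10! % 0, whether some rule for % could ever match at the root:

```python
def could_match(args, patterns):
    """Could a rule with the given argument patterns ever match a term whose
    arguments are either head-evaluated (a constructor tuple) or unknown
    (None, i.e. possibly reducible to anything)?"""
    for a, p in zip(args, patterns):
        if p == 'any':              # the rule has a variable at this argument
            continue
        if a is not None and a[0] != p[0]:
            return False            # constructor clash: rule can never match
    return True

# Both rules for '%' demand s(_) in the second argument:
# 0 % s(N) = 0   and   s(M) % s(N) = s((M - N) % s(N))
div_rules = [(('0',), ('s',)), (('s',), ('s',))]

# t = 10! % 0: first argument unknown (unevaluated 10!), second is the
# constructor 0, so every rule clashes on argument 2.
t_args = [None, ('0',)]
head_normal = not any(could_match(t_args, r) for r in div_rules)
print(head_normal)
```

Since the second argument 0 clashes with the s(_) pattern of every % rule, the term is recognized as head-normal without evaluating 10!; the same test shows that X % 0 needs no instantiation of X.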
In [Esc03], a refinement of the demandness notion associated to outermost-needed rewriting and narrowing is provided, and extensions of outermost-needed rewriting and outermost-needed narrowing, called natural rewriting and natural narrowing, are defined. These extensions avoid the two problems mentioned above. Intuitively, natural rewriting (and narrowing) always reduces (narrows) the most demanded position in a term. The definitions apply to left-linear CSs. Moreover, a class of TRSs larger than the inductively sequential CSs, called inductively sequential preserving, is introduced, on which natural rewriting and narrowing are optimal. Indeed, the previous example shows that even for the class of inductively sequential CSs, natural rewriting behaves better than outermost-needed rewriting.
[AG01] T. Arts and J. Giesl. A collection of examples for termination of term rewriting using dependency pairs. Technical report, AIB-2001-09, RWTH Aachen, Germany, 2001.
[AFJV97] M. Alpuente, M. Falaschi, P. Julián, and G. Vidal. Specialization of Lazy Functional Logic Programs. In Proc. of the ACM SIGPLAN Conf. on Partial Evaluation and Semantics-Based Program Manipulation, PEPM'97, volume 32(12) of Sigplan Notices, pages 151-162, New York, 1997. ACM Press.
[Antoy92] S. Antoy. Definitional trees. In Proc. of the 3rd Int'l Conference on Algebraic and Logic Programming, ALP'92, volume 632 of Lecture Notes in Computer Science, pages 143-157. Springer-Verlag, Berlin, 1992.
[AEH97] S. Antoy, R. Echahed, and M. Hanus. Parallel evaluation strategies for functional logic languages. In Proc. of the Fourteenth International Conference on Logic Programming (ICLP'97), pages 138--152. MIT Press, 1997.
[AL02] S. Antoy and S. Lucas. Demandness in rewriting and narrowing. In Proc. of 11th International Workshop on Functional and (Constraint) Logic Programming WFLP'02, volume 76 of Electronic Notes in Theoretical Computer Science. Elsevier Sciences, 2002.
[Esc03] S. Escobar. Refining Weakly Outermost-Needed Rewriting and Narrowing. In Proc. of the Fifth ACM-SIGPLAN International Conference on Principles and Practice of Declarative Programming (PPDP'03), pages 113-123. ACM Press, 2003.
[Han02] M. Hanus. Curry: An Integrated Functional Logic Language, 2002.
[HAHKNSS02] M. Hanus, S. Antoy, K. Höppner, J. Koj, P. Niederau, R. Sadre, and F. Steiner. PAKCS 1.4.5: The Portland Aachen Kiel Curry System User Manual. Technical report, University of Kiel, Germany, 2002.
[HLM98] M. Hanus, S. Lucas, and A. Middeldorp. Strongly sequential and inductively sequential term rewriting systems. Information Processing Letters, 67(1):1--8, 1998.
[HL91] G. Huet and J.-J. Lévy. Computations in Orthogonal Term Rewriting Systems, Parts I and II. In Computational Logic: Essays in Honour of J. Alan Robinson, pages 395-414 and 415-443. The MIT Press, Cambridge, MA, 1991.
[MR92] J. Moreno-Navarro and M. Rodríguez-Artalejo. Logic Programming with Functions and Predicates: the Language BABEL. Journal of Logic Programming, 12(3):191--223, 1992.
[SR93] R. Sekar and I. Ramakrishnan. Programming in equational logic: Beyond strong sequentiality. Information and Computation, 104(1):78-109, 1993.
Last update: Sep 2003. Contact: sescobar@dsic.upv.es