The analogy between computer viruses and biological viruses, from which computer viruses get their name [7], has been clear for the past several decades. During that time there has been progress both in understanding the vast diversity of biological viruses and in abstract approaches to understanding computer viruses. However, there has not been a great deal of effort to see whether the formal efforts of theoretical computer science can be of any use to our understanding of biological viruses.

In this work, we use biological viruses as a motivation to extend some well-known results in the theory of computer viruses. In Cohen's formalism [7], a virus is a string which, when read by a Turing machine, reproduces either itself or a variant of itself. In Adleman's [1] initial formalism, the theory of computer viruses was placed in the theory of recursive functions. One well-known result from both theories is that the general problem of viral detection is undecidable, implying that general computer immune systems based on viral detection can always be circumvented, and that there is no robust way to modify a detector successfully [5].

In biological systems, there is some notion that a biological virus is less powerful, computationally, than the host it infects. Motivated by that analogy, here we show two cases where viruses, due to diminished computational capacity relative to their hosts, will not always win. But, first, what is a virus? Following Adleman [1] and Bonfante, Kaczmarek and Marion [3], we say that a virus \(\mathbf v\) is a fixed point in the sense of Kleene's Second Recursion Theorem:

$$\begin{aligned}{}[\![ \mathbf{v} ]\!](x) = f(\underline{\mathbf{v}}, x) \end{aligned}$$

where \(f\) is called the propagation function and defines a virus's behavior with regard to its first argument. As justified in [3] or by Case and Moelius in [6], the model is strong enough to capture virus mutability or even virus “factories”. It has been shown that the different versions of the Recursion Theorem (weak, strong, extended, double; see Smullyan's [12] for precise terminology) correspond to different aspects of computer viruses.

Fixed points exist as long as the framework is universal [11], that is Turing-complete, with a universal function and a specializer. Hence the following defense strategy: to block viruses, one may simply prevent the existence of fixed points. As we will see, the Recursion Theorem holds as long as there are a specializer, projections and composition. It is hard to avoid the latter two criteria; thus, we focus on systems without specializers. We provide a solution based on cons-free programs, as developed by Jones [9]. Whatever the choice of enumeration of programs, there is no specializer for cons-free programs. It is worth noticing that cons-free programs characterize exactly the logspace predicates, making the language relatively powerful computationally speaking.

We develop another scenario, perhaps closer to the analogy with biology. Our idea is to strengthen the defense against viruses, not to prevent their existence. Indeed, biological viruses exist, but their hosts have some defense capabilities. In this scenario, we suppose that there is a (finite) set of known viruses. Then, each time a program enters the system, it is submitted to a program (an immune cell) which verifies whether it behaves like one of the viruses and removes it accordingly. Our scenario is close to the strategy of anti-virus software: such software is (supposed to be) able to recognize infected programs relative to a malware database which contains the (finite) set of known viruses. The existence of such a detector would infringe Rice's Theorem, since it corresponds to the decidability of program equivalence. Thus, again, to get such a language, we must lose Turing completeness.

To our knowledge, in computer virology, only “negative” results have been established so far. They state, more or less, that there are no defenses against viruses. On the theoretical side, we refer the reader to the aforementioned work [1, 7], which was followed by Zuo and Zhou [13] and then by Case and Moelius [5]. On the more practical side, there are also many interesting approaches. For instance, Borello and Mé [4] showed how metamorphism can trick anti-virus software. Other escaping techniques involve encryption, self-reproduction and feints; see [8] for a full survey. This contribution is a first attempt to provide “positive” solutions.

1 The While Language

An alphabet is a finite set \(\varSigma \) of letters. Given an alphabet \(\varSigma \), let \({\mathbb {T}}_\varSigma \) be the set of binary trees with leaves in \(\varSigma \), that is the smallest set containing \(\varSigma \) and \((\mathtt{t}_1 \cdot \mathtt{t}_2)\) whenever \(\mathtt{t}_1, \mathtt{t}_2 \in {\mathbb {T}}_\varSigma \). The two functions \(\pi _1\) and \(\pi _2\) are the projections: \(\pi _i (\mathtt{t}_1 \cdot \mathtt{t}_2) = \mathtt{t}_i\) and \(\pi _i (\mathtt{c}) = \mathtt c\) with \(i \in \{1, 2\}\) and \(\mathtt{c} \in \varSigma \). The size of a tree \(\mathtt{t}\) is denoted \(|\mathtt{t}|\) and is defined by \(|\mathtt{c}| = 1\), \(\mathtt{c}\in \varSigma \), and \(|(\mathtt{t}\cdot \mathtt{u})| = |\mathtt{t}| + |\mathtt{u}| + 1\).

A word \(a_1 \cdot a_2 \cdots a_k\) in \(\varSigma ^*\) is encoded in \({\mathbb {T}}_{\varSigma \cup \{ \mathtt{{nil}}\}}\) as \((a_1 \cdot (a_2 \cdot (\cdots (a_k \cdot \mathtt{{nil}})\cdots )))\) where \(\mathtt{{nil}}\) is an atom used as an end-marker. This relates computations over words to the ones over trees.
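The tree algebra above can be sketched in Python; this is a rendering of ours for illustration (atoms as strings, a pair \((\mathtt{t}_1 \cdot \mathtt{t}_2)\) as the 2-tuple `(t1, t2)`; the names `pi`, `size` and `encode_word` are not from the paper):

```python
def pi(i, t):
    """Projections: pi_i(t1 . t2) = t_i, and pi_i(c) = c on an atom c."""
    if isinstance(t, tuple):
        return t[i - 1]
    return t

def size(t):
    """|c| = 1 for an atom c; |(t . u)| = |t| + |u| + 1."""
    if isinstance(t, tuple):
        return size(t[0]) + size(t[1]) + 1
    return 1

def encode_word(word):
    """a1 a2 ... ak  ->  (a1 . (a2 . (... (ak . nil) ...)))."""
    tree = "nil"
    for a in reversed(word):
        tree = (a, tree)
    return tree
```

For instance, `encode_word("ab")` yields `("a", ("b", "nil"))`, a tree of size 5.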

Definition 1

Let \(\trianglelefteq \) be the sub-tree relation on \({\mathbb {T}}_\varSigma \), that is the smallest order (reflexive-transitive relation) such that for all \(\mathtt{t}, \mathtt{u}\in {\mathbb {T}}_\varSigma \):

  • \(\mathtt{t}\trianglelefteq \mathtt{t}\),

  • \(\mathtt{t}\trianglelefteq (\mathtt{t}\cdot \mathtt{u})\),

  • \(\mathtt{t}\trianglelefteq (\mathtt{u}\cdot \mathtt{t})\).

The embedding relation on \({\mathbb {T}}_\varSigma \) is defined to be the smallest order such that \(\mathtt{t}\trianglelefteq \mathtt{u}\Rightarrow \mathtt{t}\mathrel {{\underline{\blacktriangleleft }}}\mathtt{u}\), and closed by context: \(\mathtt{t}\mathrel {{\underline{\blacktriangleleft }}}\mathtt{t}' \wedge \mathtt{u}\mathrel {{\underline{\blacktriangleleft }}}\mathtt{u}' \Rightarrow (\mathtt{t}\cdot \mathtt{u}) \mathrel {{\underline{\blacktriangleleft }}}(\mathtt{t}'\cdot \mathtt{u}')\).

Observe that \(\mathtt{{nil}}\trianglelefteq (\mathtt{{nil}}\cdot \mathtt{{nil}})\). The difference between the sub-tree relation and the embedding one is exemplified by \((\mathtt{{nil}}\cdot (\mathtt{{nil}}\cdot \mathtt{{nil}})) \mathrel {{\underline{\blacktriangleleft }}}((\mathtt{{nil}}\cdot \mathtt{{nil}})\cdot (\mathtt{{nil}}\cdot \mathtt{{nil}}))\) but \((\mathtt{{nil}}\cdot (\mathtt{{nil}}\cdot \mathtt{{nil}})) \not \trianglelefteq ((\mathtt{{nil}}\cdot \mathtt{{nil}})\cdot (\mathtt{{nil}}\cdot \mathtt{{nil}}))\).

Let \(\triangleleft \) and \(\mathrel {\blacktriangleleft }\) denote respectively the strict order relative to \(\trianglelefteq \) and \(\mathrel {{\underline{\blacktriangleleft }}}\). From the definition, first, it is clear that if \(\mathtt{t}\mathrel {\blacktriangleleft }\mathtt{u}\), then \(\mathtt{u}\not \trianglelefteq \mathtt{t}\). And, second, if \(|\mathtt{t}| > |\mathtt{u}|\), then \(\mathtt{t}\not \trianglelefteq \mathtt{u}\).
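The two relations can be rendered in Python (our tuple representation of trees; `subtree` is \(\trianglelefteq \) and `embeds` is \(\mathrel {{\underline{\blacktriangleleft }}}\), the latter written in the usual recursive style of homeomorphic embedding):

```python
def subtree(t, u):
    """t is a sub-tree of u: t == u, or t occurs inside a component of u."""
    return t == u or (isinstance(u, tuple) and (subtree(t, u[0]) or subtree(t, u[1])))

def embeds(t, u):
    """Embedding: contains the sub-tree relation and is closed by context."""
    if t == u:
        return True
    if isinstance(u, tuple):
        if embeds(t, u[0]) or embeds(t, u[1]):   # t embeds into a sub-tree of u
            return True
        if isinstance(t, tuple):                 # closure by context
            return embeds(t[0], u[0]) and embeds(t[1], u[1])
    return False
```

On the example above, `embeds` accepts the pair \((\mathtt{{nil}}\cdot (\mathtt{{nil}}\cdot \mathtt{{nil}}))\), \(((\mathtt{{nil}}\cdot \mathtt{{nil}})\cdot (\mathtt{{nil}}\cdot \mathtt{{nil}}))\) while `subtree` rejects it.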

We present (a slight variant of) While, a generic imperative language introduced by Jones [9]. We suppose that the given alphabet \(\varSigma \) contains an atom \(\mathtt{{nil}}\). Moreover, we suppose given a denumerable set of variables \({\mathtt { Var}} \ni \mathtt{X}_0, \mathtt{X}_1, \ldots \). In the following, \(\mathtt{X}, \mathtt{Y}\) serve as generic variables. The syntax of While is given by the following grammar:

$$ \begin{array}{lcl} \mathtt{Expressions}\ni \mathtt{E, F} &{} :{:}= &{} \mathtt{X}\mid \mathtt{t} \mid \mathtt{\mathtt{cons }\ E\ F} \mid \mathtt{\mathtt{hd }\ E} \mid \mathtt{\mathtt{tl }\ E} \mid \mathtt{=?\ E\ F}\\ \mathtt{Commands} \ni \mathtt{C, D} &{} :{:}= &{} \mathtt{X}:= \mathtt{E} \mid \mathtt{C\ ;\ D} \mid \mathtt{while }\ \mathtt{E}\ \mathtt{do }\ \mathtt{C}\\ \mathtt{Programs} \ni P &{} :{:}= &{} \mathtt{read }\ \mathtt{X}_1, \ldots ,\mathtt{X}_n ;\ \mathtt{C};\ \mathtt{write }\ \mathtt{Y}\end{array} $$

where \(\mathtt{t} \in {\mathbb {T}}_\varSigma \).

1.1 Semantics of While

A configuration, next called a store, is a function \(\sigma : {\mathtt { Var}} \rightarrow {\mathbb {T}}_\varSigma \). The set of stores is denoted \({\mathfrak {S}}_\varSigma \), or, shorter, \({\mathfrak {S}}\) when \(\varSigma \) is clear from the context. Given a configuration \(\sigma \in {\mathfrak {S}}\), a variable \(\mathtt{X}\) and \(\mathtt{t}\in {\mathbb {T}}_\varSigma \), \(\sigma [\mathtt{X}\mapsto \mathtt{t}]\) is the store equal to \(\sigma \) on all variables but \(\mathtt{X}\), for which it is set equal to \(\mathtt{t}\).

The semantics of an expression \(\mathtt{E}\) applied on a configuration \(\sigma \) is denoted \([\![ \mathtt{E} ]\!]\sigma \) and defined by the equations:

$$\begin{aligned}{}[\![ \mathtt{X} ]\!]\sigma= & {} \sigma (\mathtt{X}) \qquad \qquad \qquad \quad [\![ \mathtt{t} ]\!]\sigma = \mathtt{t}\\ [\![ \mathtt{cons }\ \mathtt{E}\ \mathtt{F} ]\!]\sigma= & {} ([\![ \mathtt{E} ]\!]\sigma \cdot [\![ \mathtt{F} ]\!]\sigma )\\ [\![ \mathtt{hd }\ \mathtt{E} ]\!]\sigma= & {} \pi _1([\![ \mathtt{E} ]\!]\sigma ) \qquad \quad \,\, [\![ \mathtt{tl }\ \mathtt{E} ]\!]\sigma = \pi _2([\![ \mathtt{E} ]\!]\sigma )\\ [\![ =?\ \mathtt{E}\ \mathtt{F} ]\!]\sigma= & {} [\![ \mathtt{E} ]\!]\sigma \simeq [\![ \mathtt{F} ]\!]\sigma \end{aligned}$$

where for equality \(\simeq \), \(\mathtt{{nil}}\) serves as false and \((\mathtt{{nil}}\cdot \mathtt{{nil}})\) as true. Each command \(\mathtt{C}\in \mathtt{Commands}\) updates the store; that is, \([\![ \mathtt{C} ]\!] : {\mathfrak {S}}\rightarrow {\mathfrak {S}}\), which is defined recursively as follows:

$$\begin{aligned}{}[\![ \mathtt{X}\ :=\ \mathtt{E} ]\!]\sigma= & {} \sigma [\mathtt{X}\mapsto [\![ \mathtt{E} ]\!]\sigma ]\\ [\![ \mathtt{C}; \mathtt{D} ]\!]\sigma= & {} [\![ \mathtt{D} ]\!]([\![ \mathtt{C} ]\!]\sigma )\\ [\![ \mathtt{while }\ {\mathtt{E}}\ \mathtt{do }\ \mathtt{C} ]\!]\sigma= & {} \sigma \qquad \qquad \qquad \quad \,\,\text {if }[\![ \mathtt{E} ]\!]\sigma = \mathtt{{nil}}\\ [\![ \mathtt{while }\ {\mathtt{E}}\ \mathtt{do }\ \mathtt{C} ]\!]\sigma= & {} [\![ \mathtt{C}\ ; \mathtt{while }\ {\mathtt{E}}\ \mathtt{do }\ \mathtt{C} ]\!]\sigma \;\;\; \text {otherwise} \end{aligned}$$

The program \({\mathbf {p}}\triangleq \mathtt{read }\ \mathtt{X}_1, \ldots , \mathtt{X}_n ;\ \mathtt{C}\ ; \mathtt{write }\ \mathtt{Y}\) computes the following function. Given \(\mathtt{t}_1, \ldots , \mathtt{t}_n\), in the initial configuration \(\sigma _0(\mathtt{t}_1, \ldots , \mathtt{t}_n)\), all variables are set to \(\mathtt{{nil}}\), except \(\mathtt{X}_1, \ldots , \mathtt{X}_n\), which are respectively set to \(\mathtt{t}_1, \ldots , \mathtt{t}_n\). Then \([\![ {\mathbf {p}} ]\!](\mathtt{t}_1, \ldots , \mathtt{t}_n)\) is defined to be \(([\![ \mathtt{C} ]\!]\sigma _0(\mathtt{t}_1, \ldots , \mathtt{t}_n))(\mathtt{Y})\).
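The semantics above can be sketched as a small Python interpreter (a rendering of ours: expressions and commands as tagged tuples, stores as dicts, atoms as strings and pairs as 2-tuples; none of these names come from the paper):

```python
def eval_expr(e, store):
    tag = e[0]
    if tag == "var":                      # [[X]]sigma = sigma(X); unset variables hold nil
        return store.get(e[1], "nil")
    if tag == "quote":                    # a constant tree t
        return e[1]
    if tag == "cons":
        return (eval_expr(e[1], store), eval_expr(e[2], store))
    if tag in ("hd", "tl"):               # pi_1 / pi_2; identity on atoms
        v = eval_expr(e[1], store)
        if not isinstance(v, tuple):
            return v
        return v[0] if tag == "hd" else v[1]
    if tag == "eq":                       # =? : (nil.nil) as true, nil as false
        return ("nil", "nil") if eval_expr(e[1], store) == eval_expr(e[2], store) else "nil"
    raise ValueError(tag)

def run_cmd(c, store):
    tag = c[0]
    if tag == "assign":
        store[c[1]] = eval_expr(c[2], store)
    elif tag == "seq":
        run_cmd(c[1], store)
        run_cmd(c[2], store)
    elif tag == "while":                  # loop while the guard is not nil
        while eval_expr(c[1], store) != "nil":
            run_cmd(c[2], store)
    return store

def run_program(prog, *inputs):
    """prog = (input variable names, command, output variable name)."""
    names, body, out = prog
    store = dict(zip(names, inputs))      # all other variables default to nil
    return run_cmd(body, store).get(out, "nil")
```

For instance, the list-reversal program \(\mathtt{read }\ \mathtt{X}_1;\ \mathtt{while }\ \mathtt{X}_1\ \mathtt{do }\ (\mathtt{Y}:= \mathtt{cons }\ (\mathtt{hd }\ \mathtt{X}_1)\ \mathtt{Y};\ \mathtt{X}_1 := \mathtt{tl }\ \mathtt{X}_1);\ \mathtt{write }\ \mathtt{Y}\) maps the encoding of \(ab\) to that of \(ba\).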

1.2 While as an Acceptable Language

Let \(\{ {\mathbf {assign}} , {\mathbf { seq}}, {\mathbf {while} }, {\mathtt { Var}}, {\mathbf { quote}}, {\mathbf {cons} }, {\mathbf {hd} }, {\mathbf {tl} }, {\mathbf { iseq}}, \mathtt{{nil}}\}\) denote 10 distinct elements of \({\mathbb {T}}_\varSigma \). The representation \(\underline{{\mathbf {p}}}\) of a program in While is defined recursively:

$$\begin{array}{rclp{1cm}rcl} \underline{0} &{} = &{} \mathtt{{nil}}&{}&{}\underline{\mathtt{X}_i} &{} = &{} ({\mathtt { Var}} \cdot \underline{i})\\ \underline{n+1} &{}=&{} (\mathtt{{nil}}\cdot \underline{n}) &{}&{}\underline{\mathtt{t}}&{} = &{} ({\mathbf { quote}}\cdot \mathtt{t}) \\ \underline{()} &{}= &{}\mathtt{{nil}}&{}&{} \underline{\mathtt{hd }\ \mathtt{E}} &{} = &{} ({\mathbf {hd} }\cdot \underline{\mathtt{E}})\\ \underline{(x_1,\ldots )} &{}=&{}(\underline{x_1} \cdot \underline{(\ldots )})&{}&{} \underline{\mathtt{tl }\ \mathtt{E}} &{} = &{} ({\mathbf {tl} }\cdot \underline{\mathtt{E}})\\ \underline{\mathtt{X}_i := \mathtt{E}} &{} = &{} ({\mathbf {assign}} \cdot (\underline{\mathtt{X}_i} \cdot \underline{\mathtt{E}}))&{}&{} \underline{\mathtt{cons }\ \mathtt{E}\ \mathtt{F}} &{} = &{} ({\mathbf {cons} }\cdot (\underline{\mathtt{E}} \cdot \underline{\mathtt{F}}))\\ \underline{\mathtt{C}; \mathtt{D}} &{} = &{} ({\mathbf { seq}} \cdot (\underline{\mathtt{C}} \cdot \underline{\mathtt{D}}))&{}&{} \underline{=?\ \mathtt{E}\ \mathtt{F}} &{} = &{}({\mathbf { iseq}}\cdot (\underline{\mathtt{E}} \cdot \underline{\mathtt{F}}))\\ \underline{\mathtt{while }\ \mathtt{E}\ \mathtt{do }\ \mathtt{C}} &{} = &{} ({\mathbf {while} }\cdot (\underline{\mathtt{E}} \cdot \underline{\mathtt{C}}))\\ \end{array}$$

and for a program, we define \(\underline{\mathtt{read }\ \mathtt{X}_1, \ldots , \mathtt{X}_n ;\ \mathtt{C};\ \mathtt{write }\ \mathtt{Y}} = (\underline{(\mathtt{X}_1, \ldots , \mathtt{X}_n)}\cdot (\underline{\mathtt{C}}\cdot \underline{\mathtt{Y}}))\).

More generally speaking, the representation of a programming language is an injective function from the set of programs (here While) to its corresponding data set (here \({\mathbb {T}}_\varSigma \)).
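The representation \(\underline{-}\) can be sketched in Python over the same tuple rendering of trees (our choices: the ten distinguished atoms as strings, variables given by integer indices; the function names are not from the paper):

```python
def enc_int(n):
    """0 -> nil, n+1 -> (nil . n)."""
    return "nil" if n == 0 else ("nil", enc_int(n - 1))

def enc_var(i):
    """X_i -> (Var . i)."""
    return ("Var", enc_int(i))

def enc_expr(e):
    tag = e[0]
    if tag == "var":   return enc_var(e[1])
    if tag == "quote": return ("quote", e[1])
    if tag == "hd":    return ("hd", enc_expr(e[1]))
    if tag == "tl":    return ("tl", enc_expr(e[1]))
    if tag == "cons":  return ("cons", (enc_expr(e[1]), enc_expr(e[2])))
    if tag == "eq":    return ("iseq", (enc_expr(e[1]), enc_expr(e[2])))
    raise ValueError(tag)

def enc_cmd(c):
    tag = c[0]
    if tag == "assign": return ("assign", (enc_var(c[1]), enc_expr(c[2])))
    if tag == "seq":    return ("seq", (enc_cmd(c[1]), enc_cmd(c[2])))
    if tag == "while":  return ("while", (enc_expr(c[1]), enc_cmd(c[2])))
    raise ValueError(tag)

def enc_prog(inputs, cmd, out):
    """read X_1,...,X_n; C; write Y  ->  ((X_1,...,X_n) . (C . Y))."""
    vars_tree = "nil"
    for i in reversed(inputs):
        vars_tree = (enc_var(i), vars_tree)
    return (vars_tree, (enc_cmd(cmd), enc_var(out)))
```

The function is injective by construction: distinct syntax trees yield distinct trees.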

As shown by Jones [9], there is a universal program \(\mathbf{u} \in \mathtt{While}\), that is a program \(\mathbf u\) such that for any program \({\mathbf {p}}\in \mathtt{While}\) and any data \(\mathtt{t}\in {\mathbb {T}}_\varSigma \): \([\![ \mathbf u ]\!](\underline{{\mathbf {p}}},\mathtt{t}) = [\![ {\mathbf {p}} ]\!](\mathtt{t})\). For all \(m, n \in {\mathbb {N}}\), there is a specializer \(\mathbf{s\_m\_n}\), that is a program \(\mathbf{s\_m\_n}\) such that for every \(m+n\)-ary program \({\mathbf {p}}\) and all \(\mathtt{t}_1, \ldots , \mathtt{t}_{m+n} \in {\mathbb {T}}_\varSigma \), \(\lceil \!\lceil [\![ \mathbf{s\_m\_n} ]\!](\underline{{\mathbf {p}}},\mathtt{t}_1, \ldots , \mathtt{t}_m) \rceil \!\rceil (\mathtt{t}_{m+1}, \ldots , \mathtt{t}_{m+n}) = [\![ {\mathbf {p}} ]\!](\mathtt{t}_1, \ldots , \mathtt{t}_{m+n})\). Finally, While is Turing-complete. Such a language is said to be acceptable in Jones/Rogers' terms. As such, it is isomorphic to any other acceptable language, as shown by Rogers:

Theorem 1

(Rogers [11]). Two acceptable languages are isomorphic.

That is, there is a bijective computable function transforming programs of the first language into programs of the second one with equivalent semantics.

For any acceptable language, Kleene’s second recursion theorem is known to hold. We recall:

Theorem 2

(Kleene’s Second Recursion Theorem). For any \(k+1\)-ary program \({\mathbf {p}}\), there is a \(k\)-ary program \({{\mathbf {e}}}\) satisfying for all inputs \(\mathtt{t}_1, \ldots ,\mathtt{t}_k \in {\mathbb {T}}_\varSigma \): \( [\![ {{\mathbf {e}}} ]\!](\mathtt{t}_1,\ldots ,\mathtt{t}_k) = [\![ {\mathbf {p}} ]\!](\underline{{{\mathbf {e}}}},\mathtt{t}_1,\ldots ,\mathtt{t}_k).\)

Proof

For later use, we give a proof for \(k=1\); the proof for \(k>1\) follows the same schema. For the specializer \(\mathbf{s\_1\_1}\triangleq \mathtt{read }\ \mathtt{X}_0,\mathtt{X}_1 ; \mathtt{C}_{\mathbf{s\_1\_1}} ; \mathtt{write }\ \mathtt{Y}\), for every binary program \({\mathbf {p}}\in {\mathcal {P}}\) and all \(\mathtt{t}, \mathtt{t}' \in {\mathbb {T}}_\varSigma \), \(\lceil \!\lceil [\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}},\mathtt{t}) \rceil \!\rceil (\mathtt{t}') = [\![ {\mathbf {p}} ]\!](\mathtt{t}, \mathtt{t}')\). Let \({\mathbf {p}}= \mathtt{read }\ \mathtt{X}_0',\mathtt{X}'_1 ; \mathtt{C}_{{\mathbf {p}}} ; \mathtt{write }\ \mathtt{Y}'\). By renaming variables, we suppose without loss of generality that it does not share variables with \(\mathbf{s\_1\_1}\). Then, let \({{{\mathbf {r}}}}_{{\mathbf {p}}}\) be the program:

$$\begin{array}{l} \mathtt{read }\ \mathtt{X}''_0,\mathtt{X}''_1 ; \\ \mathtt{X}_0 := \mathtt{X}''_0 ; \ \mathtt{X}_1 := \mathtt{X}''_0 ;\\ \mathtt{C}_{\mathbf{s\_1\_1}} ; \\ \mathtt{X}'_0 := \mathtt{Y};\ \mathtt{X}'_1 := \mathtt{X}''_1 ;\\ \mathtt{C}_{{\mathbf {p}}} ;\\ \mathtt{write }\ \mathtt{Y}' \end{array}$$

with \(\mathtt{X}''_0, \mathtt{X}''_1\) some fresh variables. Then, it is clear that for all \({{\mathbf {q}}}\in {\mathcal {P}}\) and all \(\mathtt{t}\in {\mathbb {T}}_\varSigma \), \([\![ {{{{\mathbf {r}}}}}_{{\mathbf {p}}} ]\!](\underline{{{\mathbf {q}}}},\mathtt{t}) = [\![ {\mathbf {p}} ]\!] ( [\![ \mathbf{s\_1\_1} ]\!](\underline{{{\mathbf {q}}}},\underline{{{\mathbf {q}}}}),\mathtt{t})\). Letting \({{\mathbf {e}}}\triangleq \overline{[\![ \mathbf{s\_1\_1} ]\!](\underline{{{{\mathbf {r}}}}_{{\mathbf {p}}}},\underline{{{{\mathbf {r}}}}_{{\mathbf {p}}}})}\), we get:

$$\begin{array}{rcll} [\![ {{\mathbf {e}}} ]\!](\mathtt{t}) &{}=&{}\lceil \!\lceil [\![ \mathbf{s\_1\_1} ]\!](\underline{{{{{\mathbf {r}}}}}_{{\mathbf {p}}}},\underline{{{{{\mathbf {r}}}}}_{{\mathbf {p}}}}) \rceil \!\rceil (\mathtt{t}) &{}\text {by def. of }\lceil \!\lceil . \rceil \!\rceil \\ &{}=&{} [\![ {{{\mathbf {r}}}}_{{\mathbf {p}}} ]\!](\underline{{{{\mathbf {r}}}}_{{\mathbf {p}}}},\mathtt{t}) &{}\text {by def. of }\mathbf{s\_1\_1}\\ &{}=&{}[\![ {\mathbf {p}} ]\!]([\![ \mathbf{s\_1\_1} ]\!](\underline{{{{\mathbf {r}}}}_{{\mathbf {p}}}},\underline{{{{\mathbf {r}}}}_{{\mathbf {p}}}}),\mathtt{t})&{} \text {by remark above}\\ &{}=&{} [\![ {\mathbf {p}} ]\!](\underline{{{\mathbf {e}}}},\mathtt{t}) &{} \text {since } \underline{\overline{[\![ \mathbf{s\_1\_1} ]\!](\underline{{{{\mathbf {r}}}}_{{\mathbf {p}}}},\underline{{{{\mathbf {r}}}}_{{\mathbf {p}}}})}} = [\![ \mathbf{s\_1\_1} ]\!](\underline{{{{\mathbf {r}}}}_{{\mathbf {p}}}},\underline{{{{\mathbf {r}}}}_{{\mathbf {p}}}}) \end{array}$$
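The construction can be mimicked semantically in Python, with closures standing in for program texts. This is an analogue of ours for intuition only, not the textual construction of the proof; the names `kleene_fix` and `propagate` are illustrative:

```python
def kleene_fix(f):
    """Return a fixed point e with e(x) == f(e, x).
    Here e receives (a handle to) itself where the theorem passes its code."""
    def e(x):
        return f(e, x)
    return e

# A toy "propagation function": on input "copy" the virus hands back itself;
# on any other input it behaves as the identity.
def propagate(v, x):
    return v if x == "copy" else x

virus = kleene_fix(propagate)
```

Asking the fixed point to copy itself returns the very same closure, mirroring \([\![ \mathbf{e} ]\!](\mathtt{t}) = [\![ {\mathbf {p}} ]\!](\underline{\mathbf{e}},\mathtt{t})\).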

As justified by Bonfante, Kaczmarek and Marion in [3], a virus can be formalized as follows:

Definition 2

(Computer Virus). Given a computable function \(B\) called the propagation function, a virus is a program \(\mathbf v\) such that \([\![ \mathbf{v} ]\!](\mathtt{t}) = B(\underline{\mathbf{v}}, \mathtt{t})\) for all \(\mathtt{t}\in {\mathbb {T}}_\varSigma \).

In other words, it is a fixed point of a propagation function. Thus, as shown in [3], Kleene's second recursion theorem implies that for any propagation function there is a corresponding virus. In other words, the theorem provides a virus compiler, and there is no general way to avoid viruses. In the remainder of the paper, we restrict While to get around computer viruses. We propose two strategies to that end. First, we delineate a programming language in which the Recursion Theorem does not hold. As shown by the proof of the Recursion Theorem, the existence of a specializer, composition and projections is sufficient to prove the Theorem. Thus, we look for a programming language without a specializer.

The second strategy consists in finding a language in which fixed points exist, but viruses can be detected. By detection, we mean deciding program equivalence, as justified by Adleman in [1].

2 On Cons-Free Programs

\(\mathtt{While}_{\setminus \mathtt{\{cons\}}}\) is the language \(\mathtt{While}\) restricted to expressions of the shape:

$$\mathtt{Expressions}\ni \mathtt{E, F} :{:}= \mathtt{X}\mid \mathtt{t} \mid \mathtt{\mathtt{hd }\ E} \mid \mathtt{\mathtt{tl }\ E} \mid {=?\ \mathtt{E}\ \mathtt{F}}$$

Such programs were initially considered by Jones from a complexity perspective. He proved that they compute exactly the logspace predicates. We show that the Second Recursion Theorem does not hold in \(\mathtt{While}_{\setminus \mathtt{\{cons\}}}\).

In this section, when \(\mathtt{t}\in {\mathbb {T}}_{\varSigma }\) and \(S \subseteq {\mathbb {T}}_\varSigma \), the notation \(\mathtt{t}\trianglelefteq S\) means \(\exists \mathtt{t}' \in S : \mathtt{t}\trianglelefteq \mathtt{t}'\). When \(S, S'\) are two sets, the notation \(S \trianglelefteq S'\) stands for \(\forall \mathtt{t}\in S, \exists \mathtt{t}' \in S' : \mathtt{t}\trianglelefteq \mathtt{t}'\). For a store \(\sigma \), let \({\text {Rg}}(\sigma ) = \{ \sigma (\mathtt{X}) \mid \mathtt{X}\in {\mathtt { Var}} \}\).

Definition 3

Let \(\mathtt{E}\) be an expression, we denote by \(c(\mathtt{E})\) the set of all constants occurring in \(\mathtt{E}\); formally, by induction: \(c(\mathtt{X}) = \emptyset \), \(c(\mathtt{t}) = \{ \mathtt{t}\}\), \(c(\mathtt{hd }\ \mathtt{E}) = c(\mathtt{tl }\ \mathtt{E}) = c(\mathtt{E})\) and \(c(\mathtt{cons }\ \mathtt{E}\ \mathtt{F}) = c(\mathtt{E}{=?}\ \mathtt{F}) = c(\mathtt{E}) \cup c(\mathtt{F})\).

The definition is extended to commands: \(c(\mathtt{X}:= \mathtt{E})=c(\mathtt{E})\), \(c(\mathtt{C\ ;\ D}) = c(\mathtt{C}) \cup c(\mathtt{D})\), \(c(\mathtt{while }\ \mathtt{E}\ \mathtt{do }\ \mathtt{C}) = c(\mathtt{E}) \cup c(\mathtt{C})\) and finally to programs by the equation \(c(\mathtt{read }\ \mathtt{X}_1, \ldots ,\mathtt{X}_n; \mathtt{C};\ \mathtt{write }\mathtt{Y}) = c(\mathtt{C})\).
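The function \(c(-)\) is direct to render in Python over a tagged-tuple syntax (our rendering, as elsewhere: atoms are strings, pairs are 2-tuples, expressions are tuples like `("cons", e, f)`):

```python
def consts_expr(e):
    tag = e[0]
    if tag == "var":
        return set()                      # c(X) = {}
    if tag == "quote":
        return {e[1]}                     # c(t) = {t}
    if tag in ("hd", "tl"):
        return consts_expr(e[1])
    if tag in ("cons", "eq"):
        return consts_expr(e[1]) | consts_expr(e[2])
    raise ValueError(tag)

def consts_cmd(c):
    tag = c[0]
    if tag == "assign":                   # c(X := E) = c(E)
        return consts_expr(c[2])
    if tag == "seq":
        return consts_cmd(c[1]) | consts_cmd(c[2])
    if tag == "while":
        return consts_expr(c[1]) | consts_cmd(c[2])
    raise ValueError(tag)
```

Tuples being hashable, whole constant trees can sit directly in the resulting set.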

Proposition 1

Given a program \({\mathbf {p}}\in \mathtt{While}_{\setminus \mathtt{\{cons\}}}\) of arity \(n\) and \(\mathtt{t}_1, \ldots , \mathtt{t}_n\) some elements of \({\mathbb {T}}_\varSigma \), \([\![ {\mathbf {p}} ]\!]({\mathtt{t}_1, \ldots , \mathtt{t}_n})\trianglelefteq c({\mathbf {p}})\cup \{\mathtt{t}_1, \ldots , \mathtt{t}_n\}\cup \{ (\mathtt{{nil}}\cdot \mathtt{{nil}})\}\) whenever \([\![ {\mathbf {p}} ]\!](\mathtt{t}_1, \ldots , \mathtt{t}_n)\) is defined.

Proof

A very similar result occurs in Jones [9]. The proof is by induction on the structure of programs.
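The proposition can be checked empirically: without \(\mathtt{cons}\), an evaluated expression can only be carved out of the inputs, the constants, or \((\mathtt{{nil}}\cdot \mathtt{{nil}})\). A self-contained sketch (our tuple rendering; `ev` is a cons-free expression evaluator, `subtree` is \(\trianglelefteq \)):

```python
def ev(e, store):
    tag = e[0]
    if tag == "var":
        return store[e[1]]
    if tag == "quote":
        return e[1]
    if tag in ("hd", "tl"):
        v = ev(e[1], store)
        if not isinstance(v, tuple):
            return v                      # projections are the identity on atoms
        return v[0] if tag == "hd" else v[1]
    if tag == "eq":
        return ("nil", "nil") if ev(e[1], store) == ev(e[2], store) else "nil"
    raise ValueError(tag)

def subtree(t, u):
    return t == u or (isinstance(u, tuple) and (subtree(t, u[0]) or subtree(t, u[1])))

# hd (tl X1): the result is always a sub-tree of the input.
t1 = (("a", "b"), ("c", "nil"))
out = ev(("hd", ("tl", ("var", "X1"))), {"X1": t1})
```

Here `out` is the atom `"c"`, a sub-tree of the input, as Proposition 1 predicts.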

Proposition 2

Given a program \({\mathbf {p}}\in \mathtt{While}_{\setminus \mathtt{\{cons\}}}\) and \(\mathtt{t}\in {\mathbb {T}}_\varSigma \), if \({\mathbf {p}}\) computes the constant function equal to \(\mathtt{t}\), then either:

$$\begin{aligned} \mathtt{t}\trianglelefteq (\mathtt{{nil}}\cdot \mathtt{{nil}})\ \ \ or\ \ \mathtt{t}\trianglelefteq c({\mathbf {p}}) \end{aligned}$$

Proof

Applying Proposition 1 to the program \({\mathbf {p}}\) on input \(\mathtt{{nil}}\): \([\![ {\mathbf {p}} ]\!]({\mathtt{{nil}}}) \trianglelefteq c({\mathbf {p}}) \cup \{ \mathtt{{nil}}, (\mathtt{{nil}}\cdot \mathtt{{nil}})\}\). But, again, \(\mathtt{{nil}}\trianglelefteq (\mathtt{{nil}}\cdot \mathtt{{nil}})\), thus \([\![ {\mathbf {p}} ]\!]({\mathtt{{nil}}}) \trianglelefteq c({\mathbf {p}}) \cup \{ (\mathtt{{nil}}\cdot \mathtt{{nil}})\}\). Since \([\![ {\mathbf {p}} ]\!](\mathtt{{nil}}) = \mathtt{t}\), either \(\mathtt{t}\trianglelefteq (\mathtt{{nil}}\cdot \mathtt{{nil}})\) or \(\mathtt{t}\trianglelefteq c({\mathbf {p}})\).

2.1 \(\mathtt{While}_{\setminus \mathtt{\{cons\}}}\) Does Not Contain a Specializer

Theorem 3

Whatever the choice of a representation, and in particular for the one given in the preceding section, there is no specializer in \(\mathtt{While}_{\setminus \mathtt{\{cons\}}}\).

Proof

We assume the existence of a specializer \(\mathbf{s\_1\_1}\). Let us define two programs \({\mathbf {p}}\) and \(\mathbf q\): \({\mathbf {p}}\triangleq \mathtt{read }\ \mathtt{X}_1,\mathtt{X}_2;\ \mathtt{Y}:= \mathtt{X}_1;\ \mathtt{write }\ \mathtt{Y}\) and \(\mathbf{q}\triangleq \mathtt{read }\ \mathtt{X}_1,\mathtt{X}_2;\ \mathtt{if }\ (\mathtt{X}_2 =?\ \mathtt{{nil}})\ \mathtt{Y}:= \mathtt{X}_1\ \text {else }\mathtt{Y}:= \mathtt{X}_2;\ \mathtt{write }\ \mathtt{Y}\) (the conditional is standard syntactic sugar over \(\mathtt{while}\)).

Consider some \(\mathtt{t}\in {\mathbb {T}}_\varSigma \); applying Proposition 1 to the program \(\mathbf{s\_1\_1}\) and the inputs \(\underline{{\mathbf {p}}}, \mathtt{t}\), we obtain:

$$\begin{aligned} \left\{ \begin{matrix} [\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}}, \mathtt{t}) \trianglelefteq C&{} or \\ [\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}}, \mathtt{t}) \trianglelefteq \mathtt{t}&{} \ \end{matrix}\right. \end{aligned}$$

with \(C=((\mathtt{{nil}}\cdot \mathtt{{nil}})\cdot \underline{{\mathbf {p}}} \cdot c_1 \cdots c_m)\) and \(c(\mathbf{s\_1\_1})=\{c_1,\cdots ,c_m\}\).

Given \(\mathtt{t}\ne \mathtt{t}'\), \(\lceil \!\lceil [\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}},\mathtt{t}) \rceil \!\rceil (\mathtt{{nil}}) = \mathtt{t}\ne \mathtt{t}' = \lceil \!\lceil [\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}},\mathtt{t}') \rceil \!\rceil (\mathtt{{nil}})\). Thus, \(\mathtt{t}\mapsto [\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}},\mathtt{t})\) is an injective function. It follows that the set \(S_C = \{\mathtt{t}\in {\mathbb {T}}_\varSigma \mid | [\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}},\mathtt{t}) | \le |C|\}\) is finite. We set \(N_1 = \max \{|\mathtt{t}| \mid \mathtt{t}\in S_C\}\). Then, for all \(|\mathtt{t}| > N_1\), we have \(| [\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}},\mathtt{t}) |>|C|\), hence \([\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}}, \mathtt{t}) \not \trianglelefteq C\), which in turn means that \([\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}}, \mathtt{t}) \trianglelefteq \mathtt{t}\).

Since for any \(\mathtt{t}\ne \mathtt{t}'\), \(\lceil \!\lceil [\![ \mathbf{s\_1\_1} ]\!](\underline{\mathbf{q}},\mathtt{t}) \rceil \!\rceil (\mathtt{{nil}}) = \mathtt{t}\ne \mathtt{t}' = \lceil \!\lceil [\![ \mathbf{s\_1\_1} ]\!](\underline{\mathbf{q}},\mathtt{t}') \rceil \!\rceil (\mathtt{{nil}})\), the function \(\mathtt{t}\mapsto [\![ \mathbf{s\_1\_1} ]\!](\underline{\mathbf{q}},\mathtt{t})\) is injective. Thus, by the same approach, there exists an integer \(N_2\) such that \(|\mathtt{t}| > N_2\) implies \([\![ \mathbf{s\_1\_1} ]\!](\underline{\mathbf{q}}, \mathtt{t}) \trianglelefteq \mathtt{t}\). We set \(N=\max (N_1,N_2)\) and we get:

$$\begin{aligned} |\mathtt{t}| > N \Rightarrow ([\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}}, \mathtt{t}) \trianglelefteq \mathtt{t}\ \ and \ \ [\![ \mathbf{s\_1\_1} ]\!](\underline{\mathbf{q}}, \mathtt{t}) \trianglelefteq \mathtt{t}) \end{aligned}$$

Given \(n \in {\mathbb {N}}\), we define \(n_{\downarrow \trianglelefteq } = \{ \mathtt{t}\in {\mathbb {T}}_\varSigma \mid \mathtt{t}\trianglelefteq \underline{n}\}\) with the encoding of integers defined in the preceding section. Observe that we have the equality:

$$\begin{aligned} n_{\downarrow \trianglelefteq } = \{\underline{k} \mid 0 \le k \le n\}. \end{aligned}$$
(1)

For all \(i,j \in {\mathbb {N}}\), let \(k\) such that \(k \ne i\) and \(k \ne j\). We have \([\![ [\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}},\underline{i}) ]\!](\underline{k}) = \underline{i} \ne \underline{k} = [\![ [\![ \mathbf{s\_1\_1} ]\!](\underline{\mathbf{q}},\underline{j}) ]\!](\underline{k})\) which means that for all \(i, j \in {\mathbb {N}}\): \([\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}},\underline{i}) \ne [\![ \mathbf{s\_1\_1} ]\!](\underline{\mathbf{q}},\underline{j})\).

Consider some \(M > N\). Recall that both \(\mathtt{t}\mapsto [\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}},\mathtt{t})\) and \(\mathtt{t}\mapsto [\![ \mathbf{s\_1\_1} ]\!](\underline{\mathbf{q}}, \mathtt{t})\) are injective. Due to the aforementioned remark, the set \(S_{MN} = \{[\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}},\underline{i}) \mid N < i \le M\} \cup \{ [\![ \mathbf{s\_1\_1} ]\!](\underline{\mathbf{q}}, \underline{i}) \mid N < i \le M\}\) contains exactly \(2 \times (M-N)\) elements. However, each of these elements is a sub-tree of \(\underline{M}\), that is \(S_{MN} \subseteq M_{\downarrow \trianglelefteq }\); but by Eq. 1, the set \(M_{\downarrow \trianglelefteq }\) contains only \(M+1\) elements, which leads to \(2 \times (M-N) \le M+1\). The inequality does not hold for \(M = 2 \times (N+1)\).

The non-existence of a specializer does not mean that there are no fixed points in \(\mathtt{While}_{\setminus \mathtt{\{cons\}}}\): for instance, the nowhere defined program

$$\mathtt{read }\ \mathtt{X}_1\ ; \mathtt{while }\ (\mathtt{{nil}}\cdot \mathtt{{nil}})\ \mathtt{do }\ \mathtt{X}_1 :=\ \mathtt{X}_1\ ;\ \mathtt{write }\ \mathtt{X}_1$$

is a fixed point for the program

$$\mathtt{read }\ \mathtt{X}_1, \mathtt{X}_2\ ; \mathtt{while }\ (\mathtt{{nil}}\cdot \mathtt{{nil}})\ \mathtt{do }\ \mathtt{X}_1\ :=\ \mathtt{X}_1\ ; \mathtt{write }\ \mathtt{X}_1.$$

There are other fixed-point constructions which are not based on specializers; some are found in Smullyan's [12], but we wish to mention here the approach due to Moss [10], which is based on a very elementary framework, text register machines.

Nevertheless, there are programs which have no fixed points. In other words, the Recursion Theorem does not hold. For instance, for the homeomorphic representation of programs presented above, the following holds in \(\mathtt{While}_{\setminus \mathtt{\{cons\}}}\):

Proposition 3

There is no Quine in \(\mathtt{While}_{\setminus \mathtt{\{cons\}}}\).

Proof

Ad absurdum, suppose there is a Quine \(\mathbf{q}\) in \(\mathtt{While}_{\setminus \mathtt{\{cons\}}}\). It is a fixed point of the program \(\mathbf{pi_1} \triangleq \mathtt{read }\ \mathtt{X}_1, \mathtt{X}_2\ ; \mathtt{X}_1\ := \mathtt{X}_1\ ; \mathtt{write }\ \mathtt{X}_1\). Since \([\![ \mathbf{q} ]\!](\mathtt{{nil}}) = \underline{\mathbf{q}}\) and \(|\underline{\mathbf{q}}| > |(\mathtt{{nil}}\cdot \mathtt{{nil}})|\) (as is the case for any program), we can state with Proposition 2 that \(\underline{\mathbf{q}} \trianglelefteq c(\mathbf{q})\). Let \(\mathtt{t}\in c(\mathbf{q})\) such that

$$\begin{aligned} \underline{\mathbf{q}} \trianglelefteq \mathtt{t}. \end{aligned}$$
(2)

With the homeomorphic encoding we chose, we can state that \(({\mathbf { quote}}\cdot \mathtt{t}) \trianglelefteq \underline{\mathbf{q}}\). Thus

$$\begin{aligned} \mathtt{t}\triangleleft ({\mathbf { quote}}\cdot \mathtt{t}) \trianglelefteq \underline{\mathbf{q}} \end{aligned}$$
(3)

The two inequalities 2 and 3 are incompatible: together, they would yield \(\mathtt{t}\triangleleft \mathtt{t}\). The conclusion follows.

Quines are interesting from Adleman's perspective. They correspond to the ‘Imitate’ scenario. The ‘Infection’ scenario is ruled out by Proposition 1. Thus, programs in \(\mathtt{While}_{\setminus \mathtt{\{cons\}}}\) cannot be infected in his view. The reader may notice that the proof here depends on the choice of the representation of programs. Indeed, it is not difficult to define an encoding for which there is a Quine: simply modify \(\underline{-}\) so that the encoding of \(\mathbf{nil} \triangleq \mathtt{read }\ \mathtt{X}_1\ ; \ \mathtt{X}_1\ := \mathtt{{nil}}\ ; \mathtt{write }\ \mathtt{X}_1\) is set to \(\mathtt{{nil}}\). Then, \([\![ \mathbf{nil} ]\!](\mathtt{t}) = \mathtt{{nil}}= \underline{\mathbf{nil}}\), which is the required equation. Nevertheless, there are no other Quines. To end the remark, observe that the choice of program representation can serve the defense side, not only the virus writer's.

3 Tiny, a Whippersnapper Programming Language

Tiny is the language While restricted to expressions of the shape:

$$\mathtt{Expressions}\ni \mathtt{E, F} :{:}= \mathtt{X}\mid \mathtt{t} \mid \mathtt{\mathtt{cons }\ E\ F} \mid \mathtt{\mathtt{hd }\ E} \mid \mathtt{\mathtt{tl }\ E}$$

and commands to:

$$\mathtt{Commands} \ni \mathtt{C, D} :{:}= \mathtt{X}:= \mathtt{E} \mid \mathtt{C\ ;\ D}.$$

Obviously, Tiny is not Turing-complete. Actually, it is a very weak fragment of the computable functions: it contains only functions computable in constant time. Surprisingly, however, the Recursion Theorem holds in Tiny for the representation of programs that we defined:

Theorem 4

Given a \(k+1\)-ary program \({\mathbf {p}}\in \mathtt{Tiny}\), there is a \(k\)-ary program \({{\mathbf {e}}}\in \mathtt{Tiny}\) such that for all \(\mathtt{t}_1, \ldots , \mathtt{t}_k \in {\mathbb {T}}_\varSigma \): \([\![ {{\mathbf {e}}} ]\!](\mathtt{t}_1, \ldots , \mathtt{t}_k) = [\![ {\mathbf {p}} ]\!](\underline{{{\mathbf {e}}}},\mathtt{t}_1, \ldots , \mathtt{t}_k)\).

Proof

Again, we give a proof for \(k=1\). The other cases are left to the reader. If we come back to the previous proof of the Recursion Theorem, it is clear that the program \({{{\mathbf {r}}}}_{{\mathbf {p}}}\) is in \(\mathtt{Tiny}\) whenever both \({\mathbf {p}}\) and \(\mathbf{s\_1\_1}\) are in Tiny. Since \({\mathbf {p}}\in \mathtt{Tiny}\) by hypothesis, \({{{\mathbf {r}}}}_{{\mathbf {p}}}\) is in \(\mathtt{Tiny}\) if there is a specializer within \(\mathtt{Tiny}\). This is actually the case: define \(\mathbf{s\_1\_1}\triangleq \)

$$\begin{array}{ll} \mathtt{read }\ \mathtt{X}_0, \mathtt{X}_1 ; \\ \mathtt{C}:= \mathtt{hd }\ (\mathtt{tl }\ \mathtt{X}_0) ; &{}\text {// the representation of the body of }\mathtt{X}_0\\ \mathtt{X}:= \mathtt{hd }\ (\mathtt{hd }\ \mathtt{X}_0) ; &{}\text {// the rep. of the first input variable of }\mathtt{X}_0\\ \mathtt{X}_L := \mathtt{tl }\ (\mathtt{hd }\ \mathtt{X}_0) ; &{}\text {// the remaining variables of }\mathtt{X}_0\\ \mathtt{Y}:= \mathtt{tl }\ (\mathtt{tl }\ \mathtt{X}_0) ;&{} \text {// the rep. of the output variable of }\mathtt{X}_0\\ \mathtt{E}:= \mathtt{cons }\ {\mathbf { quote}}\ \mathtt{X}_1; &{}\text {// the rep. of the value }\mathtt{t}\text { of }\mathtt{X}_1\\ \mathtt{C}_0 := \mathtt{cons }\ {\mathbf {assign}} \ (\mathtt{cons }\ \mathtt{X}\ \mathtt{E}) ; &{} \text {// the rep. of }\mathtt{X}:= \mathtt{t}\\ \mathtt{C}:= \mathtt{cons }\ {\mathbf { seq}}\ (\mathtt{cons }\ \mathtt{C}_0\ \mathtt{C}); &{}\text {// the rep. of }\mathtt{X}:= \mathtt{t}\ ; \mathtt{C}\\ \mathtt {P} := \mathtt{cons }\ \mathtt{X}_L\ (\mathtt{cons }\ \mathtt{C}\ \mathtt{Y}) ; &{}\text {// the packaging of the new (unary) program} \\ \mathtt{write }\ \mathtt {P} \end{array}$$

This is a specializer. Indeed, let \({\mathbf {p}}= \mathtt{read }\ \mathtt{X}'_0,\mathtt{X}'_1 ; \mathtt{C}_{{\mathbf {p}}} ; \mathtt{write }\ \mathtt{Y}'\). For all \(\mathtt{t}\in {\mathbb {T}}_\varSigma \), \([\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}},\mathtt{t}) = \underline{\mathtt{read }\ \mathtt{X}'_1 ; \mathtt{X}'_0 := \mathtt{t}\ ; \mathtt{C}_{{\mathbf {p}}} ; \mathtt{write }\ \mathtt{Y}'}\). Thus, for all \(\mathtt{t}' \in {\mathbb {T}}_\varSigma \), we have \(\lceil \!\lceil [\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}},\mathtt{t}) \rceil \!\rceil (\mathtt{t}') = [\![ {\mathbf {p}} ]\!](\mathtt{t}, \mathtt{t}')\), as required.
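To make the construction concrete, here is a minimal Python sketch of the specializer above, modeling trees of \({\mathbb {T}}_\varSigma \) as nested pairs with `None` for nil. The atoms `'quote'`, `'assign'`, and `'seq'` stand in for the (unspecified) tree encodings of the Tiny keywords; the function name `s_1_1` and the pair layout of program representations are read off the Tiny code above.

```python
NIL = None  # nil is None; cons is a Python pair

def cons(a, b): return (a, b)
def hd(t): return t[0] if isinstance(t, tuple) else NIL
def tl(t): return t[1] if isinstance(t, tuple) else NIL

def s_1_1(p_rep, t):
    # p_rep encodes cons(cons(X, XL), cons(C, Y))
    # for a program read X, XL; C; write Y
    c  = hd(tl(p_rep))                 # body C of p
    x  = hd(hd(p_rep))                 # first input variable X
    xl = tl(hd(p_rep))                 # remaining variables XL
    y  = tl(tl(p_rep))                 # output variable Y
    e  = cons('quote', t)              # the literal t
    c0 = cons('assign', cons(x, e))    # X := t
    c  = cons('seq', cons(c0, c))      # X := t ; C
    return cons(xl, cons(c, y))        # the new (k-1)-ary program
```

Specializing a binary program representation on a value `t` thus yields a unary program whose body first assigns `t` to the specialized variable and then runs the original body, exactly as in the proof.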

So, \({{{\mathbf {r}}}}_{{\mathbf {p}}}\) is in \(\mathtt{Tiny}\). The fixed point \({{\mathbf {e}}}= \overline{{{\mathtt {S}}}_1^1({{{\mathbf {r}}}}_{{\mathbf {p}}},{{{\mathbf {r}}}}_{{\mathbf {p}}})}\) is then also in \(\mathtt{Tiny}\), and the proof ends, as a corollary of the following Lemma:

Lemma 1

If \({\mathbf {p}}\in \mathtt{Tiny}\), for all \(\mathtt{t}\in {\mathbb {T}}_\varSigma \): \(\overline{[\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}},\mathtt{t})} \in \mathtt{Tiny}\).

Proof

Recall that \([\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}},\mathtt{t}) =\underline{\mathtt{read }\ \mathtt{X}'_1 ; \mathtt{X}'_0 := \mathtt{t}\ ; \mathtt{C}_{{\mathbf {p}}} ; \mathtt{write }\ \mathtt{Y}'}\). Since \({\mathbf {p}}\) is in \(\mathtt{Tiny}\), \([\![ \mathbf{s\_1\_1} ]\!](\underline{{\mathbf {p}}},\mathtt{t})\) is itself the representation of a program in \(\mathtt{Tiny}\).

3.1 Program Equivalence in Tiny

From the above, there are viruses in Tiny. However, following the scenario outlined in the introduction, we can protect a system based on Tiny. Protection amounts to deciding program equivalence: given two programs \({\mathbf {p}}, {{\mathbf {q}}}\), does \([\![ {\mathbf {p}} ]\!] = [\![ {{\mathbf {q}}} ]\!]\)? In general (in particular, for a Turing-complete language), such a decision is not computable, as a direct consequence of Rice's Theorem. But for Tiny, there is a simple decision procedure.

Theorem 5

Equivalence of programs is computable for programs in \(\mathtt{Tiny}\).

Proof

By composing expressions, one may reduce any program in Tiny to a single expression. The semantics of an expression can be expanded explicitly, and equivalence then amounts to equality of the expanded semantics. We provide in the appendix an algorithm that computes the semantics of expressions. However:
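The decision procedure can be sketched as follows, under our own encoding of Tiny expressions (the tags `'nil'`, `'var'`, `'cons'`, `'hd'`, `'tl'` are ours, not the paper's, and we assume hd and tl of nil yield nil). Each expression is expanded into a normal form built from nil, pairs, and projection paths into the input variables; equivalence is then syntactic equality of normal forms.

```python
def norm(e):
    """Expand a Tiny expression into a symbolic normal form:
    ('nil',), ('cons', nf, nf), or ('path', var, word of 'hd'/'tl')."""
    tag = e[0]
    if tag == 'nil':
        return ('nil',)
    if tag == 'var':                      # an input variable
        return ('path', e[1], ())
    if tag == 'cons':
        return ('cons', norm(e[1]), norm(e[2]))
    a = norm(e[1])                        # tag is 'hd' or 'tl'
    if a[0] == 'nil':                     # hd nil = tl nil = nil (assumed)
        return ('nil',)
    if a[0] == 'cons':
        return a[1] if tag == 'hd' else a[2]
    return ('path', a[1], a[2] + (tag,))  # projection stays symbolic

def equivalent(e1, e2):
    return norm(e1) == norm(e2)
```

For instance, `hd (cons X nil)` normalizes to the same form as `X`, while `cons (hd X) (tl X)` does not (the two differ when `X` is nil), matching the intended semantics.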

Proposition 4

Equivalence of programs in \(\mathtt{Tiny}\) is not in \(\mathtt{Tiny}\).

Lemma 2

Any program in \(\mathtt{Tiny}\) is monotonic, that is if \(\mathtt{t}_i \mathrel {{\underline{\blacktriangleleft }}}\mathtt{t}'_i\) for all \(i \le n\), then \([\![ {\mathbf {p}} ]\!](\mathtt{t}_1, \ldots , \mathtt{t}_n) \mathrel {{\underline{\blacktriangleleft }}}[\![ {\mathbf {p}} ]\!](\mathtt{t}'_1, \ldots , \mathtt{t}'_n)\).

Proof

We have seen above that any program in \(\mathtt{Tiny}\) is equivalent to some program of the shape \(\mathtt{read }\ \mathtt{X}_1, \ldots , \mathtt{X}_n; \mathtt{Y}:= \mathtt{E}; \mathtt{write }\ \mathtt{Y}\), so we restrict our attention to programs of this shape. Since \(\pi _1\) and \(\pi _2\) are monotonic, the result follows by an immediate induction on \(\mathtt{E}\).
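The lemma can be replayed on small examples. The sketch below assumes that \(\mathrel {{\underline{\blacktriangleleft }}}\) is the order generated by nil below every tree and cons compared componentwise (our reading; the paper does not restate the order here), and checks monotonicity for a sample composition of hd, tl and cons.

```python
NIL = None  # nil is None; cons is a Python pair

def leq(s, t):
    # s <| t: order generated by nil <| t, cons componentwise (assumed)
    if s is NIL:
        return True
    if t is NIL:
        return False
    return leq(s[0], t[0]) and leq(s[1], t[1])

def hd(t): return t[0] if isinstance(t, tuple) else NIL
def tl(t): return t[1] if isinstance(t, tuple) else NIL

# any composition of hd, tl, cons is monotone; e.g. [[p]](x) = cons (tl x) (hd x)
def p(x): return (tl(x), hd(x))

s = (NIL, NIL)            # cons nil nil
t = ((NIL, NIL), NIL)     # cons (cons nil nil) nil
assert leq(s, t) and leq(p(s), p(t))
```

The inductive argument is visible in the code: `hd` and `tl` send nil to nil (the bottom of the order) and act componentwise on pairs, so each constructor preserves \(\mathrel {{\underline{\blacktriangleleft }}}\).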

Proof

(Proposition 4 ). Ad absurdum, suppose that there is some program \(\mathbf{eq} \in \mathtt{Tiny}\) such that \([\![ \mathbf{eq} ]\!](\underline{{\mathbf {p}}_1},\underline{{\mathbf {p}}_2}) \ne \mathtt{{nil}}\) iff \([\![ {\mathbf {p}}_1 ]\!] = [\![ {\mathbf {p}}_2 ]\!]\), for all \({\mathbf {p}}_1, {\mathbf {p}}_2 \in \mathtt{Tiny}\). Let \({\mathbf {p}}_1 \triangleq \mathtt{read }\ \mathtt{X}; \mathtt{Y}:= \mathtt{{nil}}; \mathtt{write }\ \mathtt{Y}\). It is in \(\mathtt{Tiny}\); thus \([\![ \mathbf{eq} ]\!](\underline{{\mathbf {p}}_1}, \underline{{\mathbf {p}}_1}) \ne \mathtt{{nil}}\), since \({\mathbf {p}}_1\) is equivalent to itself. Observe that \({\mathbf {p}}_2 \triangleq \mathtt{read }\ \mathtt{X}; \mathtt{Y}:= \mathtt{cons }\ \mathtt{{nil}}\ \mathtt{{nil}}; \mathtt{write }\ \mathtt{Y}\), which is also in \(\mathtt{Tiny}\), verifies \(\underline{{\mathbf {p}}_1} \mathrel {{\underline{\blacktriangleleft }}}\underline{{\mathbf {p}}_2}\). By Lemma 2, \([\![ \mathbf{eq} ]\!](\underline{{\mathbf {p}}_1}, \underline{{\mathbf {p}}_1}) \mathrel {{\underline{\blacktriangleleft }}}[\![ \mathbf{eq} ]\!](\underline{{\mathbf {p}}_1}, \underline{{\mathbf {p}}_2})\). In turn, this means \([\![ \mathbf{eq} ]\!](\underline{{\mathbf {p}}_1}, \underline{{\mathbf {p}}_2}) \ne \mathtt{{nil}}\). But \({\mathbf {p}}_1\) and \({\mathbf {p}}_2\) are not equivalent: \([\![ {\mathbf {p}}_1 ]\!](\mathtt{{nil}}) \ne [\![ {\mathbf {p}}_2 ]\!](\mathtt{{nil}})\), a contradiction.
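The crux of the argument can be checked concretely: under any tree encoding in which \(\underline{{\mathbf {p}}_1}\) and \(\underline{{\mathbf {p}}_2}\) differ only at the quoted literal (nil versus cons nil nil), the two encodings are ordered. The `skeleton` encoding below is a hypothetical stand-in for the real one, and \(\mathrel {{\underline{\blacktriangleleft }}}\) is again read as nil below everything, cons componentwise.

```python
NIL = None  # nil is None; cons is a Python pair

def leq(s, t):
    # order generated by nil <| t and cons componentwise (our reading)
    if s is NIL:
        return True
    if t is NIL:
        return False
    return leq(s[0], t[0]) and leq(s[1], t[1])

def skeleton(lit):
    # stand-in tree encoding of p_i: the actual encoding does not matter,
    # only that the encodings of p_1 and p_2 differ solely at `lit`
    return ((NIL, NIL), (lit, (NIL, NIL)))

p1_rep = skeleton(NIL)          # quoted literal: nil
p2_rep = skeleton((NIL, NIL))   # quoted literal: cons nil nil
assert leq(p1_rep, p2_rep)      # so a monotone eq must order its outputs
```

Since nil is strictly below any pair, an output above a non-nil value is itself non-nil, which is exactly the step that forces the contradiction in the proof.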

4 Conclusion

Though it is conceptually deep, the Recursion Theorem can be difficult to apply in practice. In the context of computer viruses, it often has a negative flavor. To our mind, our work opens a new branch of research which constructively studies fixed points within constrained computation systems. Types, logics and weak arithmetics arise as good candidates for this purpose.

We end with a side remark about the efficiency of fixed points. Let us cite Hansen, Nikolajsen, Träff and Jones in [2]: “[...] running a fixed-point program to compute the factorial of n results in n levels of interpretation, each one slowing down execution by a large constant factor”. This leads the authors to introduce a self-reflection statement that, supposedly, enables efficient fixed points.

The fixed points presented above never involve any interpretation layer. The construction of the specializer shows that it introduces only a constant-time overhead with respect to the initial program. Therefore, the complexity of the fixed point \(\mathbf e\) is equal to that of the program \({{{\mathbf {r}}}}_{{\mathbf {p}}}\). The latter involves the code of \(\mathbf{s\_1\_1}\), which is in Tiny and thus runs in constant time, plus a few assignments that do not increase the size of their inputs. In the end, the program \(\mathbf e\) is as efficient as \({\mathbf {p}}\) on its input, up to a constant factor.