1 Introduction

We consider the problem of constructing a spanning tree in a self-stabilizing manner. Numerous self-stabilizing spanning tree constructions have been studied to date: the spanning tree may be arbitrary (see [4]), depth-first (see [5]), or breadth-first (see [6]), for example. Deterministic solutions to these problems have been investigated in either fully identified networks [2] or rooted networks [5]. Here, we consider rooted connected networks. By “rooted” we mean that one process, called the root and denoted r, is distinguished from the others; all other processes are fully anonymous. We focus on the construction of a Breadth-First Search (BFS) spanning tree in such a rooted connected network, i.e., a spanning tree in which the (hop-)distance from any node to the root is minimum. Spanning tree construction is a fundamental task in communication networks. Indeed, spanning trees are often involved in the design of routing [10] and broadcasting [3] tasks, for example. Moreover, improving the efficiency of the underlying spanning tree algorithm usually also improves the overall solution.

We consider here the atomic state model, also called the locally shared memory model with composite atomicity. In this model, the daemon assumption under which the algorithm works is crucial, since it captures the asynchrony of the system. More generally, self-stabilizing solutions are also compared according to their stabilization time (usually in rounds) and their memory requirement.

Related Work. There are many self-stabilizing BFS spanning tree constructions in the literature. Perhaps the first is that of Chen et al. [4]. It is proven correct in the atomic state model under the central unfair daemon; no time complexity analysis is given. The algorithm of Huang and Chen [11] is proposed in the atomic state model, yet under a distributed unfair daemon. In [8], the stabilization time of this algorithm is shown to be \(\varTheta (n)\) rounds in the worst case, where n is the number of processes. Another algorithm, implemented in the link-register model, is given in [9]. It uses unbounded process local memories. However, it is shown in [8] that a straightforward bounded-memory variant of this algorithm, working in the atomic state model, achieves an optimal stabilization time in rounds, i.e., \(O({\mathcal {D}})\) rounds where \({\mathcal {D}}\) is the network diameter. In [1], Afek and Bremler design a solution for unidirectional networks in the message-passing model, assuming bounded-capacity links. The stabilization time of this latter algorithm is O(n) rounds. The algorithm given in [7] has a stabilization time of \(O({\mathcal {D}}^2)\) rounds, assuming the atomic state model and a distributed unfair daemon. All aforementioned solutions [1, 4, 7, 9, 11] also achieve silence. Two other non-silent, a.k.a. talkative, self-stabilizing BFS spanning tree constructions have been proposed in the atomic state model. The algorithm in [6] is proven under the distributed unfair daemon and has a stabilization time of O(n) rounds. In [12], the proposed solution assumes a distributed weakly fair daemon and its stabilization time is not investigated. Except for [12], all these aforementioned algorithms endow each process with a distance variable that keeps track of the current level of the process in the BFS tree. Thus, these BFS spanning tree constructions have a space complexity in \(\varOmega (\log ({\mathcal {D}}))\) bits per process.
In contrast, the solution given in [12] does not compute any distance value (actually, the construction is done using synchronization phases). Consequently, the obtained memory requirement only depends on local parameters, i.e., \(\varTheta (\log (\delta _p))\) bits per process p, where \(\delta _p\) is the degree of p. In other words, the space complexity of this algorithm is intrinsically \(\varTheta (1)\) bits per edge. Moreover, the algorithm does not need a priori knowledge of any global parameter of the network such as \({\mathcal {D}}\) or n. It is worth noticing that, to date, it is still the best self-stabilizing BFS spanning tree construction in terms of memory requirement.

Contribution. We fill in the gaps in the analysis of the memory-efficient self-stabilizing BFS spanning tree construction given in [12]. Precisely, we study a slightly modified (arguably simpler) version of the algorithm. This variant still achieves a memory requirement of \(\varTheta (1)\) bits per edge. We prove its self-stabilization under the distributed unfair daemon, the weakest scheduling assumption. Moreover, we establish a stabilization time of \(O({\mathcal {D}}\cdot n^2)\) rounds, where \({\mathcal {D}}\) is the network diameter and n the number of processes. All the technical material has been omitted from this brief announcement and is available online at the following address:

https://arxiv.org/abs/1907.07944

2 The Algorithm

Our algorithm executes infinitely many tree constructions in sequence: at the end of each construction, the root of the network (r), called here the legal root, initiates a new one. The algorithm alternately builds 0-colored and 1-colored BFS spanning trees. The color eventually distinguishes the processes belonging to the legal tree (rooted at r) from those that do not. We denote by \(r\_color\) the current color of the legal tree.

One of the difficulties in building a BFS tree without a distance variable is to ensure that the path from any process to r in the tree has minimum length. To solve this issue, each construction is split into phases synchronized at r: when r detects the end of the current phase, it initiates the next one if the construction is not yet complete; otherwise, r starts a new tree construction. Once the system is stabilized, each construction consists of at most \({\mathcal {D}}\) phases: during the kth phase, all processes at distance k from r join the current tree (by choosing as parent a neighbor at distance \(k-1\) from r). However, during the convergence, a tree construction can contain up to n phases.
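The phase-per-distance behavior after stabilization can be sanity-checked with a plain BFS: the processes joining during the kth phase are exactly those at hop distance k from r. Below is a minimal Python sketch (an illustrative model, not the algorithm itself; graph and names are hypothetical).

```python
from collections import deque

def bfs_levels(adj, root):
    """Hop distance from the root; level k = processes that would join
    the tree during the kth phase once the system is stabilized."""
    dist = {root: 0}
    q = deque([root])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# Hypothetical rooted network: r--a, r--c, a--b, c--b.
adj = {"r": ["a", "c"], "a": ["r", "b"], "b": ["a", "c"], "c": ["r", "b"]}
levels = bfs_levels(adj, "r")
# A stabilized construction on this graph uses max(levels.values()) phases.
```

Here a and c join during the first phase and b during the second, so the construction needs two phases, matching the eccentricity of r.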

Shared Variables. In the following, for every process u, we denote by N(u) the set of u’s neighbors and by dist(u) the distance from u to r.

Each non-root process u maintains the following variables:

  • \(TS.u \in N(u) \cup \{\perp \}\): the output parent pointer of u. Once the system is stabilized, TS variables are constant and distributedly define a BFS spanning tree rooted at r. More precisely, when the system is stabilized, \(dist(u) = dist(TS.u)+1\) if \(u \ne r\), \(dist(u) = 0\) otherwise.

  • \(P.u \in N(u) \cup \{\perp \}\): another parent pointer. Once the system is stabilized, if \(P.u \ne \perp \), then \(P.u = TS.u\). Actually, P.u is used to inform the pointed neighbor about whether or not the construction of the subtree rooted at u is terminated. Precisely, if at the end of a phase u is childless, i.e., \(\forall v \in N(u)\), \(P.v \ne u\), then the construction of the subtree rooted at u is done.

  • \(C.u \in \{0, 1\}\): the color of u. Once the system is stabilized, the processes in the current tree have color \(C.r = r\_color\), while other processes have color \(\overline{r\_color}\).

  • \(S.u \in \{Idle, Working, Power, WeakE, StrongE\}\): the status of u. Statuses WeakE and StrongE are only used to correct errors. Process u is said to have an Erroneous status if \(S.u \in \{WeakE, StrongE\}\). Only processes of status Power can gain new children. Once the system is stabilized, during the kth phase, only processes at distance \(k-1\) from r, i.e., the leaves of the tree under construction, acquire the status Power. If u is an internal process of the tree under construction and is participating in the current phase, then u has the status Working. If u is not involved in the current tree construction (i.e., \(P.u = \perp \)), or the current phase is either not yet started or already finished in the subtree of u, then u has the status Idle.

  • \(ph.u \in \{a, b\}\): the current phase. This variable is used to distinguish any two consecutive phases. In particular, it allows us to resolve the following ambiguity: a process in the tree is Idle either when it has completed the current phase or when it has not yet started it. To distinguish these two cases, we use the phase variable. If the phase value of an Idle process u is the same as that of its parent P.u, then u has finished the current phase; otherwise, it has not yet initiated it.

The root r maintains the same variables, except P and TS: r does not have a parent. Moreover, \(S.r \in \{Power, Working, StrongE\}\): Status WeakE does not exist for r.
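The per-process state described above can be summarized in a small record, as in the following Python sketch (a hypothetical encoding, not the paper's formal model; None stands for \(\perp \)):

```python
from dataclasses import dataclass
from typing import Optional

# Status constants, matching the statuses described in the text.
IDLE, WORKING, POWER, WEAK_E, STRONG_E = "Idle", "Working", "Power", "WeakE", "StrongE"

@dataclass
class ProcessState:
    """State of a non-root process; r holds the same fields minus TS and P."""
    TS: Optional[str]  # output parent pointer (None = no parent)
    P: Optional[str]   # second parent pointer, used for termination detection
    C: int             # color in {0, 1}
    S: str             # status
    ph: str            # phase tag in {"a", "b"}

# Aside from the two parent pointers (Theta(log delta_p) bits each), every
# field ranges over a constant-size domain: Theta(1) bits per edge overall.
state = ProcessState(TS="r", P="r", C=0, S=IDLE, ph="a")
```

This makes the memory claim concrete: only TS and P depend on the degree; C, S, and ph take constantly many values.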

Tree Construction. This part is identical to [12]. The current tree construction is over when r becomes childless, i.e., \(\forall v \in N(r), P.v \ne r\). Then, r starts a new tree construction: it changes its color and initiates the first phase by taking the status Power.

A phase is divided into three stages: forwarding, tree expansion, and backward. The processes in the tree under construction first propagate the phase initiated by r to the leaves of the tree under construction (forwarding part). Then, the neighbors of the leaves that are not in the tree join it (tree expansion part). Finally, non-root tree nodes backtrack to inform r of the end of the tree expansion (backward part).

In the forwarding stage of the kth phase, along the tree under construction, internal nodes take the status Working and copy the phase value of r in a top-down fashion; then, the leaves (at distance \(k-1\) from r) take the status Power and also copy the phase value of r. This stage requires h rounds, where \(h\le n\) is the height of the tree rooted at r under construction.

In the tree expansion stage of the kth phase, all processes at distance k from r join the tree under construction: they choose a neighbor of status Power as parent (by updating their variables P and TS), copy both the phase value and the color of their new parent, and take the status Idle. Each tree expansion stage requires O(1) rounds (n.b., this stage may be slowed down by a constant number of rounds due to local corrections).
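The join rule of the expansion stage can be sketched as a guarded local action. The following Python fragment is a simplified illustration (hypothetical helper and dictionary encoding; error statuses are ignored):

```python
def try_join(u_state, neighbor_states):
    """Expansion-stage join rule, simplified: a process u not yet in a
    tree that sees a Power neighbor adopts it as parent, copies its
    phase value and color, and takes the status Idle."""
    if u_state["P"] is not None:  # u already belongs to a tree
        return False
    for v, v_state in neighbor_states.items():
        if v_state["S"] == "Power":
            u_state["P"] = u_state["TS"] = v
            u_state["ph"] = v_state["ph"]
            u_state["C"] = v_state["C"]
            u_state["S"] = "Idle"
            return True
    return False

# A detached process u joins via its Power neighbor w.
u = {"P": None, "TS": None, "C": 1, "S": "Idle", "ph": "a"}
joined = try_join(u, {"w": {"S": "Power", "ph": "b", "C": 0}})
```

After the call, u points to w through both parent pointers and carries w's phase and color, as the expansion stage prescribes.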

In the backward stage of the kth phase, the processes of status Power finalize the kth phase by switching their status to Idle when the current phase is over in their neighborhood, i.e., all their neighbors are in the tree (they have the color \(r\_color\)). The Working processes then finalize the current phase by switching their status to Idle, bottom-up, once their children have finished the current phase (i.e., they have the status Idle and the same phase value). Moreover, when the subtree rooted at a process u is completely built (i.e., \(\forall v \in N(u)\), \(P.v \ne u\)), u additionally sets its P variable to \(\perp \). The backward stage requires \(h-1\) rounds.

When the kth phase is over (i.e., all processes of the legal tree have the status Idle, the same phase value as r, and the color \(r\_color\)), r initiates the \((k+1)\)st phase if the construction is not done (otherwise, it starts a new tree construction): r changes its phase value and takes the status Working.

Thus, a phase lasts O(n) rounds and a tree construction costs \(O(n^2)\) rounds.

Error Correction. There are two major correction tasks: one is to remove the illegal trees, the other is to break the (parent) cycles. We solve these two tasks using mechanisms that are simpler than those proposed in [12].

The error correction is based on conflict, faulty-process, and illegal-root detection. There are two kinds of conflicts: (weak) conflicts and strong ones.

When a process detects a conflict, is faulty, or is an illegal root, it switches to an Erroneous status, either StrongE (in case of a strong conflict) or WeakE (in all other cases), and leaves its tree by setting its P variable to \(\perp \). Then, its children take the status WeakE and leave the tree, and so on: these corrections are propagated toward the leaves. Erroneous detached processes (i.e., processes without parent and children) recover by changing their status to Idle. We have two cases. If the process has the status WeakE, it recovers directly. Otherwise, it waits until none of its neighbors has the status Power (these latter, if any, are enabled to take the status WeakE and leave their tree, as explained later).

Let u be a non-root process. First, if u has children but no parent, it is an illegal root. Second, u is faulty if it has a parent but its state is not consistent with that of its parent. In either case, u takes the status WeakE and leaves its tree, as previously explained.

The major difficulty of the error correction is to remove parent cycles while no distance value is available. A process having a parent assumes that it is in the legal tree and that its color is equal to \(r\_color\) (even if it is actually inside a cycle). Based on this assumption, it detects a conflict when one of its neighbors does not have its color but has the status Power (both processes cannot be inside the legal tree). Once a conflict is detected by a process \(u\ne r\), u takes the status WeakE and leaves its tree. Hence, the potential cycle is transformed into an illegal tree.

If a process \(u \ne r\) has two Power processes of different colors in its closed neighborhood, then u detects a strong conflict: both processes cannot be in the legal tree, so u takes the status StrongE and leaves its tree. The root r detects a strong conflict if one of its neighbors v has the status Power but either v does not have r's color or r is childless. When the root detects a strong conflict, it simply takes the status StrongE (recall that r has no P variable and so cannot leave its tree). All Power neighbors of a StrongE process take the status WeakE and leave their tree. So, a StrongE process has to wait until the status Power vanishes from its neighborhood before recovering.
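The two conflict tests of a non-root process are purely local and can be expressed as simple predicates over the closed neighborhood. The sketch below uses a hypothetical dictionary encoding of process states, simplified from the description above:

```python
def weak_conflict(u_state, neighbor_states):
    """u, which has a parent and hence assumes it is in the legal tree,
    sees a Power neighbor of a different color: both cannot be in the
    legal tree, so u detects a (weak) conflict."""
    return u_state["P"] is not None and any(
        v["S"] == "Power" and v["C"] != u_state["C"] for v in neighbor_states
    )

def strong_conflict(u_state, neighbor_states):
    """u sees two Power processes of different colors in its closed
    neighborhood: a strong conflict."""
    power_colors = [v["C"] for v in neighbor_states if v["S"] == "Power"]
    if u_state["S"] == "Power":
        power_colors.append(u_state["C"])
    return len(set(power_colors)) >= 2

# u has a parent and sees Power neighbors of both colors.
u = {"P": "x", "C": 0, "S": "Idle"}
nbrs = [{"S": "Power", "C": 1}, {"S": "Power", "C": 0}]
```

On this example, u detects both a weak conflict (the 1-colored Power neighbor differs from its own color) and a strong one (two Power neighbors of different colors).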

Overview of the Round Complexity. After O(n) rounds, no process in the legal tree can detect a conflict, and thus create a new illegal branch that may contain Power processes. After O(n) additional rounds, only processes in the legal tree may have the status Power. Let \(\gamma \) be a configuration where only processes in the legal tree may have the status Power. After at most d complete tree constructions from \(\gamma \), the processes at distance less than d from r never again belong to any cycle or illegal tree, i.e., they are stabilized. So, after at most \({\mathcal {D}}+1\) complete tree constructions from \(\gamma \), there is no cycle or illegal tree at all: all processes are stabilized. Any complete tree construction requires \(O(n^2)\) rounds. Hence, the round complexity of our algorithm is \(O({\mathcal {D}}\cdot n^2)\) rounds.