1 The Statement of the Problem and Some Known Results

Let us consider the stationary open-loop discrete system

$$\begin{aligned} X_{k+1}=AX_k. \end{aligned}$$
(4.1)

We seek a full rank matrix \(B\) of control actions of size \(n\times p\) such that the closed-loop stationary system

$$\begin{aligned} X_{k+1}=AX_k+BU_k+F_k \end{aligned}$$
(4.2)

is completely controllable.

Definition 4.1

The complete controllability characteristic of system (4.1) is the minimal number \(p\in \mathbb {N}\) such that system (4.2) can be made completely controllable by the choice of a full rank matrix \(B\) of size \(n\times p\).

In 2010, the article [1] was published in the journal Doklady Akademii Nauk. It discussed the structural minimization problem and obtained the following result:

Theorem 4.1

The complete controllability characteristic of system (4.1) equals the maximal geometric multiplicity of the eigenvalues of \(A\).

Moreover, that article established a method of constructing the matrix \(B\) for the case where the Jordan canonical form of \(A\) is given.

In this article we present a method for finding the complete controllability characteristic of (4.1) and for constructing the matrix \(B\) without computing the eigenvalues of \(A\).
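
Complete controllability of the pair \((A,B)\) can be checked by the standard Kalman rank criterion: system (4.2) is completely controllable if and only if \(\mathop{\mathrm{rank}}[B, AB, \dots , A^{n-1}B]=n\). Below is a minimal sketch in Python with SymPy (all code in this chapter is illustrative; the helper names are ours, not from [1]):

```python
import sympy as sp

def completely_controllable(A, B):
    """Kalman rank criterion: the pair (A, B) is completely
    controllable iff rank [B, AB, ..., A^{n-1} B] = n."""
    n = A.rows
    blocks, P = [], B
    for _ in range(n):
        blocks.append(P)
        P = A * P
    return sp.Matrix.hstack(*blocks).rank() == n
```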

2 Definitions and Some Preliminary Transformations

Suppose that \(A\) is a square matrix of size \(n\times n\), \(\lambda _j\) are the eigenvalues of \(A\), \(q_A(x)\) is the minimal polynomial of \(A\), and \(d_A(x)=\det (xE-A)\). Note that the polynomial \(q_A(x)\) can be found without computing the eigenvalues of \(A\); \(d_A(x)\) is the characteristic polynomial of \(A\) multiplied by the appropriate power of \((-1)\) (so that the leading coefficient equals 1), hence this polynomial can also be obtained without computing the eigenvalues of \(A\). Let

$$\begin{aligned} q_A(x)=\prod _{j=1}^m(x-\lambda _j)^{k_j}. \end{aligned}$$

We denote

$$\begin{aligned} q(x,A,\ge r)=\prod _{j:k_j\ge r}(x-\lambda _j), \end{aligned}$$
$$\begin{aligned} q(x,A,=r)=\prod _{j:k_j=r}(x-\lambda _j). \end{aligned}$$

Note that the polynomial \(q(x,A,\ge r)\) can be found without factorization of \(q_A(x)\). For example,

$$\begin{aligned} q(x,A,\ge 1)=\frac{q_A(x)}{\text{ g.c.d. }(q_A(x),q'_A(x))}, \end{aligned}$$

and \(q(x,A,\ge 2)\) can be obtained by the same formula with \(q_A(x)\) replaced by \(\text{ g.c.d. }(q_A(x),q'_A(x))\), and so forth. We can recover the polynomials \(q(x,A,=r)\) from the polynomials \(q(x,A,\ge r)\):

$$\begin{aligned} q(x,A,\ge r)=q(x,A,=r)q(x,A,\ge r+1). \end{aligned}$$

Similarly, we define \(d(x,A,\ge r)\) and \(d(x,A,=r)\), with \(d_A(x)\) in place of \(q_A(x)\).
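
These manipulations require only exact polynomial arithmetic. Here is a minimal SymPy sketch under the assumption of exact (rational) matrix entries; min_poly, q_geq, and q_eq are our names. The minimal polynomial is found as the first monic linear dependence among the powers of \(A\), and the tower of greatest common divisors yields \(q(x,A,\ge r)\) and \(q(x,A,=r)\) without factorization:

```python
import sympy as sp

x = sp.symbols('x')

def min_poly(A):
    """Minimal polynomial q_A(x): the first monic linear dependence
    among E, A, A^2, ..., found by exact linear algebra."""
    n = A.rows
    cols, P = [sp.eye(n).reshape(n * n, 1)], sp.eye(n)
    for k in range(1, n + 1):
        P = P * A
        cols.append(P.reshape(n * n, 1))
        null = sp.Matrix.hstack(*cols).nullspace()
        if null:
            c = null[0] / null[0][k]        # normalize: monic polynomial
            return sp.Poly(list(reversed(list(c))), x).as_expr()

def q_geq(q, r):
    """q(x, A, >= r): r-1 g.c.d. steps down the tower, then one
    squarefree division, exactly as in the formula above."""
    g = q
    for _ in range(r - 1):
        g = sp.gcd(g, sp.diff(g, x))
    return sp.cancel(g / sp.gcd(g, sp.diff(g, x)))

def q_eq(q, r):
    """q(x, A, = r) = q(x, A, >= r) / q(x, A, >= r + 1)."""
    return sp.cancel(q_geq(q, r) / q_geq(q, r + 1))
```

The same helpers applied to \(d_A(x)\) (available as A.charpoly(x).as_expr()) give \(d(x,A,\ge r)\) and \(d(x,A,=r)\).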

3 The Method of Finding the Complete Controllability Characteristic

Let us denote \(A_1=q(A,A,\ge 1)\). Since the polynomial \(d_A(x)\) divides \((q(x,A,\ge 1))^n\) and \(d_A(A)=0,\) the identity \(A_1^n=0\) holds. Moreover, from the definition of an eigenvector of \(A\) and the Jordan canonical form of the matrix \(A_1\) it follows that the eigenvectors \(\{v_1,\dots ,v_s\}\) of the matrix \(A\) form a basis of the kernel of \(A_1\).

If \(A_1=0\), then the matrix \(A\) has a basis consisting of its eigenvectors. Hence the geometric multiplicity of every eigenvalue of the matrix \(A\) equals its algebraic multiplicity. Thus, \(p=\max \{t\mid d(x,A,\ge t)\ne 1\}\) in this case.
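
In code, this case reads as follows (a sketch reusing q_geq from the previous snippet with \(d_A(x)\); poly_at is our Horner-scheme helper for evaluating a polynomial at a matrix):

```python
def poly_at(p, A):
    """Evaluate the polynomial p = p(x) at the matrix A (Horner)."""
    n = A.rows
    R = sp.zeros(n)
    for c in sp.Poly(p, x).all_coeffs():
        R = R * A + c * sp.eye(n)
    return R

def characteristic_when_diagonalizable(A):
    """If A_1 = q(A, A, >= 1) = 0, then p = max{t : d(x, A, >= t) != 1}."""
    d_A = A.charpoly(x).as_expr()
    t = 1
    while sp.degree(q_geq(d_A, t + 1), x) > 0:
        t += 1
    return t
```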

Suppose that \(A_1\ne 0\) and \(\mathop {{\mathrm{Ker}}}\nolimits (A_1)=\mathop {{\mathrm{Lin}}}\nolimits \{v_1,\dots ,v_s\}\). Note that a basis of the kernel of \(A_1\) can be found as the orthogonal complement of the linear span of the rows of \(A_1\). Suppose that \(v_{s+1},\dots ,v_n\) is a basis of the column space of \(A_1^T\). Let \(C_1\) be the matrix whose columns are the vectors \(v_1,\dots ,v_n\). Fix an index \(j\le s\) and let \(e_1,\dots ,e_n\) be the basis of unit vectors. By the definition of the matrix \(C_1\) we have \(C_1e_j\in \mathop {{\mathrm{Lin}}}\nolimits \{v_1,\dots ,v_s\}\), \(AC_1e_j\in \mathop {{\mathrm{Lin}}}\nolimits \{v_1,\dots ,v_s\}\), and \(C_1^{-1}AC_1e_j\in \mathop {{\mathrm{Lin}}}\nolimits \{e_1,\dots ,e_s\}\). Hence the matrix \(C_1^{-1}AC_1\) is block upper triangular:

$$\begin{aligned} C_1^{-1}AC_1=\left( \begin{array}{cc} M_{11}&{}M_{12}\\ 0 &{} M_{22}\end{array}\right) . \end{aligned}$$
(4.3)

Further, from the arguments above it follows that there is a one-to-one correspondence between the eigenvectors of \(A\) and the eigenvectors of \(M_{11}\). Hence the complete controllability characteristics of the matrices \(A\) and \(M_{11}\) coincide. Note that \(M_{11}\) has a basis consisting of eigenvectors.
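
The deflation step (4.3) can be sketched as follows; deflate is our name, and the bases are obtained exactly as described above (the kernel basis from the nullspace of \(A_1\), the complement from the column space of \(A_1^T\)):

```python
def deflate(A):
    """Assuming A_1 = q(A, A, >= 1) != 0: build C_1 from a kernel
    basis of A_1 and a basis of the column space of A_1^T, and
    return the block M_11 of C_1^{-1} A C_1, cf. (4.3)."""
    A1 = poly_at(q_geq(min_poly(A), 1), A)
    ker = A1.nullspace()                  # v_1, ..., v_s
    comp = A1.T.columnspace()             # v_{s+1}, ..., v_n
    C1 = sp.Matrix.hstack(*(ker + comp))
    s = len(ker)
    T = C1.inv() * A * C1                 # block upper triangular
    return T[:s, :s], C1
```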

Remark 4.1

The set \(\mathop {{\mathrm{Ker}}}\nolimits (d(A,M_{11},=t))\) is the linear span of all eigenvectors of \(A\) whose corresponding eigenvalues have geometric multiplicity exactly \(t\).

4 Auxiliary Statements

To describe a method of constructing the matrix \(B\) without finding the eigenvalues of the matrix \(A\), we need two auxiliary statements. The proof of the first of them is trivial, and we omit it.

Theorem 4.2

Let \(\lambda _1, \dots , \lambda _s\) be eigenvalues of a matrix \(A\) and let \(h\in \mathop {{\mathrm{Ker}}}\nolimits (\prod \limits _{j=1}^s (A-\lambda _jE))\), \(h\ne 0\). If \(q\in \mathbb {N}\) is such that the vectors \(h, Ah, A^2h, \dots , A^{q-1}h\) are linearly independent and the vectors \(h, Ah, A^2h, \dots , A^qh\) are linearly dependent, then there exist eigenvectors \(z_1, \dots , z_q\in \mathop {{\mathrm{Ker}}}\nolimits (\prod \limits _{j=1}^s (A-\lambda _jE))\) such that \(h=z_1+z_2+\dots +z_q\). Conversely, if there exist eigenvectors \(z_1, \dots , z_q\in \mathop {{\mathrm{Ker}}}\nolimits (\prod \limits _{j=1}^s (A-\lambda _jE))\) such that \(h=z_1+z_2+\cdots +z_q\), then the vectors \(h, Ah, A^2h, \dots , A^{q-1}h\) are linearly independent and the vectors \(h, Ah, A^2h, \dots , A^qh\) are linearly dependent.
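
The number \(q\) in Theorem 4.2 is the length of the maximal linearly independent Krylov chain of \(h\); a short sketch:

```python
def krylov_chain(A, h):
    """Return [h, Ah, ..., A^{q-1} h]: the longest linearly
    independent Krylov chain of h (the q of Theorem 4.2)."""
    chain, v = [], h
    while sp.Matrix.hstack(*(chain + [v])).rank() == len(chain) + 1:
        chain.append(v)
        v = A * v
    return chain
```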

Theorem 4.3

Let \(\pi _k(x)\) be a polynomial of degree \(t_k\) such that \(\mathop {{\mathrm{Ker}}}\nolimits (\pi _k(A))\) is the linear span of the eigenvectors of the matrix \(A\) of geometric multiplicity exactly \(k\). Then, without finding the eigenvalues of the matrix \(A\), one can construct vectors \(w_{1k},\dots ,w_{kk}\) such that \(\mathop {{\mathrm{Ker}}}\nolimits (\pi _k(A))=\mathop {{\mathrm{Lin}}}\nolimits \{w_{1k},Aw_{1k},\dots ,A^{t_k-1}w_{1k},\dots ,w_{kk},Aw_{kk},\dots ,A^{t_k-1}w_{kk}\}\).

Proof

We present an algorithm each step of which reduces either \(k\) or \(t_k\) by \(1\). Take an arbitrary non-zero vector \(h_1\in \mathop {{\mathrm{Ker}}}\nolimits (\pi _k(A))\). There is a \(q\in \mathbb {N}\) such that the vectors \(h_1,Ah_1,\dots ,A^{q-1}h_1\) are linearly independent and the vectors \(h_1,Ah_1,A^2h_1,\dots ,A^qh_1\) are linearly dependent. By Theorem 4.2, we obtain that \(q\le t_k\). If \(q=t_k\), then we set \(w_{1k}=h_1\) and, demanding additionally that the next vector \(h_2\) be orthogonal to all vectors \(A^jh_1\), we obtain that \(k\) has decreased by \(1\). If \(q<t_k\), then, by Theorem 4.2, there are eigenvectors \(z_1,\dots ,z_q\) such that \(h_1=z_1+\dots +z_q.\) Extend the system of vectors \(v_1=h_1\), \(v_2=Ah_1,\dots ,v_q=A^{q-1}h_1\) by orthogonal vectors \(v_{q+1},\dots ,v_n\) to a basis of the whole space. Let \(C_2\) be the matrix whose columns are the vectors \(v_1,\dots ,v_n\). As in the previous section, the matrix \(C_2^{-1}AC_2\) is block upper triangular. Denote by \(N_{11}\) its upper left block. It remains to note that there exists a polynomial \(\tilde{p}(x)\) such that

$$\begin{aligned} \pi _k(x)=d(x,N_{11},\ge 1)\tilde{p}(x). \end{aligned}$$

Thus, the problem for the polynomial \(\pi _k(x)\) is reduced to the problems for the polynomials \(d(x,N_{11},\ge 1)\) and \(\tilde{p}(x)\), the sum of whose degrees equals \(t_k\). This means that in this case \(t_k\) has decreased by at least \(1\).
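
For illustration, here is a sketch of the generic branch \(q=t_k\) of this algorithm only (the deflation branch \(q<t_k\) is omitted). It reuses krylov_chain from the previous sketch; the random choice of \(h\) is our simplification, since a generic vector of \(\mathop{\mathrm{Ker}}(\pi _k(A))\) yields a full-length chain:

```python
import random

def chains_for_pi_k(A, Pk, k, t_k):
    """Generic branch only: draw random vectors h in Ker(pi_k(A))
    (Pk = pi_k(A)), orthogonalize h against the chain vectors kept
    so far, and accept h as the next w_{jk} when its Krylov chain
    has full length t_k."""
    ker = Pk.nullspace()                  # Ker(pi_k(A)) is A-invariant
    found, kept = [], []
    for _ in range(100 * k):              # a generic h succeeds at once
        h = sum((random.randint(-9, 9) * v for v in ker),
                sp.zeros(A.rows, 1))
        for u in kept:                    # orthogonality, as in the proof
            h = h - (u.dot(h) / u.dot(u)) * u
        if h == sp.zeros(A.rows, 1):
            continue
        chain = krylov_chain(A, h)
        if len(chain) == t_k:
            found.append(h)
            kept.extend(chain)
            if len(found) == k:
                return found              # w_{1k}, ..., w_{kk}
    raise RuntimeError("deflation branch (q < t_k) needed; not sketched")
```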

5 The Case of the Absence of Associated Vectors

Let us discuss the method of constructing \(B\) in the case when the matrix \(A\) has a basis consisting of its eigenvectors. In this case we construct the polynomials \(\pi _k(x)= d(x,A,=k)\) and, using Theorem 4.3, obtain the vectors \(w_{1k},\dots ,w_{kk}\) for each of these polynomials. Let us denote

$$\begin{aligned} b_j=\sum \limits _{k=j}^pw_{jk}. \end{aligned}$$

It remains to note that the matrix \(B\) with columns \(b_1,\dots ,b_p\) is the sought-for matrix.
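
A sketch of this final assembly, assuming the vectors \(w_{jk}\) are stored in a dictionary keyed by \((j,k)\) (our data layout; pairs for which \(d(x,A,=k)=1\) are simply absent):

```python
def assemble_B(w, p, n):
    """b_j = sum_{k=j}^{p} w_{jk}; w maps (j, k) to the vectors
    from Theorem 4.3, and missing pairs are skipped."""
    cols = []
    for j in range(1, p + 1):
        b = sp.zeros(n, 1)
        for k in range(j, p + 1):
            if (j, k) in w:
                b += w[(j, k)]
        cols.append(b)
    return sp.Matrix.hstack(*cols)        # B = [b_1 | ... | b_p]
```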

6 The Case of General Position

Let \(\tau \) be the degree of the polynomial \(q_A(x)\). Let \(V_1=\mathop {{\mathrm{Ker}}}\nolimits (q(A,A,\ge 1))\), let \(V_2\) be the set of all vectors from \(\mathop {{\mathrm{Ker}}}\nolimits (q(A,A,\ge 2))\) orthogonal to \(V_1\), let \(V_3\) be the set of all vectors from \(\mathop {{\mathrm{Ker}}}\nolimits (q(A,A,\ge 3))\) orthogonal to \(V_1+V_2\), and so on. Notice that to find a basis in any \(V_j\) it suffices to use the orthogonalization method. Let \(W_1\) be the set of vectors from \(V_1\) orthogonal to \(AV_2\), let \(W_2\) be the set of vectors from \(V_2\) orthogonal to \(AV_3\), and so on. A basis in each of the spaces \(W_j\) can also be found by the orthogonalization method. Let us consider the mapping

$$\begin{aligned} g:\sum \limits _{j=1}^\tau W_j\rightarrow V_1. \end{aligned}$$

This mapping associates to a vector \(w_j\in W_j\) the vector \(gw_j\in V_1\) that is the orthogonal projection of the vector \(A^{j-1}w_j\) onto \(V_1\). The mapping \(g\) is invertible: it suffices to note that \(g\) is linear and has zero kernel, and to fix a basis in each of the \(W_j\) together with the images of these bases under \(g\).
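
A sketch of this construction, assuming V1 is a matrix whose columns form a basis of \(V_1\) and W_bases is a list of bases of \(W_1,\dots ,W_\tau \) (our data layout):

```python
def projector_onto(V):
    """Orthogonal projector onto the column space of V
    (columns of V assumed linearly independent)."""
    return V * (V.T * V).inv() * V.T

def g_matrix(A, W_bases, V1):
    """Columns are the images g(w) = proj_{V_1}(A^{j-1} w) over the
    basis vectors w of W_1, W_2, ..., in order."""
    P, cols = projector_onto(V1), []
    for j, Wj in enumerate(W_bases, start=1):
        cols += [P * A ** (j - 1) * w for w in Wj]
    return sp.Matrix.hstack(*cols)

def g_inverse(A, W_bases, V1, b_tilde):
    """Solve g c = b_tilde in the chosen bases; exact, since g is
    injective and b_tilde lies in the image of g."""
    G = g_matrix(A, W_bases, V1)
    c = G.solve_least_squares(b_tilde)
    W = sp.Matrix.hstack(*[w for Wj in W_bases for w in Wj])
    return W * c                          # the preimage g^{-1} b_tilde
```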

Let us describe the method of constructing the matrix \(B\) in the case of general position. As in the case of the absence of associated vectors, we construct the polynomials \(\pi _k(x)=d(x,M_{11},=k)\) and, applying Theorem 4.3 to each polynomial, obtain the vectors \(w_{1k},\dots ,w_{kk}\). Further, we put

$$\begin{aligned} \tilde{b}_j=\sum \limits _{k=j}^pw_{jk}, \end{aligned}$$
$$\begin{aligned} b_j=g^{-1}\tilde{b}_j. \end{aligned}$$

It remains to note that the matrix \(B\) with columns \(b_1,\dots ,b_p\) is the sought-for matrix.