1 Introduction

Pipelines are a basic infrastructure widely used to transport oil, gas, and other fluids to industries and homes. Pipelines age over time and face the danger of breakage because of cracks, corrosion, and third-party damage from excavation and digging, among other causes. Thus, periodic maintenance is required to prevent these problems. However, pipelines are typically installed underground or intricately intertwined, making it quite difficult for workers to access them. These problems can be addressed by in-pipe robots, which can autonomously navigate and perform tasks inside pipelines. However, the robot system should be able to recognize the pipeline configuration in advance to be deployed inside the pipelines. To navigate autonomously, the robot must determine suitable driving directions based on information about the pipeline elements and perform the corresponding control actions. In addition, the information obtained while the robot passes through unknown pipelines can be used to reconstruct the overall pipeline configuration.

Several in-pipe robots have been developed to date. MAKRO uses infrared (IR) sensors and stereo cameras for autonomous navigation inside sewer pipelines [3,4,5,6,7]. It also uses image processing to recognize the inlets of branches as landmarks because they are always located on the side of the sewer pipeline. KANTARO is a single-module robot equipped with a stereo camera on the front side and a laser scanner on the back side [8,9,10,11]. KANTARO recognizes manholes and inlets through pattern matching of the camera images, and the distance to a manhole can be measured from the similarities and differences between the two images of the stereo camera. The MRINSPECT series provides an even more robust method for recognizing pipeline elements by using a CCD camera and an illuminator for image processing [12,13,14,15,16,17,18,19]. MRINSPECT creates a landmark and recognizes pipeline elements by using the camera image. However, this approach is vulnerable to the lighting and surface conditions inside the pipeline. In reality, most underground pipelines do not allow robots to communicate with the outside world or use external sensors, such as a global positioning system (GPS). Thus, recognizing the pipeline geometry is extremely important for navigating inside pipelines.

This study proposes a method to recognize the geometry inside pipelines by using a monocular camera and position sensitive device (PSD) sensors. The recognition method using PSD sensors and the concept of the method combining a monocular camera with PSD sensors were presented in our previous research [1, 2]. The present work extends that research with a simpler elbow-detection method, a method for estimating the radius of curvature of an elbow, and an integrated recognition system. The method can recognize pipeline elements such as a straight pipe, an elbow, and a branch; it can even distinguish the T-branch from the miter. Moreover, it determines the opening direction of the elbow and the branch more accurately. Accordingly, we develop a sensor suite and implement it in the in-pipe robot MRINSPECT VI to experimentally validate the proposed method.

This paper is organized as follows: the motivation for this research is addressed in Sect. 2; the recognition algorithm is explained in Sect. 3; the sensor suite design is introduced in Sect. 4; Sect. 5 briefly summarizes the recognition method using the shadow image and addresses its advantages and limitations; the recognition method using the PSD sensors is explained in Sect. 6; the experimental results are presented in Sect. 7; and the conclusions and future work are provided in Sect. 8.

Fig. 1 The shape of the T-branch (a) and the miter (b)

2 Motivations

Pipelines generally consist of several primary elements, such as straight pipelines, elbows, and branches. Branches are divided into T-branches and miters. T-branches are commonly used in initial pipeline construction. In contrast, miters are installed when existing pipelines are changed or rebuilt, for example, when another pipeline is added to the existing ones by welding or when a launcher is temporarily installed on the pipelines to deploy an in-pipe robot.

The shapes of the T-branch and the miter are geometrically distinct, as shown in Fig. 1. The T-branch resembles two elbows reversely attached to each other, whereas the miter resembles two straight pipelines welded at a right angle. Thus, a robot can pass through the T-branch along the same path as in an elbow. The miter, however, is totally different because of its sharp edge, so a robot needs even more elaborate control to pass through a miter than a T-branch. In the case of Explorer [20], one of the best-known commercially available in-pipe robots, the robot requires complicated path planning and a significant amount of time to pass through the miter, and even small errors sometimes result in failure of the robot operation. Therefore, the strategy for moving through the miter should be totally different from that for the T-branch. However, researchers have not yet thoroughly investigated how to handle these elements. In fact, preliminary recognition is essential for passing through these elements because the path direction must be determined more accurately than before, and a more exact pipeline recognition technology is therefore necessary.

The method of employing a monocular camera to recognize the interior of a pipeline was previously reported in [12,13,14]. This method applies image processing to the shadow in the camera image; from the shape of the shadow, the path direction of an elbow or a T-branch can be found. However, the T-branch and the miter cannot be distinguished using this method. In addition, image processing techniques are quite vulnerable to environmental conditions, such as lighting and surface status. Meanwhile, the camera allows pipeline elements to be recognized ahead of time, which is a significant advantage. In this paper, we propose a method of applying range sensors, namely PSD sensors, to improve the robustness and accuracy of recognition. The drawback of the image processing is compensated by the PSD sensors, and as a result, we can recognize the pipeline geometry more accurately.

Fig. 2 Schematic algorithm

3 Proposed recognition algorithm

3.1 Overview

Fig. 2 shows that the pipeline elements are classified into elbows, T-branch types I and II, and miter types I and II depending on their geometries and the direction of the robot's approach. The most important requirement is to recognize the pipeline elements in advance, and in this regard, the image processing technique is advantageous. The other sensors can only work in the vicinity of the pipeline element because of their limited sensing range. In the case of distance sensors (e.g., IR, ultrasonic, and PSD sensors), a pipeline element can be recognized in advance only if there is a wall in front of the robot; these sensors therefore cannot be used in the case of a branch type I, as shown in Fig. 2. In contrast, a camera can be used to recognize all the pipeline elements. However, the direction acquired by the camera can be inaccurate because the camera is highly sensitive to environmental conditions. Therefore, we used the information acquired by image processing to classify the pipeline elements in advance, and this image information also provides supplementary cues for detecting further details of the pipeline geometry, such as the direction of the openings.

Fig. 3 Procedure of the integrated sensing system

Distance information is useful for finding a more accurate geometry and the details of the pipeline interior. A PSD sensor, an ultrasonic sensor, and a laser range finder can be considered as candidates for measuring the distance data. However, ultrasonic sensors are not appropriate inside pipelines because of interference. Laser range finders are commonly used in mobile robot navigation [21,22,23,24], but they are large and consume considerable computing resources. Thus, the PSD sensor is employed in this work. Once the pipeline elements are classified using image processing, the angle of the hole direction is detected, and the T-branch and the miter are distinguished using the PSD sensors. The details of the proposed method are explained in Sect. 6.

3.2 Recognition method for the pipeline environments

The proposed method uses a camera to sense at long distances and a PSD sensor at close distances. Various pipeline elements are classified into four types by using both sensors, and the hole direction of the pipeline elements is accurately determined. The situations encountered as the robot travels step by step through the pipeline in Fig. 3 are explained as follows:

  1. Step 1

    The situation here is such that the robot travels through a straight pipeline. Image processing always works in this case, and the PSD sensor data are not important until any other pipeline element is recognized.

  2. Step 2

In this step, the robot recognizes a pipeline element by using the camera and image processing. The shadow image is matched with a sample image, and the element type and the pipeline hole direction can be identified. The PSD sensor data are still not used.

  3. Step 3

In this step, the robot travels closer to the element although the sensor suite has not yet entered the starting point of the element. The estimation using the shadow is terminated because the shadow area becomes too wide to be matched. At this stage, the PSD sensor returns meaningless values because the element is still beyond its range. When the sensor suite moves into the pipe element, the sensor provides reasonable values and step 4 begins.

  4. Step 4

    In the final step, only the PSD sensor data are used for the recognition. The hole direction and its classification are sensed using the PSD sensors as the sensor suite enters the element.

The details will be explained in the next sections; the four steps are summarized in the sketch below.
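For clarity, the four-step procedure can be written as a small state machine. The following is a minimal sketch under stated assumptions: the three helper functions are hypothetical placeholders standing in for the image-processing routine of Sect. 5 and the PSD routine of Sect. 6, not the actual implementation.

```python
# Minimal state-machine sketch of the four-step recognition procedure.
# shadow_match(), shadow_too_large(), and psd_classify() are hypothetical
# placeholders for the routines described in Sects. 5 and 6.

PSD_MIN_MM, PSD_MAX_MM = 40, 300  # sensing range of the SHARP GP2Y0A41SK


def shadow_match(image):
    """Placeholder: pattern-match the shadow; returns (element, hole
    direction) when an element is visible, otherwise None (Sect. 5)."""
    return None


def shadow_too_large(image):
    """Placeholder: True when the shadow fills too much of the view."""
    return False


def psd_classify(distances_mm):
    """Placeholder: classify the element from the PSD data (Sect. 6)."""
    return "elbow", 0.0


def recognition_step(state, image, distances_mm):
    """Advance the four-step procedure by one sensing cycle."""
    if state == 1 and shadow_match(image) is not None:
        return 2                  # Step 2: element recognized in the image
    if state == 2 and shadow_too_large(image):
        return 3                  # Step 3: camera estimate terminated
    if state == 3 and any(PSD_MIN_MM <= d <= PSD_MAX_MM for d in distances_mm):
        return 4                  # Step 4: PSD sensors now in range
    if state == 4:
        element, hole_dir = psd_classify(distances_mm)  # PSD-only recognition
    return state
```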

4 Sensor hardware design

Two design requirements should be considered before developing the sensor hardware. First, the sensor suite needs to pass through the pipeline elements without collision. Second, the distance between the inner wall of the pipeline and the PSD sensors of the sensor suite should be larger than the minimum range of the PSD sensor.

Fig. 4 shows the relationship among the external dimensions. The width W is the diameter of the sensor suite, the height h is that of the sensor suite (its length along the direction of travel), D is the diameter of the pipeline, and R is the radius of curvature of the turn. Assuming that the sensor suite passes through the pipeline along the centerline, the condition can be expressed using the Pythagorean theorem as follows:

$$\begin{aligned} h_\mathrm{max}=2\sqrt{(R+D/2)^{2}-(R+W/2)^{2}}. \end{aligned}$$
(1)

The sensor suite size was determined by the two variables h and W. For example, assuming that the width W of the sensor suite was 95 mm, the maximum height of the sensor suite was determined to be 250 mm using Eq. (1). The height of the sensor suite in this study was set to 130 mm to produce an even smoother movement and to remove wasted space. The second condition concerns the minimum range of the PSD sensor, measured between the sensor suite and the pipeline elements. The elbow is a harsher environment than the miter in this respect because the sensor suite must be located closer to the wall to detect the hole direction (Fig. 5). The side PSD sensors were therefore placed on a circle of 85 mm diameter because the direction opposite the hole has the shortest distance.
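To illustrate Eq. (1) numerically, the short sketch below evaluates the constraint; the turn radius R used in the example is an illustrative assumption, chosen only so that the quoted h_max of roughly 250 mm is reproduced for W = 95 mm.

```python
import math


def h_max(R, D, W):
    """Maximum sensor-suite height from Eq. (1).
    R: radius of curvature of the turn, D: pipeline diameter,
    W: sensor-suite width (all in mm)."""
    return 2.0 * math.sqrt((R + D / 2.0) ** 2 - (R + W / 2.0) ** 2)


# W = 95 mm is the width used in this study; R and D below are
# illustrative assumptions, not values stated in the paper.
print(round(h_max(R=225.0, D=150.0, W=95.0)))  # -> 251, i.e., about 250 mm
```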

Fig. 4 Condition of the sensor suite dimension at the miter

Fig. 5 Condition of the sensor suite dimension for the PSD sensor

Fig. 6 Sensor suite of the MRINSPECT VI

Figure 6 shows that the sensor suite is composed of a camera (Logitech Webcam C905), LED lights, and 10 PSD sensors (two at the front and eight on the side along the radial direction, SHARP GP2Y0A41SK) with a sensing range from 40 mm to 300 mm. The sensor suite was attached to the front side of the robot (i.e., MRINSPECT VI) shown in Fig. 7. The analog data acquired by each PSD sensor were converted into digital data by using a 10-bit A/D converter. An IMU sensor (Xsens Inc.) mounted in the sensor suite was used to estimate the pose of the sensor suite.

The front PSD sensors of the sensor suite can be used to estimate the radius of curvature of the elbow and to measure the distance from the robot to an obstacle in front. The side PSD sensors detect the radial distances to the inner pipeline wall. The number and layout of the side PSD sensors should be carefully determined because they are directly related to the sensing algorithm. The angle between each side PSD sensor and its neighbor is \(45^{\circ }\). Consequently, eight PSD sensors are sufficient to recognize most of the pipeline elements by using the recognition method addressed in the following section.

Fig. 7 Driving module of the MRINSPECT VI

5 Recognition of pipeline elements by using shadow image

The reader is referred to [12] for a detailed explanation of the recognition algorithm using shadow images; a brief introduction is presented in this section. First, the image obtained by the camera is converted to grayscale to simplify it. The image is then converted to a binary image based on a threshold, and the noise is filtered out, as shown in Fig. 8a. Each shadow extracted from the image is given an ID (Fig. 8b) so that small spurious shadows and noise can be removed through size filtering. The center point (\({x_{avg}}\), \({y_{avg}}\)) of the labeled region with respect to the center frame of the image is calculated, as shown in Fig. 8c, to estimate the hole direction. We can then calculate the angle \({\theta }\) and the length L between the center point of the screen and that of the shadow region, and estimate the hole direction from the current robot position by using the angle \({\theta }\). Finally, as shown in Fig. 8d, pattern matching is used to compare the candidate shadow with the sample image to determine whether the shadow is a real path shadow.
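As a rough illustration, these steps map onto standard OpenCV calls. The sketch below is a minimal, assumed reconstruction; the threshold and minimum-area values are hand-picked placeholders, not the parameters used in [12].

```python
import cv2
import numpy as np


def extract_shadow_candidates(frame, thresh=40, min_area=500):
    """Sketch of the shadow-extraction steps of Sect. 5.
    thresh and min_area are illustrative, not the paper's values."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)     # a: grayscale
    gray = cv2.medianBlur(gray, 5)                     # a: noise filtering
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY_INV)
    # b: label connected shadow regions; c: size filtering and geometry.
    n, _, stats, centroids = cv2.connectedComponentsWithStats(binary)
    h, w = gray.shape
    candidates = []
    for i in range(1, n):                              # label 0 is background
        if stats[i, cv2.CC_STAT_AREA] < min_area:
            continue                                   # remove small shadows
        x_avg, y_avg = centroids[i]
        dx, dy = x_avg - w / 2.0, y_avg - h / 2.0
        theta = float(np.degrees(np.arctan2(dy, dx)))  # hole-direction angle
        L = float(np.hypot(dx, dy))                    # offset from center
        candidates.append((theta, L))
    # d: each candidate would then be pattern-matched against a sample image.
    return candidates
```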

Fig. 8 Steps in image processing. a Extracting the shadow region of the image; b labeling; c extracting information of the shadow region; and d pattern matching

Fig. 9 Detecting branch type II in the T-branch (a) and the miter (b)

The pipeline shape is recognized through the pipeline-type decision process if the shadow is a real path shadow. Figure 2 shows the decision process. The abovementioned process uses the shadow to classify the pipeline elements as an elbow, a T-branch type I, or a T-branch type II.

The first advantage of using the shadow image is that no additional equipment is needed because a camera is always installed in an in-pipe robot to monitor the interior of the pipelines. However, some limitations should also be considered. First, distinguishing additional geometric features of the pipeline by using only the information obtained through image processing is difficult. Figure 9 shows the results of sensing by using the shadow-based method: both the T-branch and the miter are recognized as a T-branch type II. Second, a significant drawback is that an element can only be recognized when the shadow is similar in size to that of the sample image because a pattern matching method is used. Hu's invariant moments were applied to the pattern matching to overcome this problem [25,26,27]. However, this works only when an ideal shadow is obtained; rusted spots in the pipelines are sometimes recognized as an elbow. Therefore, using this method in a real setting is difficult. Finally, the size and shape of the shadow differ according to the lighting conditions, in which case the estimated path direction is incorrect. Some of these weaknesses can be overcome if a linear laser pattern is used instead of a shadow during image processing [13]. However, even that method cannot distinguish the T-branch from the miter. Thus, we use image processing with the shadow to classify the type of pipeline element in this study, and the information acquired through this method is supplemented by another sensor.

Fig. 10 Characteristics of the measurements by using the PSD sensors. a Straight pipe; b elbow pipe; c branch (miter) pipe type I; and d branch (miter) pipe type II

6 Recognition of the pipe configuration by using the PSD sensors

6.1 Detection of the pipeline elements

A second recognition method using the PSD sensors was used to determine the type of pipeline element and to recognize the precise configuration of the pipeline, including the hole direction and the radius of curvature.

Table 1 Detection of the pipeline element type

The pipeline elements were divided into straight, elbow, branch type I, and branch type II. Figure 10 shows the situations in which the sensor suite recognizes the pipeline elements. In a straight pipeline, the front PSD sensor detected nothing because it looked far ahead and no wall existed within its range, whereas the side PSD sensors measured similar distances to the pipeline walls. In the elbow, both the front and side PSD sensors detected the pipeline walls whenever they were within range. The branch was classified into types I and II according to the direction from which the robot approached: for branch type I, the front PSD sensor and one of the side PSD sensors cannot detect the walls, whereas for branch type II, the front PSD sensor and two side PSD sensors located in opposite directions cannot detect the walls. Table 1 summarizes these PSD responses, and the decision logic is sketched below.
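The decision reduces to checking which sensors return an in-range reading. A minimal sketch under stated assumptions: out-of-range readings are represented here as None, and the mapping follows the summary above (cf. Table 1):

```python
PSD_MAX_MM = 300  # upper limit of the GP2Y0A41SK sensing range


def classify_element(front_mm, side_mm):
    """Classify the pipeline element from which PSD sensors see a wall
    (cf. Table 1). front_mm: front-sensor distance, or None when out of
    range; side_mm: eight side-sensor distances, None marking an opening."""
    side_open = sum(1 for d in side_mm if d is None or d > PSD_MAX_MM)
    front_open = front_mm is None or front_mm > PSD_MAX_MM
    if front_open and side_open == 0:
        return "straight"
    if not front_open and side_open == 0:
        return "elbow"
    if front_open and side_open == 1:
        return "branch type I"
    if front_open and side_open == 2:
        return "branch type II"
    return "unknown"


# Example: wall ahead and all side walls visible -> elbow.
print(classify_element(120.0, [90.0] * 8))
```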

6.2 Straight pipeline

If no other pipeline element is present in a straight pipeline, the front PSD sensor looks far ahead, and hence its data were very low or 0. In addition, the side PSD sensors read values similar to each other because the sensor suite was located almost in the middle of the pipeline. Moreover, these data change very little as the sensor suite moves forward. In general, recognizing a straight pipe is not important in itself. However, it is very useful for recognizing the starting position of a straight pipeline after passing through elbows or T-branches.

6.3 Branch

Assuming that the sensor suite was at the center of the branch, the section measured by the PSD sensors was as depicted in Fig. 11, with the sensor suite located midway between the two parallel walls. Thus, the angle between the two side PSD sensors adjacent to the hole direction (the direction of the opening) was always \(90^{\circ }\), and the diagonal of the rectangle formed by their two distance readings always pointed in the hole direction. To build the mathematical model, each PSD sensor reading was treated as a vector whose direction was the sensor's direction and whose magnitude was the measured distance. If the ith PSD sensor pointed near the hole direction, the direction of the vector sum of the \(i+1\)th and \(i-1\)th PSD sensor readings was then always the hole direction. Because the PSD sensors were fixed on the robot, we can calculate the angle between the direction of the first PSD sensor and the hole. Before this calculation, we must first find the PSD sensor that points near the hole direction. When the sensor suite was inserted into the branch, each PSD sensor returned data depending on the distance. If the ith PSD sensor pointed near the hole direction, the distance exceeded a fiducial value, and the raw sensor output at that time was low or almost 0. Conversely, if the data from a sensor were low or 0, a hole lay near that sensor's direction.

Fig. 11 PSD sensor's direction on the section

We used the distance values measured by the \(i-1\)th and \(i+1\)th PSD sensors to calculate the hole direction angle. From these two distances, the angle between the direction vector of the \(i+1\)th PSD sensor and the direction vector of the hole can be expressed as follows:

$$\begin{aligned} \alpha = \arccos \left( l_{i+1}/\sqrt{(l_{i+1})^{2}+(l_{i-1})^{2}}\right) . \end{aligned}$$
(2)

The ith PSD sensor’s angle from the first PSD sensor is given by:

$$\begin{aligned} \theta _{i}=(i-1)\times 45. \end{aligned}$$
(3)

Therefore, the \(i+1\)th PSD sensor’s angle from the first PSD sensor is obtained as follows:

$$\begin{aligned} \theta _{i+1}=i\times 45. \end{aligned}$$
(4)

We can then calculate the angle of the hole direction from the direction of the first PSD by using \({\theta _{i+1}}\) and \({\alpha }\) as follows:

$$\begin{aligned} \theta _{\mathrm{hole-branch}}=i\times 45-\alpha . \end{aligned}$$
(5)
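Combining Eqs. (2)–(5), the hole angle follows directly from the two neighboring distance readings. A minimal sketch (the sensor index i is 1-based, matching Eq. (3)):

```python
import math


def branch_hole_angle(i, l_prev, l_next):
    """Hole direction measured from the first PSD sensor, Eqs. (2)-(5).
    i: 1-based index of the sensor facing the hole;
    l_prev, l_next: distances from the (i-1)th and (i+1)th sensors (mm)."""
    alpha = math.degrees(math.acos(l_next / math.hypot(l_next, l_prev)))  # Eq. (2)
    return i * 45.0 - alpha                                               # Eq. (5)


# Example: sensor 3 faces the hole and its neighbors read equal distances,
# so the hole lies exactly along sensor 3, i.e., (3 - 1) * 45 = 90 degrees.
print(branch_hole_angle(3, 150.0, 150.0))  # 90.0
```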

6.4 Elbow

The same method as that used in the branch could not be applied to the elbow because the method assumed the sensor suite to be located at the element's center; the sensor suite would collide with the wall before reaching the elbow's center, as shown in Fig. 12.

Fig. 12 Situation in the elbow

A different method therefore had to be developed to detect the elbow. The elbow is characterized by the direction of its opening relative to the approaching direction of the robot. To detect this direction, a mathematical model of the cross-section relative to the approaching direction was first developed. The mathematical model of the elbow consists of a circle rotated \(90^{\circ }\) about the X-axis. The equation of the circle with center point (0, \(2r\times d\), 0) can be represented as follows:

$$\begin{aligned} \begin{bmatrix}x\\ y\\ z\end{bmatrix}=\begin{bmatrix}r\cos {\beta }\\(r\sin {\beta }+2r\times d)\\ 0\end{bmatrix}, \end{aligned}$$
(6)

where d is the radius of curvature expressed in multiples of the pipe diameter, so that the center of the circle lies at a distance \(2r\times d\) from the origin. The range of the angle \(\beta \) is given by the following equation:

$$\begin{aligned} 0^{\circ }\le \beta \le 360^{\circ }. \end{aligned}$$
(7)

The circle should then be rotated about the X-axis to form the elbow shape. Therefore, the equation of the elbow is the equation of the circle multiplied by the rotation matrix about the X-axis:

$$\begin{aligned} \begin{bmatrix} x'\\ y'\\ z' \end{bmatrix}= & {} \begin{bmatrix} 1&\quad 0&\quad 0 \\ 0&\quad \cos \alpha&\quad -\sin \alpha \\ 0&\quad \sin \alpha&\quad \cos \alpha \end{bmatrix} \begin{bmatrix} r\cos \beta \\ r\sin \beta +2r\times d\\ 0 \end{bmatrix}\nonumber \\= & {} \begin{bmatrix} r\cos \beta \\ (r\sin \beta +2r\times d)\cos \alpha \\ (r\sin \beta +2r\times d)\sin \alpha \end{bmatrix} , \end{aligned}$$
(8)

where the range of angle \(\alpha \) is given by the following equation:

$$\begin{aligned} 0^{\circ }\le \alpha \le 90^{\circ }. \end{aligned}$$
(9)

However, the section calculated by Eq. (8) differs from that measured by the PSD sensors. Because the section measured by the PSD sensors is parallel to the XY plane, the equation of the elbow is represented as follows:

$$\begin{aligned} \begin{bmatrix}{x}^{''}\\{y}^{''}\\ {z}^{''}\end{bmatrix}=\begin{bmatrix}r\cos {\beta }\\(r\sin {\beta }+2r\times d)\cos ({\arcsin ({K})})\\ H\end{bmatrix}, \end{aligned}$$
(10)

where K is the sine of the angle \(\alpha \), determined by \(\beta \) and H as follows:

$$\begin{aligned} K=\frac{H}{r\sin {\beta }+2r\times d}. \end{aligned}$$
(11)

Fig. 13 Mathematical modeling of the elbow

Therefore, the cross-section geometry can be represented with Eqs. (6)–(11); Fig. 13 shows the results. In these equations, H represents the depth by which the sensor suite has moved into the elbow, as measured using the front PSD sensor. The new x and y coordinates can be calculated using the abovementioned equations when the sensor suite moves in the elbow at a constant depth H in the Z direction, as shown in Eqs. (10) and (11). To verify the equations, the variables were assumed as follows: the pipe diameter was 150 mm, the radius of curvature was 1.5D, and H was 100 mm. The resulting graph resembled an ellipse as \(\beta \) changed from \(0^{\circ }\) to \(360^{\circ }\) in Eqs. (10) and (11) (Fig. 14). The new origin \(O^{'}\), the center of the sensor suite, was equal to (0, \(2r\times d\), H). The angle between a new coordinate (\(x^{''}\), \(y^{''}\), H) and the new origin (0, \(2r\times d\), H) was denoted by \(\gamma \), and the PSD sensors' data were related to \(\gamma \). First, \(\gamma _{1}\) was taken as \(0^{\circ }\), and \(\gamma _{2}{\sim }\gamma _{8}\) increased in \(45^{\circ }\) increments. The sum of the vectors can then be calculated if each point is treated as the endpoint of a vector; this vector-sum direction lies close to the hole direction. The results of the vector sum are shown in Fig. 14, where \(\gamma _{1}\) was varied from \(0^{\circ }\) to \(45^{\circ }\) to check all the cases. In these equations, \(l_{i}\) represents the distance measured by the ith PSD sensor. The direction angle calculated using this method contained an error, but it was negligible.
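The cross-section of Eqs. (10) and (11) can be generated numerically. A minimal sketch using the same illustrative values (150 mm pipe, 1.5D curvature, H = 100 mm); for each β, the model first obtains K = sin α from Eq. (11):

```python
import numpy as np

r, d, H = 75.0, 1.5, 100.0   # pipe radius (mm), curvature (in D), depth (mm)

beta = np.radians(np.arange(0.0, 360.0, 1.0))
ring = r * np.sin(beta) + 2.0 * r * d          # r*sin(beta) + 2r*d
K = H / ring                                   # Eq. (11): K = sin(alpha)
x = r * np.cos(beta)                           # Eq. (10): x''
y = ring * np.cos(np.arcsin(K))                # Eq. (10): y''

# Shift to the new origin O' = (0, 2r*d, H) at the sensor-suite center and
# compute gamma, the in-plane angle of each point of the measured section.
gamma = np.degrees(np.arctan2(y - 2.0 * r * d, x)) % 360.0
print(x.min(), x.max())   # the section spans the pipe diameter in x
```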

Fig. 14 Recognition principle of the elbow direction

The principle behind the abovementioned derivation is complicated, but the resulting procedure for calculating the elbow's direction is very simple. As mentioned in Sect. 6.2, the distance data were measured by the PSD sensors, and each PSD sensor was installed at a regular angle. Therefore, the distance data can be converted into vectors, and their sum can be calculated. As in the case of the branch, the vector of the ith PSD sensor used to detect the direction angle from the first PSD sensor is expressed as follows:

$$\begin{aligned} \begin{bmatrix}x_{i}\\y_{i}\end{bmatrix}= \begin{bmatrix}l_{i}\cos (45\times (i-1))\\l_{i}\sin (45\times (i-1))\end{bmatrix}. \end{aligned}$$
(12)

Therefore, the sum of the vectors is represented by the following equation:

$$\begin{aligned} \begin{bmatrix}x_\mathrm{{sum}}\\y_\mathrm{{sum}}\end{bmatrix}= \begin{bmatrix}\sum \nolimits _{i=1}^{8}l_{i}\cos (45\times (i-1))\\\sum \nolimits _{i=1}^{8}l_{i}\sin (45\times (i-1))\end{bmatrix}. \end{aligned}$$
(13)

As shown in Fig. 15, the hole direction angle from the first PSD sensor was easily detected as follows:

$$\begin{aligned} \theta _\mathrm{{hole}-\mathrm {elbow}}=\arctan (y_\mathrm{{sum}}/x_\mathrm{{sum}}). \end{aligned}$$
(14)

This method was less sensitive to noise than the existing method, which used only the longest and second-longest distance values. Moreover, the method can be used regardless of the pipeline diameter and the radius of curvature, which is greatly advantageous in the pipeline environment.
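A minimal sketch of Eqs. (12)–(14), taking the eight side-sensor distances and returning the hole angle measured from the first PSD sensor (atan2 replaces the arctan of Eq. (14) to resolve the quadrant):

```python
import math


def elbow_hole_angle(distances_mm):
    """Hole direction from the first PSD sensor via the vector sum of
    Eqs. (12)-(14). distances_mm: eight side-sensor distances (mm),
    sensor i mounted at 45*(i-1) degrees."""
    x_sum = sum(l * math.cos(math.radians(45.0 * i))
                for i, l in enumerate(distances_mm))        # Eq. (13), x
    y_sum = sum(l * math.sin(math.radians(45.0 * i))
                for i, l in enumerate(distances_mm))        # Eq. (13), y
    return math.degrees(math.atan2(y_sum, x_sum)) % 360.0   # Eq. (14)


# Example: longer readings around the 5th sensor (180 deg) pull the
# vector sum toward it, so the hole is reported at 180 degrees.
print(elbow_hole_angle([100, 100, 100, 120, 200, 120, 100, 100]))  # 180.0
```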

Fig. 15 Measured section of the elbow

Fig. 16 Measurement principle of the radius of curvature

Fig. 17 Front view of the T-branch (a) and the miter (b)

Fig. 18 Principle of distinguishing the T-branch (a) and the miter (b)

Table 2 Theoretical distance for detecting the elbow’s radius of curvature

Another characteristic feature of the elbow is its curvature. Elbows have different curvatures depending on the application, but these are not arbitrary; standard radii of curvature (e.g., 1D, 1.5D, and 3D) were considered herein. They can thus be detected by calculating the difference between the values of the two front PSD sensors for each case. The two PSD sensors should be aligned with the hole direction before the radius of curvature is identified, with the sensor suite set as shown in Fig. 16. In the figure, d indicates the elbow's radius of curvature, r is the radius of the pipeline, and t represents the distance between each front PSD sensor and the center of the sensor suite, designed to be 20 mm. The theoretical distances between the sensors and the pipeline wall, \(x_1\) and \(x_2\), are then calculated as follows:

$$\begin{aligned} x_{1}= & {} (d\times 2r+r)\sin \left\{ \arccos \left( \frac{d\times 2r+t}{d\times 2r+r}\right) \right\} ,\end{aligned}$$
(15)
$$\begin{aligned} x_{2}= & {} (d\times 2r+r)\sin \left\{ \arccos \left( \frac{d\times 2r-t}{d\times 2r+r}\right) \right\} . \end{aligned}$$
(16)

The difference between \(x_1\) and \(x_2\) can be computed by substituting the radii of the 6- and 8-inch pipelines for r and each candidate curvature for d. The result is summarized in Table 2. The difference measured by the PSD sensors was matched against the values in the table to identify the radius of curvature.
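A minimal sketch of this lookup, reproducing the computation behind Table 2 from Eqs. (15) and (16); t = 20 mm and the candidate curvatures follow the text, while the nearest-value matching is an assumed implementation detail:

```python
import math

T_MM = 20.0  # offset of each front PSD sensor from the suite center


def wall_distance(d, r, t):
    """Eqs. (15)/(16): theoretical sensor-to-wall distance for a front
    sensor offset by +t or -t from the suite center (all lengths in mm)."""
    R = d * 2.0 * r  # radius of curvature in mm
    return (R + r) * math.sin(math.acos((R + t) / (R + r)))


def estimate_curvature(measured_diff_mm, r, candidates=(1.0, 1.5, 3.0)):
    """Return the candidate curvature (in D) whose theoretical x2 - x1
    difference is closest to the measured one (cf. Table 2)."""
    def diff(d):
        return wall_distance(d, r, -T_MM) - wall_distance(d, r, T_MM)
    return min(candidates, key=lambda d: abs(diff(d) - measured_diff_mm))


# Example for an 8-inch pipeline (r = 100 mm assumed for illustration).
print(estimate_curvature(30.0, r=100.0))  # -> 1.0 for this input
```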

6.5 Distinction between the T-branch and the miter

Figure 17 shows that distinguishing between the T-branch and the miter is difficult using images from the camera. Figure 18 illustrates that the geometric difference between the T-branch and the miter lies in the radius of curvature: the T-branch has a rounded curvature, whereas the radius of curvature of the miter is zero, so the pipeline direction in the miter changes abruptly. The PSD sensors can be used to detect this difference. The distance from the side PSD sensors to the wall was measured while the robot carrying the sensor suite moved slowly (Fig. 18): if the value changed by more than a given limit, the element was a miter; otherwise, it was a T-branch. The measured distance increased slowly in the T-branch and rapidly in the miter, making the PSD sensor very useful for detecting this difference in curvature.
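This reduces to a threshold on the sample-to-sample change of the sensor signal. A minimal sketch (the threshold of 50 ADC counts anticipates the value used in the experiments of Sect. 7.2; the sample sequences are made up for illustration):

```python
def is_miter(adc_samples, threshold=50):
    """Classify T-branch vs. miter from successive raw ADC readings of the
    side PSD sensor facing the hole. A jump larger than `threshold`
    between consecutive readings indicates the sharp edge of a miter."""
    return any(abs(curr - prev) > threshold
               for prev, curr in zip(adc_samples, adc_samples[1:]))


# A gradual decrease (T-branch) stays below the threshold,
# whereas an abrupt drop (miter) exceeds it.
print(is_miter([400, 390, 378, 365, 350]))   # False -> T-branch
print(is_miter([400, 395, 250, 240, 235]))   # True  -> miter
```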

7 Experiments

7.1 Outline of the MRINSPECT VI

MRINSPECT VI (Multifunction Robot for INSPECTion of pipeline), shown in Fig. 7, was used in the experiments. This robot was designed to drive in \(150{\sim }200\hbox { mm}\) gas pipelines. The robot comprises three passive and three active wheels, a wall-pressing mechanism, and a multi-axial differential gear mechanism, with the wheel arms spaced \(120^{\circ }\) apart. The robot can be driven by a single motor and passes through an elbow pipe without additional control effort while all three wheels remain powered. Please refer to [19] for the details of the robot.

Fig. 19 Side PSD sensor data in the T-branch

Fig. 20 Side PSD sensor data in the miter

Fig. 21 Side PSD sensor data in the straight pipeline

Fig. 22 Results of the integrated system at the elbow

Fig. 23 Results of the integrated system at branch type II

Fig. 24 Results of the integrated system at branch type I

Table 3 Distance value (original pose)
Table 4 Distance value (rotated \(20.0^{\circ }\))
Table 5 Result of searching for the hole direction
Table 6 Distance value in the elbow pipe

7.2 Distinction of the T-branch and the miter

During the experiments, the robot with the sensor suite drove through the pipeline, with each element connected at the end of the pipeline. The PSD sensor data were stored while the robot moved, and the element-detection method was performed. When the sensor suite reached the pipe element, the robot was stopped and the result of the detection method was checked using the front PSD sensor. Figures 19 and 20 show the experimental results, in which the third and seventh PSD sensors were placed in the hole direction. As expected, the ADC data of the PSD sensor decreased slowly in the T-branch and changed dramatically in the miter. To distinguish the T-branch from the miter, the current value of the PSD sensor facing the hole direction was compared with the previous value. In the miter, changes of more than 50 ADC counts were detected; in other words, the pipe was considered a miter if the ADC value changed by more than 50 between readings. The variation in the ADC value was used for comparison instead of the distance value, which made the method simpler and its results more consistent: the method used only simple subtraction, so the processing speed was extremely fast. We can thus distinguish the T-branch from the miter pipe by using the absolute value of the variation. Accordingly, we tested the T-branch and the miter twenty times each to confirm whether this method works well, and we obtained consistent results, which validated the proposed method.

7.3 Recognition of the pipeline geometry

Figure 21 confirms that the result for the straight pipeline was similar to the expected graph. The distance data measured by the side PSD sensors were similar to each other, and each signal showed only small oscillations caused by noise as the sensor suite moved forward.

In the experiments for branch recognition, the robot's original pose was measured using the IMU sensor (Xsens Inc.), which was read when the robot reached the center of the T-branch. When the data from a PSD sensor were very low or zero, the distance value was taken as 340 mm (beyond the sensor range). The hole direction was considered near a PSD sensor's direction if its distance value exceeded 230 mm. In the original pose, the directions of the third and seventh PSD sensors were aligned with the hole direction. Table 3 shows the distance values of the PSD sensors, and Table 4 lists the distance values after rotating the robot by approximately \(20^{\circ }\) counterclockwise.

We can find the hole direction from the first PSD sensor by substituting these data into Eqs. (2)–(5); Table 5 shows the results. The errors were approximately \(1^{\circ }\) to \(4^{\circ }\). The error sources were the noise of the PSD sensors and the robot's position, which affected the measured distances and hence the calculated angles. We used a Kalman filter on each PSD sensor's data to minimize this noise. The bottom of the T-branch was not flat, and the robot rotated slightly when it entered the T-branch, which generated additional errors. However, these errors were very small and negligible.
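The Kalman filtering mentioned above can be realized per sensor channel with a simple scalar filter. A minimal one-dimensional sketch; the constant-value model and the noise covariances q and r_n are illustrative assumptions, not the tuned values from the experiments:

```python
def kalman_1d(samples, q=1e-3, r_n=4.0):
    """Scalar Kalman filter for one PSD channel: a constant-value state
    model with process noise q and measurement noise r_n (both assumed)."""
    x, p = samples[0], 1.0        # initial state estimate and its variance
    filtered = []
    for z in samples:
        p += q                    # predict: variance grows by process noise
        k = p / (p + r_n)         # Kalman gain
        x += k * (z - x)          # update with measurement z
        p *= 1.0 - k
        filtered.append(x)
    return filtered


# Example: smoothing a noisy, roughly constant distance reading.
print(kalman_1d([150.0, 153.0, 148.0, 151.0, 149.0]))
```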

Table 6 lists the PSD sensor data for the elbow. The hole direction from the first PSD sensor can be determined by substituting these data into Eqs. (12)–(14). As a result, the direction angle was calculated to be \(230.7^{\circ }\), whereas the actual direction angle of the hole was \(230^{\circ }\), an error of \(0.7^{\circ }\). The repeated experiments exhibited errors between \(0^{\circ }\) and \(3^{\circ }\).

7.4 Integrated recognition system

The experiment for the integrated recognition system was performed in the 8-inch pipeline, and the system was tested in the elbow, branch type I, and branch type II. The integrated recognition system comprised the four steps executed as the sensor suite approached the pipeline elements. The recognition result of each step was output to a real-time video, and the PSD sensor values were also recorded. The results were as follows: in the case of the elbow and branch type II, both recognition methods indicated the exact pipeline element and hole direction (Figs. 22, 23). However, a recognition error occurred for branch type I. As shown in Fig. 24, the error appeared in step 2, a typical drawback of camera-based recognition caused by the failure to handle a reflective area. The PSD sensor distances were stored while traveling through the pipeline and are presented in the graph. Once the recognition of the pipeline elements using the PSD sensors was completed, the result could be re-confirmed by checking the recorded PSD distances.

8 Conclusions

This study proposed a method to recognize pipeline elements and their geometry by using a monocular camera and PSD sensors. The method is extremely light in computation and advantageous because it does not require any significant change to the existing hardware. It uses inexpensive PSD sensors and simple calculations to provide the information necessary for autonomous navigation inside pipelines. We plan to apply this method to newly designed multi-modular in-pipe robots in the future; the method will be embedded in a robot, and its effectiveness will be validated in a real environment.