1 Introduction

Navigation is the task of determining the pose (i.e., position and orientation) of a vehicle in space and time, a crucial capability for all kinds of field robots, including aerial, ground and marine robotic vehicles. Mobile robots generally carry a suite of proprioceptive motion sensors (e.g., odometers, inertial sensors). However, the measurements from these motion sensors must be integrated in time to determine the robot’s position and orientation, which results in unbounded drift errors. Vehicles operating in open outdoor environments can acquire position information from the Global Positioning System (GPS). GPS provides absolute position fixes, so drift errors due to dead reckoning can be effectively suppressed. However, GPS signals are neither available nor reliable in indoor environments. Indoors, beacons or pseudolite systems may be used for navigation and localization; however, their availability cannot always be assumed due to various limitations such as range and line-of-sight restrictions, the need for pre-deployment and the additional installation cost.

To deal with navigation for mobile robots in GPS-denied or GPS-restricted environments, various alternative navigation strategies have been proposed. One of the most common approaches is to use visual information for navigation and localization. A vision-based system is an attractive option for mobile vehicle applications, since the system can be easily implemented using low-cost cameras and can be used in both indoor and outdoor environments with no need for any external supporting infrastructure for localization. However, vision does not provide explicit depth information, which causes considerable difficulties in accurately perceiving the environment. Hence, vision-based navigation in unstructured environments still remains a challenging and active research area.

Fortunately, grid line patterns are commonly found (or can be easily provided if necessary) on the surface of floors or ceilings in many indoor environments. These grid patterns are generally either parallel to or orthogonally intersecting each other, as shown in Fig. 1. If these grid lines are detected and properly classified, the geometric information provided by the grid line pattern can be used to localize the vehicle’s position and also correct its heading with respect to the grid line structure.

Fig. 1
figure 1

Structured surface environments with tile grid patterns

There exists an extensive body of literature on vision-based navigation addressing relative localization with respect to surrounding features [1–6]. The progress made in vision-based navigation and localization for mobile robots has been summarized in many survey papers [7–10]. More recently, vision-based navigation using RGBD (red, green, blue and depth) sensors with depth measurement capability (e.g., Microsoft Kinect) was studied in [11–13].

Grid-based navigation has also been addressed in the literature [14–16], where grid patterns are used as deterministic information. Buschmann et al. [14] and Yean et al. [15] reported the position output as a discrete line count, perceiving the grid lines with two light sensors and a line scan camera, respectively. In [16], the type of each line-crossing event is identified by three sensors together with a discrete rule table derived under the assumption that the robot’s heading angle is always precisely known. In [17], Mächler suggested odometry correction based on grid lines by introducing a position probability distribution (PPD) model and showed the feasibility of using grid lines for navigation; that study pointed out that the angular deviation is more important and more difficult to handle than the forward motion deviation. However, none of the above works discusses a probabilistic treatment of the line-crossing event or of the heading angle with respect to the grid lines.

More recently, the authors of this paper proposed a probabilistic navigation approach using grid lines with no need for pre-deployed beacons in [18], which is a preliminary version of this work. In the present study, the probabilistic measurement model and the experimental results have been extensively improved, and a more detailed discussion is presented.

In grid-based navigation, all the grid lines look almost identical to each other, which requires deciding among multiple hypotheses. This causes difficulties for Gaussian filter algorithms such as the Kalman filter or the information filter, since they approximate uncertainties with unimodal Gaussian distributions. In addition, the motion model of a wheeled mobile vehicle is inherently nonlinear, which leads to a nonlinear estimation problem.

To take into account the aforementioned issues, this study proposes a probabilistic navigation approach considering the vehicle kinematics model and the repetitive nature of grid line patterns in the context of particle filtering (i.e., nonlinear, non-Gaussian, multiple-hypothesis filtering). The particle filter (PF) algorithm does not require the simplifying assumptions of linear models and unimodal Gaussian distributions, and thus it provides a suitable filtering framework to deal with nonlinear estimation problems with multiple hypotheses.

In this study, we mainly focus on developing a probabilistic measurement model for drift-free navigation of a mobile robot in a grid-structured environment considering the non-Gaussian nature of grid sensing with the assumption that the grid size (grid spacing) has a fixed and known value. However, technically, the proposed approach can be formulated in the context of simultaneous localization and mapping (SLAM) by introducing the grid spacing as unknown parameters into the navigation filter.

Grid detection is assumed to be a binary measurement process, meaning that the sensor (e.g., a down-looking camera) reports whether or not a grid line is present. In addition, the relative angle between the grid structure and the camera’s orientation is found through image processing. A navigation filter fusing the vehicle’s motion and grid line measurements is implemented, and its performance and real-time capability are demonstrated through an experiment with a wheeled mobile robot.

2 Problem statement

For a robot moving on a surface with tile grid patterns, navigation performance can be improved by incorporating the geometric information of the grid lines. For example, when the robot detects a line, its position and heading angle can be corrected based on the grid line structure, under the assumption that the detected grid lines are spaced at known distances and are either parallel or orthogonally intersecting each other.

To formulate the present navigation problem into an online estimation problem, the mathematical models of the system dynamics and the measurement process need to be determined. Their state-space forms are described in the following.

2.1 System dynamics model

The proposed navigation approach assumes that a wheeled mobile robot moves on the floor surface by controlling its longitudinal velocity and heading rate. The kinematic motion model of the robot can be expressed as:

$$\begin{aligned} \dot{{\mathbf {x}}}=\left[ \begin{array}{c} \dot{x}\\ \dot{y}\\ \dot{\psi } \end{array}\right] =\left[ \begin{array}{c} V\cos \psi \\ V\sin \psi \\ r \end{array}\right] +{\mathbf {w}}. \end{aligned}$$
(1)

Here, x and y are the position coordinates of the robot in the horizontal plane, and \(\psi \) is the heading angle. V and r are the longitudinal velocity and the heading rate, which constitute the control input vector \({{\mathbf {u}}} = [\; V \quad r \;]^T\). The process noise \({\mathbf {w}}\) is assumed to be zero-mean Gaussian distributed. The coordinate system of the mobile robot is shown in Fig. 2.
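As a concrete illustration, the kinematic model (1) can be propagated with simple Euler integration. The sketch below is ours, not part of the original implementation; the step size and noise magnitudes are placeholders rather than values used in the experiment.

```python
import math
import random

def propagate(state, v, r, dt, noise_std=(0.0, 0.0, 0.0)):
    """One Euler step of Eq. (1): x' = V cos(psi), y' = V sin(psi), psi' = r,
    with additive zero-mean Gaussian process noise w."""
    x, y, psi = state
    x += v * math.cos(psi) * dt + random.gauss(0.0, noise_std[0])
    y += v * math.sin(psi) * dt + random.gauss(0.0, noise_std[1])
    psi += r * dt + random.gauss(0.0, noise_std[2])
    return (x, y, psi)
```

In a particle filter this step is applied to every particle with independently sampled noise, which spreads the particle cloud according to the motion uncertainty.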

Fig. 2
figure 2

The coordinate system of a mobile robot moving on a flat surface

2.2 Measurement model

If a grid line is detected by a down-looking camera, it indicates that the robot is currently located somewhere on the grid lines. In addition, under the assumption that all the lines are either parallel or perpendicularly intersecting, the robot’s heading angle can be corrected using the direction of the detected grid line. The orientation of the detected line in the camera’s image represents the relative heading of the robot with respect to the grid structure.

The measurement equation is shown in (2).

$$\begin{aligned} {\mathbf {z}}=\left[ \begin{array}{c} z_{x}\\ z_{y}\\ z_{\psi } \end{array}\right] +{\mathbf {v}} \end{aligned}$$
(2)

where \(z_{x}\) and \(z_{y}\) denote the position measurements and \(z_{\psi }\) is the heading angle measurement. The measurement noise \({\mathbf {v}}\) is assumed to be zero-mean Gaussian. This study assumes that these measurements become available when the robot detects a grid line. Specifically, a down-looking camera provides a sequence of images of the floor surface, and online image processing extracts the presence of a grid line and the angle between the robot’s heading and the detected grid line structure, which are then used to determine the robot’s pose.

However, the grid lines are not distinctively labeled and look nearly identical. The observation (or belief) that the robot is currently on one of many grid lines is not enough to determine the robot’s position within the entire grid structure. Moreover, if the robot’s heading is not accurately known, it may even be difficult to determine whether the detected grid line runs in the x direction or in the y direction of the grid structure. Therefore, multi-modal probabilistic distributions are used to represent the probability distribution of the robot’s position and heading angle.

3 Navigation filter implementation

This study aims to enhance navigation performance using grid line measurements, which provide information on the position and heading angle of the robot with respect to the grid line structure. The vehicle kinematics model is nonlinear, which requires nonlinear filtering techniques. Moreover, the grid line structure has repetitive patterns, and the grid lines are essentially identical and indistinguishable from one another. Therefore, multi-modal probabilistic distributions must be considered in the grid-based navigation problem. Conventional Gaussian nonlinear filtering techniques (e.g., the extended Kalman filter, the extended information filter) represent the uncertainties in the system using unimodal probabilistic distributions, and they are not effective for estimation problems involving multiple hypotheses.

This study proposes employing a PF for online navigation using grid line patterns. The PF algorithm is known to be robust and effective in dealing with nonlinearity and multiple hypotheses. Particle filtering represents the system’s uncertainty distributions using a cloud of particles, each of which encodes its own hypothesis on the robot’s pose. Thus, the PF algorithm provides a suitable filtering framework for the proposed grid-based navigation problem, in which multiple hypotheses need to be considered.

The proposed probabilistic model with measurement uncertainty involves the joint probability density function (pdf) of the position and relative heading measurements obtained through grid line detection. Whether the detected grid line is in the x direction or in the y direction cannot be determined without knowing the robot’s heading, which leads to a probabilistic dependency of the position measurement on the heading measurement. The proposed measurement update is therefore a two-step scheme: 1) update the robot’s heading estimate using the relative angle of the detected grid line in the image, and then 2) update the robot’s xy position estimate given the estimated heading angle. The measurement model is represented as the product of three conditional pdfs.

$$\begin{aligned} p({{\mathbf {z}}}\,|\,{{\mathbf {x}}}) = p(z_{x}\,| \, z_{\psi },{{\mathbf {x}}}) \; p(z_{y}\,| \, z_{\psi },{{\mathbf {x}}}) \; p(z_{\psi }\,|\,{{\mathbf {x}}}) \end{aligned}$$
(3)

where \(z_x\), \(z_y\) and \(z_{\psi }\) denote measurement events associated with the motion states in the subscript. Note that \(z_x\) and \(z_y\) are assumed to be conditionally independent given \(z_{\psi }\) and \({\mathbf {x}}\), and that these position measurements and \(z_{\psi }\) are assumed to be conditionally independent given \({\mathbf {x}}\). Algorithm 1 shows a pseudocode of the proposed grid-based navigation algorithm using a PF.

figure a

An important part of the algorithm implementation is the likelihood modeling, i.e., the probability of a measurement \({\mathbf {z}}\) given each particle state, which determines the weight of each particle when computing the filter’s posterior belief. The sensor model in the algorithm is the combination of the position measurement model and the heading measurement model, whose detailed construction is described in the following.
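A minimal sketch of the weight-update and resampling steps of Algorithm 1 is given below. The sensor model is passed in as a callable, and systematic resampling is one common choice; the paper does not specify the resampling scheme, so this is an assumption for illustration.

```python
import random

def update_and_resample(particles, weights, likelihood, z):
    """Reweight each particle by p(z | x) and resample (systematic scheme).

    particles: list of states; weights: importance weights (sum to 1);
    likelihood: callable (z, state) -> sensor model value p(z | state).
    """
    n = len(particles)
    # Measurement update: scale each weight by the particle's likelihood.
    w = [wi * likelihood(z, p) for wi, p in zip(weights, particles)]
    total = sum(w)
    w = [wi / total for wi in w] if total > 0.0 else [1.0 / n] * n
    # Systematic resampling: one random offset, then evenly spaced picks.
    cumulative, c = [], 0.0
    for wi in w:
        c += wi
        cumulative.append(c)
    u0 = random.random() / n
    resampled, j = [], 0
    for i in range(n):
        u = u0 + i / n
        while j < n - 1 and cumulative[j] < u:
            j += 1
        resampled.append(particles[j])
    return resampled, [1.0 / n] * n
```

After resampling, the weights are reset to uniform, and particles consistent with the detected grid line survive while inconsistent hypotheses die out.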

3.1 Heading measurement model

Grid lines are either parallel to each other or intersect perpendicularly. Using this property, the robot’s heading can be updated from the orientation of the detected grid line in the captured image, which defines the relative heading of the robot with respect to the grid line structure. Figure 3 illustrates the case in which a line is detected while the robot is moving in the positive x-direction.

Fig. 3
figure 3

Definitions of relative angles between the robot and the grid line structure

Fig. 4
figure 4

Four cases of possible line-crossing configurations. a Case 1: The robot detects the y-grid line while moving in the positive x-direction. b Case 2: The robot detects the y-grid line while moving in the negative x-direction. c Case 3: The robot detects the x-grid line while moving in the positive y-direction. d Case 4: The robot detects the x-grid line while moving in the negative y-direction

Here, \(\psi \) is the robot’s heading angle relative to the reference frame, \({\psi }_g\) is the orientation of the grid line relative to the same reference frame, and \(\theta \) is the complementary angle of the angle between the heading and the detected grid line. For convenience, the reference frame is assumed to be aligned with the x-axis of the grid line structure (i.e., \({\psi }_g=0\)), which leads to \({\psi }={\pi }/2 - {\theta }\).

The intersecting angle can be defined differently depending on the robot’s heading angle and whether the line is parallel to the x-axis or to the y-axis. For this, four possible heading configurations are considered, based on which the likelihood of heading measurements is defined.

Fig. 5
figure 5

Heading angle pdfs of four line-crossing cases when the relative angle measurement of the detected grid line \(\theta = -25^\circ \)

Four different cases of line crossing are illustrated in Fig. 4. The relationship between the heading angle and the orientation of the detected grid line can be expressed as follows.

$$\begin{aligned} \psi = z_{\psi }^{i}=\left\{ \begin{array}{ll} -\theta &{} \quad \text {(Case 1)}\\ \pi -\theta &{} \quad \text {(Case 2)}\\ \pi /2-\theta &{} \quad \text {(Case 3)}\\ -\pi /2-\theta &{} \quad \text {(Case 4)} \end{array}\right. \end{aligned}$$
(4)
Fig. 6
figure 6

The probability functions associated with the type (either in the x or y direction) of the detected grid line vary with \(\alpha \). Note that \(p(c_{1} \cup c_{2}) + p(c_{3} \cup c_{4}) = 1\) for all \(\alpha \)

Fig. 7
figure 7

The procedure of determining the heading angle probability when \(\psi _{prior}=15^{\circ },\,\theta =-25^{\circ }\): a The probability distribution of the prior heading estimate, b the weights for four possible heading configurations, c the heading measurement probability distribution considering the determined weights, d the probability distribution of the updated heading estimate

The uncertainty of the orientation measurement is assumed to follow a Gaussian distribution. Then, the uncertainty of the robot’s heading for each case also follows a Gaussian distribution. The heading angle measurement probability can be expressed by

$$\begin{aligned} p_{c_{i}}(z_{\psi }\,|\,{{\mathbf {x}}})=\eta _{\psi }\exp \left\{ -\frac{(z_{\psi }-z_{\psi }^{i})^{2}}{2\sigma _{\psi }^{2}}\right\} ,\quad i=1,\ldots ,4 \end{aligned}$$
(5)

Here \(c_{i}\) denotes the line-crossing case, and \(z_\psi ^i\) is the corresponding reference heading angle (\(i = 1,\ldots ,4\)) determined by the given \(\theta \). \(\eta _\psi \) is the normalizing factor, and \(\sigma _\psi \) is the heading uncertainty parameter. The probabilities of the four heading cases define pdfs, each centered at the corresponding \(z_\psi ^i\), as shown in Fig. 5.

The combined probability of the heading angle measurement is represented as the weighted sum of the heading angle pdfs of all four cases.

$$\begin{aligned} p(z_{\psi }|{{\mathbf {x}}}) = [\, w_{c_1} \; w_{c_2} \; w_{c_3} \; w_{c_4} \,] \; \left[ \begin{array}{c} p_{c_1}({z_{\psi }}|{{\mathbf {x}}})\\ p_{c_2}({z_{\psi }}|{{\mathbf {x}}})\\ p_{c_3}({z_{\psi }}|{{\mathbf {x}}})\\ p_{c_4}({z_{\psi }}|{{\mathbf {x}}}) \end{array}\right] \end{aligned}$$
(6)

Four different weights are introduced: \(w_{c_1}\), \(w_{c_2}\), \(w_{c_3}\) and \(w_{c_4}\). These weights are determined using the orientation of the detected grid line. For example, if the detected grid line is parallel to the x-axis, the line-crossing case is likely to be either case 3 or case 4. Similarly, a grid line parallel to the y-axis indicates that it is likely to be either case 1 or case 2. The direction of the detected grid line can be estimated from the orientation of the grid line appearing in the image and the prior estimate of the robot’s heading.

The direction of the detected grid line is defined as

$$\begin{aligned} \alpha = \psi + \theta + \pi /2. \end{aligned}$$
(7)

According to (4) and (7), if \(\alpha \) is close to \((n+1/2)\pi \) where n is an integer, the probability that the detected line is in the y-direction increases. Likewise, if \(\alpha \) is close to \(n\pi \), the probability that the line is in the x-direction increases. The probability of the line-crossing cases given \(\alpha \) is expressed as:

$$\begin{aligned} p(c_{1} \cup c_{2})= & {} \frac{1}{2}\{1-\cos {2\alpha }\} \end{aligned}$$
(8)
$$\begin{aligned} p(c_{3} \cup c_{4})= & {} \frac{1}{2}\{1+\cos {2\alpha }\} . \end{aligned}$$
(9)

The change of probability varying with \(\alpha \) is modeled using cosine functions, and Fig. 6 shows these probability functions.

The four weights can be determined as follows.

$$\begin{aligned} \left( \begin{array}{c} w_{c_{1}}\\ w_{c_{2}}\\ w_{c_{3}}\\ w_{c_{4}} \end{array}\right) = \eta _c \left( \begin{array}{c} p(c_{1} \cup c_{2}\,;\alpha )\\ p(c_{1} \cup c_{2}\,;\alpha )\\ p(c_{3} \cup c_{4}\,;\alpha )\\ p(c_{3} \cup c_{4}\,;\alpha ) \end{array}\right) \end{aligned}$$
(10)

where \(\eta _c\) is the normalizer given by \( \eta _c = [\,2\{p(c_{1} \cup c_{2}) + p(c_{3} \cup c_{4})\}\,]^{-1} =0.5\).

Figure 7 shows an illustrative example of computing the heading measurement probability when the orientation of the detected line is \(-25^\circ \) and the prior heading estimate is \(15^\circ \) (\(\theta = -25^\circ \), \(\psi ^- = 15^\circ \)). For the corresponding \(\alpha = 80^\circ \), \(p(c_{1} \cup c_{2}) = 0.97\) and \(p(c_{3} \cup c_{4}) = 0.03\). By combining the resulting weights with the heading angle pdfs of Fig. 5 through (6), the heading angle pdf is updated as shown in Fig. 7d.
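The arithmetic of this example can be checked directly from (7)–(10); the function below is simply a transcription of those equations (the function name is ours).

```python
import math

def crossing_case_weights(psi_prior, theta):
    """Case weights from Eqs. (7)-(10), given the prior heading estimate
    and the relative angle theta of the detected line (both in radians)."""
    alpha = psi_prior + theta + math.pi / 2        # direction of the line, Eq. (7)
    p12 = 0.5 * (1.0 - math.cos(2.0 * alpha))      # p(c1 or c2), Eq. (8)
    p34 = 0.5 * (1.0 + math.cos(2.0 * alpha))      # p(c3 or c4), Eq. (9)
    eta = 1.0 / (2.0 * (p12 + p34))                # normalizer, Eq. (10): always 0.5
    return [eta * p12, eta * p12, eta * p34, eta * p34]
```

For \(\psi ^- = 15^\circ \) and \(\theta = -25^\circ \), this gives \(\alpha = 80^\circ \) and \(w_{c_1} + w_{c_2} = p(c_1 \cup c_2) \approx 0.97\), reproducing the numbers above.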

3.2 Position measurement model

Since all the grid lines look similar and cannot be distinguished one from the other, a multi-modal pdf is introduced to represent multiple hypotheses. Specifically, a mixture of Gaussians is used to model the sensing likelihood of the robot’s current position.

Equations (11) and (12) define the pdfs in the form of the mixture of Gaussian distributions, each of which represents the probability of detecting a grid line parallel either to the x-axis or to the y-axis in the given grid line structure.

$$\begin{aligned} f_x (z_x ; z_x^i)= \eta _x \left( \sum _{i} \exp {\left\{ -\frac{(z_x-z_x^i)^2}{2\sigma _x^2}\right\} } \right) \end{aligned}$$
(11)
$$\begin{aligned} f_y (z_y ; z_y^i)= \eta _y \left( \sum _{i} \exp {\left\{ -\frac{(z_y-z_y^i)^2}{2\sigma _y^2}\right\} } \right) \end{aligned}$$
(12)

where \(f_x (z_x ; z_x^i)\) and \(f_y (z_y ; z_y^i)\) are the functions of the measured grid line positions \(z_x\) and \(z_y\), parametrized by their reference positions \(z_x^i\) and \(z_y^i\). \(\eta _x\) and \(\eta _y\) are the normalizing factors, and \(\sigma _x\) and \(\sigma _y\) represent the measurement uncertainty parameters.
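As a sketch, the mixture in (11) can be evaluated by summing one Gaussian mode per grid line. The reference positions \(z_x^i = i \cdot d\) for a spacing d, the truncation to a finite number of lines and the omitted normalizer are our simplifications for illustration.

```python
import math

def grid_line_likelihood(z, spacing, sigma, n_lines=50):
    """Unnormalized mixture-of-Gaussians likelihood of Eq. (11)/(12):
    one mode at every grid line position i * spacing, shared sigma."""
    return sum(
        math.exp(-(z - i * spacing) ** 2 / (2.0 * sigma ** 2))
        for i in range(-n_lines, n_lines + 1)
    )
```

The function is (approximately) periodic in the grid spacing, which is precisely why a single line detection cannot disambiguate between cells and a multi-hypothesis filter is required.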

The grid lines in the y-direction are separated along the x-direction, and thus the detection of a y-direction grid line provides information on the x-position of the robot. Likewise, an x-direction grid line provides position information in the y-direction. Considering the measurement uncertainty properties, the position measurement model combines two probability distributions: one for the probability of sensing a grid line and one for erroneous or random measurements. For example, the probability distribution of the x-position measurement is represented by combining the two probability density functions shown in Fig. 8.

Fig. 8
figure 8

The probability of sensing a grid line is modeled using a mixture-of-Gaussians model whose peaks are separated by the grid line spacing in (a). The probability of erroneous or false sensing is represented as a function uniformly distributed over the spacing in (b). a \(p_{grid}(z_{x}|{{\mathbf {x}}})\). b \(p_{rand}(z_{x}|{{\mathbf {x}}})\)

These two probabilities can be expressed as

$$\begin{aligned} p_{grid}(z_{x}\,|\,{{\mathbf {x}}})=f_{x}(z_{x}\,;z_{x}^{i}) \end{aligned}$$
(13)
$$\begin{aligned} p_{rand}(z_{x}\,|\,{{\mathbf {x}}})=\frac{1}{d_{x}} \end{aligned}$$
(14)

where \(d_{x}\) denotes the grid spacing in the x-direction.

The resulting x-position measurement probability is represented by a weighted sum of \(p_{grid}(z_{x}|{{\mathbf {x}}})\) and \(p_{rand}(z_{x}|{{\mathbf {x}}})\).

$$\begin{aligned} p\left( z_{x}|z_{\psi },{{\mathbf {x}}}\right) = \left[ \, w_{x} \quad (1-w_{x}) \,\right] \; \left[ \begin{array}{c} p_{grid}\left( z_{x}|{{\mathbf {x}}}\right) \\ p_{rand}\left( z_{x}|{{\mathbf {x}}}\right) \end{array}\right] \end{aligned}$$
(15)

where \(w_{x} = p(c_{1} \cup c_{2})\). The weight of \(p_{grid}(z_{x}|{{\mathbf {x}}})\) is determined using the orientation of the detected grid line defined in (8).

Similarly, the y-position measurement probability is represented as follows.

$$\begin{aligned} p(z_{y}|z_{\psi },{{\mathbf {x}}}) = [\, w_{y} \quad (1-w_{y}) \,] \; \left[ \begin{array}{c} p_{grid}(z_{y}|{{\mathbf {x}}})\\ p_{rand}(z_{y}|{{\mathbf {x}}}) \end{array}\right] \end{aligned}$$
(16)

where \(w_{y} = p(c_{3} \cup c_{4})\).
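Putting the pieces together, the position likelihood (15)/(16) is a weighted sum of the grid-line mixture and a uniform floor. The helper below is a sketch with the uniform density written as 1/spacing, following the description of Fig. 8b; the function and parameter names are ours.

```python
def position_likelihood(z, w_grid, grid_pdf, spacing):
    """Eq. (15)/(16): w * p_grid(z | x) + (1 - w) * p_rand(z | x),
    where p_rand is uniform over one grid spacing (Fig. 8b)."""
    p_rand = 1.0 / spacing
    return w_grid * grid_pdf(z) + (1.0 - w_grid) * p_rand
```

The uniform component keeps the likelihood strictly positive everywhere, so a spurious line detection cannot drive any particle's weight to zero.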

Additionally, the proposed algorithm can be extended to non-uniform grid patterns, provided that the geometric information of the patterns is known in advance. Non-uniform grid patterns can also serve as navigation measurements by modifying the heading and position measurement models according to the pattern. For the heading measurement model, the orientation of the given grid line can be incorporated into Eq. (4) (e.g., \({\psi }={\pi }/2 - {\theta } + {\psi _g}\)). A varying grid size can be accommodated in the position measurement model by redesigning \(z_x^i\) and \(z_y^i\) in Eqs. (11) and (12).

4 Experimental result

Experimental tests in an indoor environment using a wheeled mobile robot were performed to demonstrate the feasibility of the proposed grid-based navigation algorithm. A mobile robot (a Kobuki by Yujin Robotics, the base platform of the TurtleBot 2) equipped with odometer and gyro sensors was used for the experimental validation.

To detect grid patterns on the floor surface, a down-looking camera and a light source were installed on the robot. The light source maintained constant illumination for consistent grid line detection performance. The robot is also equipped with an RGBD sensor (a Microsoft Kinect), and the depth measurement data were stored during the test runs. However, the depth measurements were used only for visualizing the navigation performance by constructing an occupancy grid map of the test environment in a post-processing step, not for online navigation. Figure 9 shows the robot platform, and Table 1 lists the system and algorithm settings for the experiment.

The experiment was performed in an indoor hallway environment. The hallway stretches approximately 60 meters with varying widths in different sections. The floor plan and some pictures of the environment are shown in Fig. 10.

The seam lines between the tiles on the floor surface provide the grid line patterns that can be utilized by the robot for grid-based navigation. The size of each grid cell is 0.95 m \(\times \) 1.05 m. For reliable line detection from the obtained camera image, the Hough transform algorithm was used [19]. Figure 11 shows a line crossing situation during the actual indoor experiment. The image processing for line detection was performed online, and the algorithm is capable of identifying two intersecting grid lines around grid crossing points.
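To illustrate the idea behind the Hough transform used for line detection [19], the toy voting scheme below works directly on a list of edge-pixel coordinates; a real implementation would use an optimized library routine (e.g., OpenCV), and the bin resolutions here are arbitrary choices of ours.

```python
import math

def hough_line(edge_pixels, theta_steps=180, rho_res=1.0, rho_max=100.0):
    """Each edge pixel (x, y) votes for all parameter pairs satisfying
    rho = x*cos(theta) + y*sin(theta); the most-voted bin is the detected
    line, and its theta gives the line orientation in the camera image."""
    n_rho = int(2 * rho_max / rho_res) + 1
    votes = {}
    for ti in range(theta_steps):
        theta = ti * math.pi / theta_steps       # theta in [0, pi)
        c, s = math.cos(theta), math.sin(theta)
        for x, y in edge_pixels:
            ri = int(round((x * c + y * s + rho_max) / rho_res))
            if 0 <= ri < n_rho:
                votes[(ti, ri)] = votes.get((ti, ri), 0) + 1
    (ti, ri), _ = max(votes.items(), key=lambda kv: kv[1])
    return ti * math.pi / theta_steps, ri * rho_res - rho_max
```

In the grid-based filter, the recovered theta plays the role of the relative angle of the detected grid line with respect to the robot's heading.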

Fig. 9
figure 9

Mobile robot platform (Turtlebot2 by Yujin Robotics)

Table 1 System and algorithm settings
Fig. 10
figure 10

The hallway environment where the indoor navigation experiment was performed. a The reference robot trajectory with the camera viewpoints. b The camera images of the test environments from the viewpoints

Fig. 11
figure 11

Line detection using a camera image captured by the down-looking camera

Fig. 12
figure 12

Comparison between the trajectory by dead reckoning (dotted) and the trajectory by the proposed grid-based navigation (solid)

Fig. 13
figure 13

Position estimation and distance error comparison between dead reckoning (dotted) and the proposed grid-based navigation (solid) with respect to the reference trajectory (dashed). a Position x. b Position y. c Distance error

The robot was driven to the end of the hallway and back, following the walls on the side. The dead-reckoning trajectory was obtained using odometry and heading sensor measurements. Then, in addition to these motion sensor measurements, the proposed algorithm incorporates grid information into the navigation filter for grid-based pose correction. The resulting trajectories are compared in Fig. 12.

Figure 12 shows that the dead-reckoning trajectory is considerably distorted as drift errors grow in time. The mobile robot was driven to come back to the start point after traversing the hallway. However, the start point and the goal point, which are supposed to be close to each other, show a large deviation in position.

On the other hand, the proposed grid-based navigation provided a satisfactory result. The position and heading angle fixes obtained from the grid lines effectively prevented the error from growing excessively large, which leads to a successful loop closure.

The time trajectories of the estimated position and distance error are compared with the reference trajectory in Fig. 13. As shown in the figure, the error in the dead reckoning increases rapidly, whereas the error in the proposed navigation remains bounded in time.

Fig. 14
figure 14

Reconstructed 2D maps of the test environment. a Dead reckoning. b The proposed algorithm

For a clearer comparison, occupancy grid maps of the indoor environment were created using the estimated robot trajectories and the depth measurements of the surrounding walls provided by the RGBD sensor mounted on the mobile robot. Figure 14a shows the map obtained by dead reckoning, and Fig. 14b shows the map obtained by the proposed grid-based navigation. The results clearly demonstrate the performance improvement achieved by the proposed approach in terms of both navigation and mapping.
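As a simplified illustration of this post-processing map construction, each depth return can be transformed by the estimated pose and binned into a grid cell. The hit-count sketch below is ours and stands in for the (unspecified) occupancy grid mapping method actually used.

```python
import math

def mark_hits(grid, pose, ranges, bearings, cell=0.1):
    """Transform each (range, bearing) return from the robot frame into
    world coordinates using the pose estimate (x, y, psi), then count a
    hit in the corresponding grid cell of size `cell` (meters)."""
    x, y, psi = pose
    for rng, ang in zip(ranges, bearings):
        wx = x + rng * math.cos(psi + ang)
        wy = y + rng * math.sin(psi + ang)
        key = (math.floor(wx / cell), math.floor(wy / cell))
        grid[key] = grid.get(key, 0) + 1
    return grid
```

Because the same wall is observed repeatedly along the trajectory, pose drift smears the hit counts; a bounded-error trajectory like the one in Fig. 12 yields correspondingly crisper walls.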

5 Conclusion

This study introduced a probabilistic indoor navigation approach for a wheeled mobile robot using grid line information available in a structured indoor environment. Experimental validation demonstrated the feasibility of improving navigation performance with this approach.

The measurement probability models were newly developed, and the use of a PF was proposed to effectively handle multi-modal probabilistic distributions, since all the lines in a grid structure are essentially identical and thus indistinguishable. After the formulation, navigation test runs using a wheeled mobile robot in an indoor environment were carried out. The results show that the use of grid information with the particle filter successfully enhanced the overall navigation performance.

This study assumes that the grid map structure is given or that the spacing of the grid structure is known. However, this assumption is not strictly required, and the lack of map information does not limit the application of the proposed grid-based navigation approach. When navigating in an environment with grid lines, the grid spacing can be estimated by introducing the spacing parameter into the filter as an additional state. Unknown grid patterns can also be used to enhance navigation performance in the SLAM framework by generating the grid map and localizing the vehicle with respect to the map being built simultaneously. Further research may be directed towards extending the present grid-based work from the perspective of SLAM.