## Abstract

Among recent research in object localization, 3D object localization accounts for the largest share due to its importance in daily life, with applications such as collision avoidance, robotic guidance and vision, and modeling the topography of object surfaces. This study presents a novel localization algorithm and system design that uses a low-resolution 2D ultrasonic sensor array for real-time 3D object localization. The developed algorithm operates on the three sensors with the minimum measured distances in each acquired sample; it was tested on objects at different locations in 3D space and validated with an acceptable level of precision and accuracy. The Polytope Faces Pursuit (PFP) algorithm is used to find an approximate sparse solution for the object location from the three minimum measured distances. The proposed system successfully localizes objects at different positions with average errors of ±1.4 mm, ±1.8 mm, and ±3.7 mm in the *x*-, *y*-, and *z*-directions, respectively, which are considered low error rates.

## 1 Introduction

Over the past decade, localization techniques and algorithms have attracted researchers due to their importance in daily life, from object mapping, obstacle detection and localization in 3D space, and object topography, to collision avoidance for modern vehicle navigation systems. Ultrasonic sensors are widely used as cheap and accurate sensors for non-contact distance measurement in many fields of research and application. Their low price, good sensing range, robustness against environmental conditions, and usability even with transparent objects make them an attractive alternative to optical sensors [1–3]. In addition, object localization and tracking in 3D space are used in robot and smart-vehicle navigation systems, alongside computer vision systems, to increase the accuracy and reliability of the overall system [3]. Modern localization systems use an array (grid) of sensors rather than single or dual sensors to increase sensitivity, accuracy, reliability, and robustness [4, 5].

The ultrasonic sensor array is scanned continuously at a suitable frame rate; in each scanned frame the distances are calculated and the localization algorithm is applied. In each frame, the ultrasonic sensors are triggered, the ultrasound waves are sent, and the sensors wait for echoes to calculate the distances [6, 7]. Continuous localization of real objects requires the developer to create a grid with known coordinates as a reference for accurate localization. The grid coordinates of the ultrasonic sensors determine the actual resolution of the system and therefore the final location accuracy of the detected objects. Usually, the origin of the grid coordinate system is placed at the upper-left corner of the grid, with the first coordinate indexed as (1, 1) or (0, 0), and the increment between rows and columns is defined by the resolution of the grid, which is usually the distance between sensor centers. Figure 1 shows the global coordinate system for the sensor grid [8–11].

This paper presents a novel real-time 3D localization system and algorithm for objects using a 2D array of ultrasonic sensors. The main contribution of our study is a new robust, accurate, and low-cost method for 3D localization using a low-resolution 2D sensor array.

### 1.1 Paper structure

The paper is structured as follows. In Section 2 we provide a literature review about the related works in the study area. Section 3 describes the proposed system configuration and Section 4 deals with the proposed localization algorithm based on distance measurement. The experimental evaluations and results are discussed in detail in Section 5, and Section 6 summarizes the main conclusions and contributions of the paper and provides suggested ideas for future work.

## 2 Review of literature

H.-S. Kim et al. proposed an enhanced beacon system based on ultrasonic sensors for indoor localization; the system can also be used to estimate the direction of a mobile robot. It uses two ultrasonic sensors operating at two different frequencies, with an enhanced algorithm based on a Kalman filter to improve the localization [12]. V. Kunin et al. designed a sensor array for localization with FPGA data acquisition. They built an anechoic chamber to keep the experimental tests free of external noise and reverberation echoes. The system is used for object localization and direction-of-arrival estimation with different sensor geometries [13].

A genetic algorithm for localization using ultrasonic sensors was developed by L. Moreno et al. and is aimed at mobile robot navigation. The algorithm is based on iterative non-linear filtering, which tries to match the currently observed beacon geometries against an a-priori map of beacon locations in order to find the correct current position and orientation of the robot. Another system design for robot navigation based on an ultrasonic sensor array was proposed by S. Kim et al. The system uses an array of two or more ultrasonic sensors, and the direction and orientation of the robot are calculated from distance measurements and the signals received from a beacon [14, 15].

Instead of using a genetic algorithm or a Kalman filter, B. Ilias et al. developed a hybrid system that uses NWA and KNN methods for object localization. The system uses a homogeneous ultrasonic sensor array; the results show a localization accuracy of 80 to 90% for a basic wall shape and 78% for a real laboratory environment [16]. W.-Y. Mu et al. developed a novel ultrasonic ranging and scanning algorithm for omnidirectional localization. Unlike conventional algorithms, it continuously calculates the relative position of the ultrasonic sensor to improve accuracy even on an inclined plate. The proposed system successfully localizes objects with ±3.33 mm error for frontal sensors, ±6.21 mm for lateral sensors, and ±0.20° error for posture [17].

A vehicle autonomous localization method for local areas of coal mine tunnels was proposed by Z. Xu et al.; the system is based on vision sensors and an ultrasonic system for guiding the vehicle. The vision sensors read UPC-A barcodes, while the ultrasonic sensors guide the vehicle through the tunnel by measuring the distance between the vehicle and the walls and obstacles. The system's global coordinate reference, the upper-left inner corner of each frame, contains a barcode tag [18]. Instead of conventional ultrasonic sensors, W. Seo and K.-R. Baek used an ultrasonic anemometer with an inertial measurement unit (IMU) to estimate the position of a mobile robot. They proposed a new approach and equations for using ultrasonic anemometer sensors that decrease the resulting error. The approach collects velocity data from the ultrasonic anemometer and acceleration and angular-velocity data from the IMU, then applies a Kalman filter to estimate the mobile robot's position [19].

J. Steckel and H. Peremans proposed closed-loop control of a mobile robot using a 3D sonar sensor system; it uses a layered control architecture consisting of multiple parallel, loosely coupled, reactive control laws. They provide solutions for tasks such as collision and obstacle avoidance. Simulation results show that the system succeeds in guiding the mobile robot with high accuracy and robustness [20].

## 3 System configuration

### 3.1 Basic concept

The proposed system depends fundamentally on the ultrasonic sensor's directional response: an ultrasonic sensor uses a transmitted sound wave and its echo to determine the distance of the object ahead of it. The sensor used is the HC-SR04, which covers a range from 2 to 400 cm, uses separate trigger and echo channels, and has a beam angle of 30°, as shown in Fig. 2.
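For context on how such a sensor turns an echo into a distance: the HC-SR04 reports range as the width of its echo pulse, and the one-way distance is half the round-trip time multiplied by the speed of sound (≈343 m/s in air at about 20 °C). The helper below is an illustrative sketch, not the paper's firmware:

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s in air at ~20 degrees C

def echo_to_distance_cm(echo_pulse_us: float) -> float:
    """Convert an HC-SR04 echo pulse width (microseconds) to distance (cm).

    The pulse measures the round trip to the object and back, so the
    one-way distance is half the travel time times the speed of sound.
    """
    return echo_pulse_us * SPEED_OF_SOUND_CM_PER_US / 2.0

# Example: an echo pulse of about 583 us corresponds to roughly 10 cm.
```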

The proposed system contains 27 equidistant ultrasonic sensors arranged in 9 rows and 3 columns, covering a 40 cm × 40 cm plate. This configuration, shown in Fig. 3, enables the system to localize an object in 3D space with a low error rate. The distance between column centers is 10 cm and the distance between row centers is 2.6 cm. The synchronous transceivers first send signals and then receive the echoes reflected from the object to measure the distances, from which the 3D location is obtained. The 3D calculation algorithm was written in the MATLAB® scientific programming language.
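The grid geometry above translates directly into sensor-center coordinates. A small sketch (indexing from the upper-left origin described in Section 1 is an assumption of this snippet):

```python
ROW_SPACING_CM = 2.6   # distance between row centers (Section 3.1)
COL_SPACING_CM = 10.0  # distance between column centers

def sensor_centers(rows: int = 9, cols: int = 3):
    """Return the (x, y) center of every sensor in cm, keyed by
    (row, col) index counted from the upper-left corner of the grid."""
    return {
        (r, c): (c * COL_SPACING_CM, r * ROW_SPACING_CM)
        for r in range(rows)
        for c in range(cols)
    }

centers = sensor_centers()
# Sensor (0, 0) sits at the grid origin; sensor (8, 2) at (20.0, 20.8).
```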

## 4 Configuration of hardware

The 27 transceiver sensor modules are of the HC-SR04 type from Elecfreaks, with a measurement range of up to 4 m and low power consumption. The sensors are connected to MATLAB via an Arduino Mega 2560 microcontroller board, an Atmel AVR®-based open-source board programmable with the Arduino integrated development environment (IDE); the Mega 2560 is one of the widely used advanced Arduino boards. This board was used as an interface, shown in Fig. 4, to calculate the object distances, which are in turn sent to the MATLAB program to calculate the object position using the proposed algorithm. The Arduino board is connected to the computer through a USB cable. The sensors are fixed on a wooden plate positioned facing the space in which the object is detected.

### 4.1 System algorithm

In this section, we discuss the algorithm of the proposed system. The system starts by calculating the distance between each ultrasonic sensor in the array and the object, builds a 2D matrix of distance values, finds the three minimum distance readings in the sensor array, and finally applies the localization algorithm to estimate the object's location. Figure 5 shows the algorithm diagram.
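The four steps above can be sketched as a per-frame pipeline. This is an illustrative Python sketch (the paper's implementation is in MATLAB); the `locate` callback stands in for the localization algorithm of Section 4.3, and its signature here is an assumption:

```python
ROWS, COLS = 9, 3

def localization_pipeline(readings, sensor_xy, locate):
    """Per-frame pipeline sketch: arrange the 27 readings as a 9x3
    matrix, pick the three sensors with the smallest distances, and
    hand their coordinates and readings to the localization step."""
    assert len(readings) == ROWS * COLS
    # Steps 1-2: the 9x3 distance matrix (row-major).
    matrix = [readings[r * COLS:(r + 1) * COLS] for r in range(ROWS)]
    # Step 3: (row, col) indices of the three minimum distances.
    order = sorted(range(len(readings)), key=readings.__getitem__)[:3]
    picks = [(i // COLS, i % COLS) for i in order]
    # Step 4: localize from the three closest sensors.
    return locate([(sensor_xy[p], matrix[p[0]][p[1]]) for p in picks])
```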

### 4.2 Distances array

The 27 ultrasonic sensors connected to the Arduino board are triggered to measure the distance between each sensor and the object; each sensor's range is 0 to 400 cm. The distance measurements are sent to MATLAB over the Arduino board's serial interface as a 1D array of 27 values. Finally, MATLAB runs the object localization algorithm described in Section 4.3 below.
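The exact serial framing between the Arduino sketch and MATLAB is not given here; assuming one comma-separated line of 27 centimetre readings per frame, a parsing sketch (in Python, standing in for the MATLAB side) could look like:

```python
def parse_frame(line: str, n_sensors: int = 27):
    """Parse one serial frame into a list of 27 distances in cm.

    Readings outside the HC-SR04's 2-400 cm range are treated as
    'no echo' and clamped to the maximum range.
    """
    values = [float(tok) for tok in line.strip().split(",")]
    if len(values) != n_sensors:
        raise ValueError(f"expected {n_sensors} readings, got {len(values)}")
    return [v if 2.0 <= v <= 400.0 else 400.0 for v in values]
```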

### 4.3 Object localization algorithm

The system algorithm starts by converting the received 27 distance measurements into a 9 × 3 matrix, as shown in Eq. (1), which is used in all subsequent calculations to localize an object with coordinates (*x*, *y*, *z*):

$$D=\begin{bmatrix}d_{1,1} & d_{1,2} & d_{1,3}\\ \vdots & \vdots & \vdots \\ d_{9,1} & d_{9,2} & d_{9,3}\end{bmatrix} \quad (1)$$

The second step is to find the (*x*, *y*) indices of the three minimum distance values in the distance matrix above.

From the geometrical setup of the three sensors and the object, shown in Fig. 6, the corresponding calculated distances are given as

$$d_{i}^{2}=(x-X_{i})^{2}+(y-Y_{i})^{2}+z^{2}$$

$$d_{j}^{2}=(x-X_{j})^{2}+(y-Y_{j})^{2}+z^{2}$$

$$d_{k}^{2}=(x-X_{k})^{2}+(y-Y_{k})^{2}+z^{2}$$

where (*X*_{i}, *Y*_{i}), (*X*_{j}, *Y*_{j}), and (*X*_{k}, *Y*_{k}) are the *x–y* coordinates of the first, second, and third sensors that read the minimum distances, respectively. Writing $A_{n}=(x-X_{n})^{2}$ and $B_{n}=(y-Y_{n})^{2}$ for $n\in\{i,j,k\}$, and $C=z^{2}$, this is represented in matrix form as:

$$\begin{bmatrix}d_{i}^{2}\\ d_{j}^{2}\\ d_{k}^{2}\end{bmatrix}=\underbrace{\begin{bmatrix}1 & 0 & 0 & 1 & 0 & 0 & 1\\ 0 & 1 & 0 & 0 & 1 & 0 & 1\\ 0 & 0 & 1 & 0 & 0 & 1 & 1\end{bmatrix}}_{W}\begin{bmatrix}A_{i}\\ A_{j}\\ A_{k}\\ B_{i}\\ B_{j}\\ B_{k}\\ C\end{bmatrix}$$

The linear system can be solved for the unknown terms (see Section 4.4), and the *x*, *y*, and *z* coordinates of the object are then calculated as:

$$x_{1}=X_{i}\pm\sqrt{A_{i}},\quad x_{2}=X_{j}\pm\sqrt{A_{j}},\quad x_{3}=X_{k}\pm\sqrt{A_{k}}$$

$$y_{1}=Y_{i}\pm\sqrt{B_{i}},\quad y_{2}=Y_{j}\pm\sqrt{B_{j}},\quad y_{3}=Y_{k}\pm\sqrt{B_{k}}$$

$$z=\sqrt{C}$$

The sign of addition in the above equations depends on whether the object lies before or after the corresponding sensor along each axis.
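As a numeric sanity check of this geometry (a sketch with an invented object location and sensor positions, not measurements from the paper): each sensor at (X_n, Y_n) in the grid plane satisfies d_n² = (x − X_n)² + (y − Y_n)² + z², and once the squared offset terms are known, the coordinates follow by a signed square root:

```python
import math

# Invented example: three sensor centers (cm) in the grid plane (z = 0)
# and an object at a known 3D location.
sensors = [(0.0, 0.0), (10.0, 0.0), (0.0, 2.6)]
x, y, z = 4.0, 1.5, 20.0

# Forward model: squared distance from each sensor to the object.
d2 = [(x - X) ** 2 + (y - Y) ** 2 + z ** 2 for X, Y in sensors]

# Squared offset terms; in the real system these are the unknowns
# recovered by the sparse solver of Section 4.4.
A = [(x - X) ** 2 for X, _ in sensors]   # (x - X_n)^2
B = [(y - Y) ** 2 for _, Y in sensors]   # (y - Y_n)^2
C = z ** 2                               # z^2

# Back-substitution: x_n = X_n +/- sqrt(A_n), y_n = Y_n +/- sqrt(B_n),
# z = sqrt(C); the sign depends on which side of the sensor the object is.
xs = [X + math.copysign(math.sqrt(a), x - X) for (X, _), a in zip(sensors, A)]
ys = [Y + math.copysign(math.sqrt(b), y - Y) for (_, Y), b in zip(sensors, B)]
x_est, y_est, z_est = sum(xs) / 3, sum(ys) / 3, math.sqrt(C)
```

Averaging the three per-sensor estimates is one reasonable way to combine them; the paper does not spell out this final step, so treat it as an assumption of this sketch.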

### 4.4 Algorithm for solving the sparse system

Since the *W* matrix is sparse and the system is underdetermined, the set of equations cannot be solved using the traditional inverse solution. To solve the proposed system of equations, we applied the Polytope Faces Pursuit (PFP) algorithm, which searches for an approximate sparse solution of the underdetermined linear system by greedily selecting faces of the polytope associated with its dual linear program [21].
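PFP itself greedily adds faces of a dual polytope; as a self-contained stand-in, the sketch below finds the minimum-ℓ1 nonnegative solution of a small underdetermined system by brute force over 3-column subsystems (nonnegativity holds because the unknowns are squares). The 3 × 7 structure of W and the example b values are illustrative assumptions, not data from the paper:

```python
from itertools import combinations

# Underdetermined system W s = b, s = [A_i, A_j, A_k, B_i, B_j, B_k, C]:
# each squared distance is one A term plus one B term plus the shared C.
W = [
    [1, 0, 0, 1, 0, 0, 1],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
]
b = [418.25, 438.25, 417.21]  # invented squared distances (cm^2)

def solve_square(A, y):
    """Gauss-Jordan elimination with partial pivoting; None if singular."""
    n = len(y)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        if abs(M[pivot][col]) < 1e-12:
            return None
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [u - f * v for u, v in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def min_l1_nonneg(W, b):
    """Minimum-l1 nonnegative solution over all 3-column supports --
    a brute-force stand-in for the sparse solution PFP finds greedily."""
    m, n = len(W), len(W[0])
    best = None
    for support in combinations(range(n), m):
        sub = [[row[j] for j in support] for row in W]
        sol = solve_square(sub, b)
        if sol is not None and all(v >= -1e-9 for v in sol):
            s = [0.0] * n
            for j, v in zip(support, sol):
                s[j] = v
            if best is None or sum(s) < sum(best):
                best = s
    return best
```

For these b values the sparsest solution puts most of its weight on the shared C column, mirroring how the object's z² term dominates all three squared distances.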

## 5 Experiments

### 5.1 Environment

The testing environment consists of the localization system and the object to be detected, with no obstacles between them. The object measures 5 × 5 cm and has a laser pointer fixed at its center, which is used to compare the measured location with the real location; Fig. 7 shows the object with the laser pointer fixed at its center.

The object is fixed on a handling arm in 3D space for testing purposes and is tested at different *x*, *y*, and *z* locations, as shown in the example in Fig. 8.

### 5.2 Results and analysis

The system shows its capability to detect and localize the object in 3D space with a very acceptable error rate, considering the system's low cost and simple construction. Table 1 compares the measured coordinates of the detected object with its real coordinates at different locations, together with the absolute errors. Figure 9A–D shows examples of the measured object location displayed in 3D space by the proposed localization system.

**Table 1.** Example of measurements

| Location | Real *x* (cm) | Real *y* (cm) | Real *z* (cm) | Detected *x* (cm) | Detected *y* (cm) | Detected *z* (cm) | Abs. error *x* (cm) | Abs. error *y* (cm) | Abs. error *z* (cm) | Rel. diff. *x* (%) | Rel. diff. *y* (%) | Rel. diff. *z* (%) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 0.0 | 0.0 | 10 | 0.3 | 0 | 9.7 | 0.3 | 0.0 | 0.3 | 200 | 0 | −3.046 |
| 2 | −10 | 13 | 12 | −10 | 12.6 | 12.3 | 0.0 | 0.4 | 0.3 | 0 | −3.125 | 2.469 |
| 3 | −10 | −15 | 8.0 | −9.5 | −14.8 | 7.5 | 0.2 | 0.2 | 0.5 | 5.128 | 1.342 | −6.452 |
| 4 | 10 | −14 | 25 | 10 | −13.9 | 24.8 | 0 | 0.1 | 0.2 | 0 | 0.717 | −0.803 |
| 5 | −6.0 | −4.0 | 35 | −6.1 | −4.3 | 34.5 | 0.1 | 0.3 | 0.5 | −1.653 | −7.229 | −1.439 |
| 6 | 0.0 | −5.0 | 40 | 0.3 | −5.0 | 38.8 | 0.3 | 0.0 | 1.2 | 200 | 0 | −3.046 |
| 7 | 5.0 | 10 | 20 | 5.3 | 10 | 20.3 | 0.3 | 0.0 | 0.3 | 5.825 | 0 | 1.489 |
| 8 | 10 | 10 | 50 | 10.1 | 9.7 | 49.5 | 1.0 | 0.3 | 0.5 | 0.995 | −3.046 | −1.005 |

### 5.3 Discussion

The system shows the ability to detect objects with small, acceptable error rates in all directions (*x*, *y*, and *z*) compared with the other methods discussed in the literature, over many of which our system performs better. The proposed system is easy to use, simple, and accurate, in addition to its low cost: only a low-resolution array is needed, and an object is localized using just the three sensors that read the minimum three distances in each acquired data sample. Table 2 compares the proposed method with current methods. The major source of error was the reconstruction algorithm's approximation in finding the unknowns of the system of equations.

**Table 2.** Comparison with current methods

| Reference | Error or accuracy | Method | Cost |
|---|---|---|---|
| [12] | ±7.60 mm | Kalman filter and ultrasonic sensors | Low |
| [13] | ±19.49 mm | FPGA and ultrasonic sensors | High |
| [14] | ±3.8 cm error for *x*, ±1.5 cm for *y*, ±3.5 cm for *z*, and ±0.5° | Genetic algorithm and ultrasonic sensors | High |
| [17] | ±3.33 mm error for frontal sensors, ±6.21 mm for lateral sensors, and ±0.2° for posture | Omnidirectional scanning algorithm and ultrasonic sensors | Medium |
| This paper | ±1.4 mm for *x*, ±1.8 mm for *y*, and ±3.7 mm for *z* | Distance measurement and 3D localization algorithm using a 2D ultrasonic sensor array | Medium |

The algorithm is robust and efficient; it was also tested on artificial location values and gave very accurate results. The PFP algorithm has many advantages over other Matching Pursuit (MP) methods [21]. The proposed system could serve as a real-time 3D localization system for a moving object if the sampling and processing rates of the interface microcontroller were higher. The system has the advantage of detecting an object regardless of environmental factors such as illumination level, object transparency, or temperature, which makes it more reliable than optical or infrared localization systems; Table 3 shows a statistical analysis of the proposed system's error. One source of measurement error is the analog-to-digital converter (ADC): the Arduino's 10-bit ADC has a resolution of about 5 mV, since the ultrasonic sensor's output voltage ranges between 0 and 5 V. A second source of error is the Arduino's sampling frequency of around 10 kHz, which causes a small delay since we use 27 ultrasonic sensors. The last error source is the signal-to-noise ratio (SNR) of the sensors, which is around −3 dB.
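The quoted ADC resolution can be checked directly: a 10-bit converter over a 0–5 V span moves in steps of 5 V / 2¹⁰.

```python
# Step size of a 10-bit ADC spanning 0-5 V, in millivolts.
adc_step_mv = 5.0 / 2**10 * 1000

print(adc_step_mv)  # 4.8828125, i.e. the ~5 mV resolution quoted above
```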

**Table 3.** Statistical analysis of measurement error (cm)

| Axis | Minimum error | Maximum error | Average | Variance | Standard deviation |
|---|---|---|---|---|---|
| *x* | 0.000 | 0.300 | 0.275 | 0.102 | 0.319 |
| *y* | 0.000 | 0.400 | 0.163 | 0.026 | 0.159 |
| *z* | 0.200 | 1.200 | 0.475 | 0.099 | 0.315 |
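The per-axis figures in Table 3 can be reproduced from the absolute-error columns of Table 1 with sample statistics from the standard library; the *y*-axis, for example:

```python
from statistics import mean, variance, stdev

# Absolute y-axis errors (cm) for locations 1-8 in Table 1.
y_err = [0.0, 0.4, 0.2, 0.1, 0.3, 0.0, 0.0, 0.3]

y_mean = mean(y_err)     # 0.1625, reported as 0.163
y_var = variance(y_err)  # sample variance, ~0.026
y_std = stdev(y_err)     # sample standard deviation, ~0.160
```

The computed standard deviation (≈0.160) agrees with the reported 0.159 to rounding.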

## 6 Conclusion and future work

In this paper, we have presented an effective and promising novel 3D localization system based on a 2D ultrasonic sensor array. The system can detect a 5 × 5 cm object at various (*x*, *y*, *z*) locations with small, acceptable error rates in all directions: the proposed system successfully localizes an object at different positions with average errors of ±1.4 mm in the *x*-direction, ±1.8 mm in the *y*-direction, and ±3.7 mm in the *z*-direction, which are very acceptable compared to the results of similar studies using different approaches. The system could be exploited as a human-face topography tool in face recognition systems for biometrics and security purposes. Future work includes enhancing the system's resolution and accuracy by increasing the number of sensors, changing the sensor type to coaxial omnidirectional transceivers, and using a controller with higher sampling and processing rates for real-time localization.

## References

- [1] K. Peter and S. Herbert, "Localization of 3D objects based on ultrasonic data, considering the influence of measurement uncertainty," Proceedings of IEEE Sensors (IEEE Cat. No.03CH37498), 2003.
- [2] A. Jimenez Martin, A. Hernandez Alonso, D. Ruiz, et al., "EMFi-based ultrasonic sensory array for 3D localization of reflectors using positioning algorithms," *IEEE Sensor. J.*, vol. 15, no. 5, pp. 2951–2962, 2015.
- [3] J. Vera-Gómez, A. Quesada-Arencibia, C. García, R. Suárez Moreno, and F. Guerra Hernández, "An intelligent parking management system for urban areas," *Sensors*, vol. 16, no. 6, p. 931, 2016.
- [4] J. Llata, E. Sarabia, and J. Oria, "Three-dimensional robotic vision using ultrasonic sensors," *J. Intel. Rob. Sys.*, vol. 33, no. 3, pp. 267–284, 2002.
- [5] J. Velasco, D. Pizarro, and J. Macias-Guarasa, "Source localization with acoustic sensor arrays using generative model based fitting with sparse constraints," *Sensors*, vol. 12, no. 12, pp. 13781–13812, 2012.
- [6] S. Hirata, M. Kurosawa, and T. Katagiri, "Real-time ultrasonic distance measurements for autonomous mobile robots using cross correlation by single-bit signal processing," 2009 IEEE International Conference on Robotics and Automation, 2009.
- [7] M. Popelka, J. Struska, and M. Struska, "Using ultrasonic sensors to create 3D navigation model of area with ultrasonic sensors," *Int. J. Circ. Sys. Signal Proces.*, vol. 10, pp. 82–87, 2016.
- [8] S. Ahmad, A. Kamal, and I. Mobin, "Ultrasonic sensor based 3D mapping & localization," *Int. J. Comput. Sci. Eng. (IJCSE)*, vol. 8, no. 4, pp. 140–151, 2016.
- [9] B. Kuipers and Y. Byun, "A robot exploration and mapping strategy based on a semantic hierarchy of spatial representations," *Robot. Autonom. Sys.*, vol. 8, no. 1–2, pp. 47–63, 1991.
- [10] J. Guivant and E. Nebot, "Optimization of the simultaneous localization and map-building algorithm for real-time implementation," *IEEE Trans. Robot. Autom.*, vol. 17, no. 3, pp. 242–257, 2001.
- [11] K. Ohtani and M. Baba, *Shape Recognition and Position Measurement of an Object Using an Ultrasonic Sensor Array*, 1st ed. INTECH Open Access Publisher, 2012.
- [12] H.-S. Kim and C. Jong-Suk, "Advanced indoor localization using ultrasonic sensor and digital compass," International Conference on Control, Automation and Systems (ICCAS 2008), IEEE, 2008.
- [13] V. Kunin, W. Jia, M. Turqueti, J. Saniie, and E. Oruklu, "3D direction of arrival estimation and localization using ultrasonic sensors in an anechoic chamber," IEEE International Ultrasonics Symposium (IUS), 2011.
- [14] L. Moreno, J. M. Armingol, S. Garrido, A. De La Escalera, and M. A. Salichs, "A genetic algorithm for mobile robot localization using ultrasonic sensors," *J. Intel. Robot. Sys.*, vol. 34, no. 2, pp. 135–154, 2002.
- [15] S. Kim and Y. Kim, "Robot localization using ultrasonic sensors," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No.04CH37566), 2004.
- [16] B. Ilias, S. Shukor, A. Adom, N. Rahim, M. Ibrahim, and S. Yaacob, "Indoor mobile robot localization using KNN," 2016 6th IEEE International Conference on Control System, Computing and Engineering (ICCSCE), 2016.
- [17] W. Mu, G. Zhang, Y. Huang, X. Yang, H. Liu, and W. Yan, "Omni-directional scanning localization method of a mobile robot based on ultrasonic sensors," *Sensors*, vol. 16, no. 12, p. 2189, 2016.
- [18] Z. Xu, W. Yang, K. You, W. Li, and Y. Kim, "Vehicle autonomous localization in local area of coal mine tunnel based on vision sensors and ultrasonic sensors," *PLoS One*, vol. 12, no. 1, p. e0171012, 2017.
- [19] W. Seo and K. Baek, "Indoor dead reckoning localization using ultrasonic anemometer with IMU," *J. Sensor.*, vol. 2017, pp. 1–12, 2017.
- [20] J. Steckel and H. Peremans, "Acoustic flow-based control of a mobile platform using a 3D sonar sensor," *IEEE Sensor. J.*, vol. 17, no. 10, pp. 3131–3141, 2017.
- [21] M. Plumbley, "Recovery of sparse representations by Polytope Faces Pursuit," International Conference on Independent Component Analysis and Signal Separation, pp. 206–213, 2006.