Abstract
Nowadays, drone imagery is a common way to quickly obtain information on the state of vegetation, as well as a method for producing orthophotos and terrain models. For terrain modelling, adjacent aerial photographs are required to overlap by at least 60%. Typically, inadequate overlap is only discovered later, during post-processing. In our previous work we presented a method we developed to determine on the spot whether the in-flight images are suitable for producing the terrain model or whether it is necessary to re-survey part of the area, which can then be done under the same conditions (weather, ionosphere, satellite geometry). Our proposed method for calculating the overlap differs from the usual procedure: using VBA-based calculations, we computed the overlap between images from the positions of the image centres, the flight altitude and the rotation angle of the images. The method was tested in practice, but we felt it necessary to verify our calculations. During the verification, we checked the VBA-based calculations using a Python script. The test showed that the results obtained with the previous midpoint calculation method were identical to the Python calculations in 92.2 percent of cases. The Python program is accurate and fast; therefore, its use in the field is recommended.
Introduction
3D terrain modelling requires that adjacent images captured by the drone overlap by at least 60%. The processing of the imagery does not necessarily take place immediately after the flight, and the complex photogrammetric process takes a long time. If the point cloud derived from the collected imagery is incomplete, it is not possible to perform a proper 3D reconstruction with the desired quality or to create an elevation model from the images. In this case, parts of the terrain must be re-surveyed. If areas with missing coverage could be identified before the orthophoto is produced, a lot of time could be saved; in addition, the follow-up flight would take place under almost the same conditions. We did not find a similar calculation in the literature. This may be because professional drones used for photogrammetry do not have this problem. However, users of commercial drones may benefit from a program that shows, within a few minutes of the flight, whether there are areas that may need to be re-recorded.
In our previous work (Barna et al., 2019), we presented a method we developed for monitoring drone flight footprints. Our goal was to determine whether the captured imagery sufficiently covered the area and whether any missed shots caused problems. The method can be used directly after the flight, in the field, to determine whether the images provide the necessary overlap and can be used to create the orthophoto and elevation model. The calculations were made with standard Excel VBA, and QGIS was used to display the map. Applying our method in practice, re-sampling of the missing areas can begin within 10–30 min.
In this paper we test the accuracy of the developed method. For this purpose, we looked for easily accessible software that provides the appropriate spatial computations. In addition to verifying the VBA calculations, we aim to make the new method deliver results faster, within 5–10 min, on a device with average computational capacity, without mobile network coverage or internet access.
Drone imaging in practice
Drones, or Small Unmanned Aerial Vehicles (sUAVs), have been with us for a long time. These devices can be used for a variety of useful operations (Niedzielski, 2018), including glacier research (Tielidze et al., 2020) and the study of whale behaviour (Torres et al., 2018), and models with very good capabilities are available for civilian use, which anyone can buy and operate. Over the past decade, these consumer-grade, multi-rotor drones have become viable platforms for performing low-altitude aerial photogrammetry. More and better sensors are being installed on board, and the distance that can be covered is increasing. Since drones are also equipped with GNSS (Global Navigation Satellite System), the need for agricultural applications has also arisen. Cano et al. (2017) found that under controlled conditions, sUAVs can observe a given object or track a route with sufficient accuracy.
Users without specific training in photogrammetry, drone operation or image processing have now become pilots. However, by applying certain rules (Mathews et al., 2023), this work can still be efficient and highly effective.
The high spatial resolution available with drones provides effective support for more accurate analysis than satellite data (Slade et al., 2023). Excellent results can also be achieved with freely downloadable image processing software (Hung et al., 2019), but these require a large amount of data. Modern photogrammetric software has become so advanced that image processing requires minimal user intervention to produce results. However, such software does not always allow the operator to exploit the drone to its maximum potential, making it difficult to adapt flight plans to the conditions of the area or the needs of the client (Radford, 2020). Sometimes, in order to achieve optimal results, it is necessary to dig deeper into the flight plans to ensure that the aircraft collects better data and produces high-quality digital surface models and orthophotos.
If the elevation change is significant, it can be difficult to choose the optimal flight path and altitude. If the flight altitude is optimized for the highest point, the imagery will be consistent, but the spatial resolution over the lower-lying parts of the area will fall short of the requirement. If it is optimized for the lower elevations, the amount of overlap required to produce an accurate 3D model may not always be achieved at higher elevations. It is possible to dynamically plan the flight path of a UAV in 3D (Chowdhury and De, 2023), but this requires a fair amount of preparation and significant resources.
The elevation model produced from drone imagery is much more detailed than downloadable or commercially available models. When passive sensors are used, the boundary surface of the vegetation appears as the topographic surface, so a DSM (Digital Surface Model) is obtained. For the production of a DTM (Digital Terrain Model), this level of detail makes post-processing significantly more difficult (Lakshmi and Yarrakula, 2019). Active sensors such as LIDAR can produce DTM and DSM in one step (Kaňuk et al., 2018). Due to navigational errors, acquisitions at different time points differ from each other by a constant error (Barna and Horváthné Kovács, 2022). These errors can be corrected afterwards by a field survey using the required number of ground control points (GCPs) (San and Suzen, 2005).
The agricultural application of sUAVs has evolved in parallel with the development of their instrumentation, and drone imaging serves an increasing number of information needs (Busznyák, 2022a). Although visible-range vegetation indices exist, the use of the NIR band is strongly recommended for agricultural applications (Busznyák, 2022b). For this reason, it is advisable to capture vegetation with multispectral cameras or with a camera system consisting of several cameras (Olson and Anderson, 2021). In such cases, it often happens that the cameras take their images out of sync, at different times and positions, which can cause problems in assembling the orthomosaics. There are methods to assess geometric accuracy (S Sai et al., 2019), but there are cases where the time between the assessment result and a possible re-survey can significantly affect the measurement result. Since UAVs are often used throughout the entire growing season (Schirrmann et al., 2016), a delay of a few days in some plant phenophases can cause a significant change in results (Duan et al., 2017; Marino and Alvino, 2019). For this reason, it is important that the image set captured during a given flight contains all relevant information (Hung et al., 2019) in the required number of replicates (Jiménez-Jiménez et al., 2021), especially when the study areas have pronounced topography. In this case, flight path and altitude can also affect the quality of the imagery (Hsieh et al., 2023).
Commercially available drones, such as the DJI Phantom 4 PRO sUAV we use, are capable of autonomously surveying an area photogrammetrically with the manufacturer's software support (Adjidjonu and Burgett, 2019; Mulakala, 2019). Since the need for remote sensing in agriculture is mainly driven by the vegetation condition, some areas may not be surveyed under optimal conditions. Problems can be caused by rapidly changing or poor light conditions (Burdziakowski and Bobkowska, 2021), or by variable, excessively strong winds (Ćwiąkała, 2019). In such cases, the UAV tries to follow the set route, taking images at the required frequency and camera settings, but this is not always possible.
Description of the footprint calculation developed
To create a relief model, at least 60% overlap between adjacent images is required. The coordinates of the image centres are displayed, which shows where too few images have been taken. However, it is possible that a survey that appears adequate at a glance does not meet the required standard. This is a problem because the error is normally only detected when the images are processed, so the missing areas can only be filled in later, under different weather conditions (wind and sunlight), which results in images with different colours and shadows. In addition, the positions of the GPS satellites will not be the same, and the disturbances (ionosphere) affecting the position measurement will differ, which may lead to position errors.
For these reasons, it is advisable to carry out the check in the field, immediately after the flight. Producing the orthophoto takes a long time, so we developed a different method of verification.
The necessary calculations were done by a VBA program in Excel, developed with the help of the GeoGebra online apps. We produced the footprints of the images and calculated, for each image, how many of the other images cover it by at least 60%. By footprint, we mean the rectangular area of the Earth's surface that is captured in an image by the drone's camera.
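The footprint construction can be illustrated with a short sketch (our reconstruction, not the original VBA code; the function name and parameters are illustrative). Given the image centre, the ground dimensions of the footprint (which follow from the flight altitude and the camera FOV) and the image rotation angle, the four corners follow from a standard 2D rotation:

```python
import math

def footprint_corners(cx, cy, width, height, angle_deg):
    """Corners of a rectangular footprint centred at (cx, cy) with ground
    dimensions width x height, rotated by angle_deg (the image yaw)."""
    a = math.radians(angle_deg)
    ca, sa = math.cos(a), math.sin(a)
    corners = []
    for sx, sy in ((-1, -1), (1, -1), (1, 1), (-1, 1)):
        dx, dy = sx * width / 2, sy * height / 2
        # rotate the half-diagonal offsets and shift them to the centre
        corners.append((cx + dx * ca - dy * sa, cy + dx * sa + dy * ca))
    return corners
```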
Verification of overlaps calculation
For some images, GeoGebra was used to check the accuracy of the VBA overlap calculation, but we needed a method that performs the calculations quickly even when the number of images runs into the hundreds. We planned to use the footprints of the images to construct the overlapping shapes by geometric calculations and then compare their areas with the original areas.
With elementary coordinate geometry, these calculations are very complex. The sides of the two rectangles being compared intersect at two or four points, or, in extreme cases, they may intersect at two points while a vertex of one rectangle lies on a side of the other. Using elementary geometric methods, it would have been necessary to find all the intersections of the sides of the two rectangles and then calculate the area of the polygon given by the coordinates obtained. Even once the intersection points had been found, the area calculation would not have been easy, since the overlapping shape can be a triangle, quadrilateral, pentagon or hexagon (Fig. 2).
Performing this complex calculation for each pair of images would greatly increase the running time. These problems meant that we had to look for a solution other than Excel.
Materials and methods
The calculations were based on 141 images from a drone survey of a grassland. The survey took place in Iregszemcse on 04 April 2019, with an average relative flight altitude (measured from ground level) of 10.18 m. The images were captured by the FC330 camera of a DJI Phantom 4 drone, which has a field of view (FOV) of 94° and an image size of 4,000 × 3,000 pixels. The average height of the calculated footprints was 11.0 m and the average width was 14.67 m.
The images from the flight had previously been evaluated using the VBA program. The weather was very windy, the effect of which is clearly visible in the flight footprints (Fig. 3) plotted with QGIS. Due to the wind, the images were captured densely, so a wide range of overlap values was obtained. The DJI_0104 and DJI_0108 images are highlighted because they receive special attention in the analysis below.
Another method for calculating overlaps
Spatial operations, including overlap calculations, are readily available in geographic information software. Since QGIS is developed in C++ and Python, we looked for a solution in these languages.
Python is one of today's most popular, versatile programming languages. There are several development environments and many libraries, called packages, which provide ready-made functions for a wide range of tasks.
The Shapely Python package can be used to solve the intersection calculation. The intersection function in this package makes it very easy to calculate the area of the overlapping parts of shapes of arbitrary form and position, so calculating the overlap ratio is not complicated.
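A minimal example of this calculation (the corner coordinates below are illustrative, not values from the survey):

```python
from shapely.geometry import Polygon

# Two footprints given by their corner coordinates (illustrative values)
fp_a = Polygon([(0.0, 0.0), (14.7, 0.0), (14.7, 11.0), (0.0, 11.0)])
fp_b = Polygon([(4.0, 1.5), (18.7, 1.5), (18.7, 12.5), (4.0, 12.5)])

common = fp_a.intersection(fp_b)                   # the overlapping shape
print(f"overlap: {common.area / fp_a.area:.2%}")   # ratio relative to fp_a
```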
…
As a check, the overlap area calculated by Python was compared with the result obtained in GeoGebra; the two were, as expected, identical.
To check the overlaps, we used the coordinates of the footprints of the sample area generated earlier in Excel. The Python code was used to calculate, for each image, how many other images overlapped it by at least 60%. This count was compared with the count calculated by the Excel VBA program.
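The counting step can be sketched as follows (a simplified reconstruction; loading the Excel-generated corner coordinates is omitted):

```python
from shapely.geometry import Polygon

def count_covering_images(footprints, threshold=0.6):
    """For each footprint, count how many other footprints overlap it by
    at least `threshold` of its own area. `footprints` is a list of
    Shapely Polygons built from the exported corner coordinates."""
    counts = []
    for i, base in enumerate(footprints):
        n = sum(1 for j, other in enumerate(footprints)
                if i != j
                and base.intersection(other).area >= threshold * base.area)
        counts.append(n)
    return counts
```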
Finding the causes of discrepancies
Differences between the results of VBA and Python were investigated using GeoGebra and by re-running the calculations step by step.
The effect of rotation on overlap
The calculations in VBA assumed that the sides of the images are parallel to each other, but in the real world this is only approximately true. We therefore used a Python program to investigate the effect of rotation on the area of overlap.
Results
Checking overlaps
The overlaps calculated with VBA do not fully match the results of the Python program. The result of both calculations is a count: the number of images that cover at least 60% of a given image. For the selected flight, the VBA midpoint method agrees with the Python results in 92.2 percent of cases (130/141). Of the 141 images, 11 showed a discrepancy between the two calculations. The largest discrepancy was 2, and this occurred in only two cases (Fig. 4).
The diagram in Fig. 5 shows that a large number of images overlap, but only in a small number of them does the overlap exceed the required 60%.
Reasons for the discrepancies
A specific pair of images was chosen to investigate the discrepancies. In one direction, the VBA program finds that the DJI_0104 image does not overlap the DJI_0108 image by 60%, while Python finds that it does. The Python program references the overlap differently, and in the other direction the opposite happens: Python finds that the DJI_0108 image does not overlap the DJI_0104 image by 60%, while the VBA program finds that it does. We tried to clarify this contradictory result using GeoGebra, where, in addition to the positions of the two rectangles, we also indicated the values used in the calculations (Fig. 6).
The Python program calculates the overlap percentage from the area of the first (reference) rectangle. The areas of the two rectangles are not equal, because the DJI_0108 image was taken from a 20 cm higher altitude. Since the two areas differ, the overlap percentages also differ: 60.73% relative to DJI_0104, which has the smaller area, and only 58.42% relative to DJI_0108 (Table 1); a short numerical check follows the table.
Overlap values of DJI_0104 and DJI_0108 images calculated in Python
                     | Area (m²) | Overlap (%)
Common area (purple) | 98.3493   |
DJI_0104 (green)     | 161.9483  | 60.73
DJI_0108 (blue)      | 168.3615  | 58.42
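The asymmetry can be reproduced directly from the areas in Table 1:

```python
common = 98.3493       # overlapping area (m2)
area_104 = 161.9483    # DJI_0104 footprint area (m2)
area_108 = 168.3615    # DJI_0108 footprint area (m2)

print(f"{common / area_104:.2%}")   # 60.73% -> passes the 60% threshold
print(f"{common / area_108:.2%}")   # 58.42% -> fails it
```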
Figure 7 shows a sketch of the midpoint calculation, which is of course similar to Fig. 6. The explanation below uses the notation of this figure. The VBA program checks whether the centre (O1) of the overlapping rectangle lies closer to the centre (O) of the base rectangle than the point (P) where the hyperbola meets the line connecting the centres. The distance (k) between the centres is calculated simply. To locate the point P on the hyperbola, we first calculate the angle beta, which depends on the slope of the side (alpha) and the slope of the line joining the centres (alpha + beta). Next, we calculate the coordinates of the intersection of the line with the given slope and the hyperbola, from which we determine the distance between points O and P (denoted k_h below). If the distance OO1 (k) is less than or equal to the distance OP (k_h), then the overlap is sufficient.
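For parallel, congruent footprints the criterion can be written down compactly. The sketch below is our reconstruction under those assumptions: for two axis-aligned w × h rectangles whose centres are offset by the vector (t·cos β, t·sin β), the overlap area is (w − t·cos β)(h − t·sin β), so the 60% boundary (the hyperbola) gives a quadratic in t whose smaller positive root is k_h:

```python
import math

def hyperbola_distance(w, h, beta):
    """Distance k_h from the reference centre O to the point P where the
    ray at angle beta (measured from the side of the rectangle) meets the
    60% hyperbola (w - t*cos(beta)) * (h - t*sin(beta)) = 0.6 * w * h."""
    c, s = math.cos(beta), math.sin(beta)
    if s < 1e-12:                  # ray parallel to the w-side
        return 0.4 * w
    if c < 1e-12:                  # ray parallel to the h-side
        return 0.4 * h
    # c*s*t^2 - (w*s + h*c)*t + 0.4*w*h = 0; take the smaller positive root
    a, b, q = c * s, -(w * s + h * c), 0.4 * w * h
    return (-b - math.sqrt(b * b - 4 * a * q)) / (2 * a)

def overlap_ok(k, w, h, beta):
    """The VBA-style check: overlap is sufficient if the distance k between
    the two centres does not exceed the hyperbola distance k_h."""
    return k <= hyperbola_distance(w, h, beta)
```

As a sanity check, for β = 0 the formula gives k_h = 0.4w, since (w − 0.4w)·h = 0.6·w·h: a pure sideways shift of 40% of the width leaves exactly 60% overlap.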
In the calculations we assumed that the sides of the images are parallel to each other, but in reality this is only approximately true. In addition to the differing footprints, the slight difference in the slopes of the sides also affected the distances. Thus, it is possible to obtain different results even though both distances differ by less than 10 cm (Table 2).
Overlap values of DJI_0104 and DJI_0108 images calculated with VBA
Base (reference) picture                               | DJI_0108.JPG     | DJI_0104.JPG
Slope of the line joining the midpoints (k1)           | −0.8780          | −0.8780
Slope of the sides of the base rectangle (k2, k3)      | (k2) −0.9099     | (k3) −0.9035
Slope of the centre line relative to the side (kx)     | 0.01771          | 0.01422
Distance between the centres (k)                       | 4.4366           | 4.4366
Distance of the hyperbola point from the centre (k_h)  | 4.3740           | 4.4665
Check                                                  | k > k_h          | k < k_h
Result                                                 | does not overlap | overlaps
The impact of rotation
The sides of the footprints are not parallel, so we investigated the effect of rotation. Using a Python program, the footprint of the DJI_0108 image was rotated around its centre point in 5° steps between −90° and +90°, and the percentage overlap relative to the footprint of the DJI_0104 image was recorded. The result of the rotation is shown in Fig. 8. The overlap was largest at the baseline (rotation = 0) and then decreased as the rotation progressed, but after a while it started to increase again, although it never reached 60% a second time. It can also be seen that the curve is not symmetric about rotation 0. The minimum was 59.02% and the maximum 60.74%, neither differing greatly from 60%.
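The rotation sweep can be reproduced with Shapely's affine transformations (a sketch of the procedure; `base` and `moving` stand for the DJI_0104 and DJI_0108 footprint polygons):

```python
from shapely.affinity import rotate

def overlap_vs_rotation(base, moving, step=5):
    """Rotate `moving` about its own centroid from -90 to +90 degrees in
    `step`-degree increments and record the overlap relative to `base`."""
    return [(angle,
             base.intersection(rotate(moving, angle, origin='centroid')).area
             / base.area)
            for angle in range(-90, 91, step)]
```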
We then examined the effect of rotation on overlap in more detail. Our preliminary expectation was that, without rotation, i.e. with parallel sides, the overlap would be largest.
We took two congruent rectangles with a 3:4 aspect ratio (Fig. 7). In the base rectangle (ABCD) we drew the hyperbola given by the previous equation. The hyperbola was intersected by lines drawn from point W at 15° intervals from 0 to 90°, giving the points O1, …, O7. The centre of the overlapping rectangle (H) was then placed in turn at the points created on the hyperbola and rotated counter-clockwise between −90° and +90° in 5° steps, while the overlap percentage was calculated and recorded. The resulting overlap percentage curves are shown in Fig. 9.
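Combining the earlier sketches gives an outline of this experiment (our reconstruction, assuming the rays are drawn from the centre of the base rectangle; `hyperbola_distance` is the function defined above):

```python
import math
from shapely.geometry import Polygon
from shapely.affinity import rotate, translate

w, h = 4.0, 3.0                                    # 3:4 aspect ratio
base = Polygon([(-w/2, -h/2), (w/2, -h/2), (w/2, h/2), (-w/2, h/2)])

curves = {}
for beta_deg in range(0, 91, 15):                  # rays giving O1 ... O7
    b = math.radians(beta_deg)
    t = hyperbola_distance(w, h, b)                # distance to the hyperbola
    moving = translate(base, t * math.cos(b), t * math.sin(b))
    curves[beta_deg] = [                           # overlap vs. rotation angle
        (angle,
         base.intersection(rotate(moving, angle, origin='centroid')).area
         / base.area)
        for angle in range(-90, 91, 5)
    ]
```

At angle = 0 each curve starts at exactly 60%, since the centres O1–O7 lie on the 60% hyperbola by construction.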
The results did not confirm our hypothesis: the curves show no obvious regularity. Without rotation, the overlap is always exactly 60% at the hyperbola points (O1–O7). Only at the two endpoints of the hyperbola (O1 and O7) is the curve symmetric, i.e. only there does the direction of rotation not matter. In the other cases, the position of the centre and the direction and angle of rotation all affected the overlap. The greatest number of overlaps above 60% was found for O3, where only two rotation values gave slightly less than 60% overlap (still above 59.95%).
Conclusions
The popular Python language is suitable for testing the VBA footprint calculation method. Python has numerous packages; Shapely is designed specifically for planar geometric calculations, so it can calculate the footprint overlap of two drone images.
The Python calculation confirmed the usefulness of the developed VBA method. The Python method naturally gives accurate results, but the VBA results are also acceptable, so the method can be used even if the user only has Excel. The results show that both methods are suitable for on-site verification of drone flights.
The Python script runs faster than the VBA program and provides the results within 5–10 min on a single laptop. The calculation does not require internet access.
The data produced within 10 min can be used to decide whether there is sufficient overlapping footage or whether additional footage should be taken with the drone. The conditions of the new flight are nearly identical, so the new images can be merged with the earlier ones.
References
Adjidjonu, D. and Burgett, J. (2019). Optimal UAS parameters for aerial mapping and modeling. In: 55th ASC Annual International Conference, University of Denver, 10–13 April 2019, Available at: http://ascpro0.ascweb.org/archives/cd/2019/paper/CPRT255002019.pdf (Accessed 23 August 2023).
Barna, R. and Horváthné Kovács, B. (2022). Elevation models on a sample area of Bőszénfa. Journal of Agricultural Informatics, 12(2): 9–17, https://doi.org/10.17700/jai.2021.12.2.600.
Barna, R., Solymosi, K., and Stettner, E. (2019). Mathematical analysis of drone flight path. Journal of Agricultural Informatics, 10(2): 15–27, https://doi.org/10.17700/jai.2019.10.2.533.
Burdziakowski, P. and Bobkowska, K. (2021). UAV photogrammetry under poor lighting conditions-accuracy considerations. Sensors, 21(10): 3531, https://doi.org/10.3390/s21103531.
Busznyák, J. (2022a). Collection and processing of multispectral remote sensing data. Agricultural Technology, 63(4): 2–5.
Busznyák, J. (2022b). Vegetation analysis possibilities using visible-range drone imagery. Journal of Central European Green Innovation, 10(2): 3–17, https://doi.org/10.33038/jcegi.3463.
Cano, E., Horton, R., Liljegren, C., and Bulanon, D. (2017). Comparison of small unmanned aerial vehicles performance using image processing. Journal of Imaging, 3(1): 4, https://doi.org/10.3390/jimaging3010004.
Chowdhury, A. and De, D. (2023). rGSO-UAV: reverse Glowworm Swarm Optimization inspired UAV path-planning in a 3D dynamic environment. Ad Hoc Networks, 140: 103068, https://doi.org/10.1016/j.adhoc.2022.103068.
Ćwiąkała, P. (2019). Testing procedure of unmanned aerial vehicles (UAVs) trajectory in automatic missions. Applied Sciences, 9(17): 3488, https://doi.org/10.3390/app9173488.
Duan, T., Chapman, S.C., Guo, Y., and Zheng, B. (2017). Dynamic monitoring of NDVI in wheat agronomy and breeding trials using an unmanned aerial vehicle. Field Crops Research, 210: 71–80, https://doi.org/10.1016/j.fcr.2017.05.025.
Hsieh, C.-S., Hsiao, D.-H., and Lin, D.-Y. (2023). Contour mission flight planning of UAV for photogrammetry in hillside areas. Applied Sciences, 13(13): 7666, https://doi.org/10.3390/app13137666.
Hung, I.-K., Unger, D., Kulhavy, D., and Zhang, Y. (2019). Positional precision analysis of orthomosaics derived from drone captured aerial imagery. Drones, 3(2): 46, https://doi.org/10.3390/drones3020046.
Jiménez-Jiménez, S.I., Ojeda-Bustamante, W., Marcial-Pablo, M., and Enciso, J. (2021). Digital terrain models generated with low-cost UAV photogrammetry: methodology and accuracy. ISPRS International Journal of Geo-Information., 10(5): 285, https://doi.org/10.3390/ijgi10050285.
Kaňuk, J., Gallay, M., Eck, C., Zgraggen, C., and Dvorný, E. (2018). Technical report: unmanned helicopter solution for survey-grade lidar and hyperspectral mapping. Pure and Applied Geophysics, 175(9): 3357–3373, https://doi.org/10.1007/s00024-018-1873-2.
Lakshmi, S.E. and Yarrakula, K. (2019). Review and critical analysis on digital elevation models. Geofizika, 35(2): 129–157, https://doi.org/10.15233/gfz.2018.35.7.
Marino, S. and Alvino, A. (2019). Detection of spatial and temporal variability of wheat cultivars by high-resolution vegetation indices. Agronomy, 9(5): 226, https://doi.org/10.3390/agronomy9050226.
Mathews, A.J., Singh, K.K., Cummings, A.R., and Rogers, S.R. (2023). Fundamental practices for drone remote sensing research across disciplines. Drone Systems and Applications, 11: 1–22, https://doi.org/10.1139/dsa-2023-0021.
Mulakala, J. (2019). Measurement accuracy of the DJI Phantom 4 RTK & photogrammetry. DroneDeploy, Available at: https://www.gim-international.com/files/23b0ad77f81a0aa56e8c83f8c4300270.pdf (Accessed 08 July 2023).
Niedzielski, T. (2018). Applications of unmanned aerial vehicles in geosciences: introduction. Pure and Applied Geophysics, 175: 3141–3144, https://doi.org/10.1007/s00024-018-1992-9.
Olson, D. and Anderson, J. (2021). Review on unmanned aerial vehicles, remote sensors, imagery processing, and their applications in agriculture. Agronomy Journal, 113(2): 971–992, https://doi.org/10.1002/agj2.20595.
Radford, C.R. (2020). Best practices when using multi-rotor consumer UAVs for photogrammetric mapping: limitations and possible solutions, Master's Thesis. Queen's University, Kingston (Canada), https://qspace.library.queensu.ca/items/25f4b5e7-0acd-4d01-ad88-2bf6ff588d41.
S Sai, S., Tjahjadi, M.E., and A Rokhmana, C. (2019). Geometric accuracy assessments of orthophoto production from UAV aerial images. KnE Engineering, 4(3): 333–344, https://doi.org/10.18502/keg.v4i3.5876.
San, B.T. and Suzen, M.L. (2005). Digital elevation model (DEM) generation and accuracy assessment from ASTER stereo data. International Journal of Remote Sensing, 26(22): 5013–5027, https://doi.org/10.1080/01431160500177620.
Schirrmann, M., Giebel, A., Gleiniger, F., Pflanz, M., Lentschke, J., and Dammer, K.-H. (2016). Monitoring agronomic parameters of winter wheat crops with low-cost UAV imagery. Remote Sensing, 8(9): 706, https://doi.org/10.3390/rs8090706.
Slade, G., Fawcett, D., Cunliffe, A.M., Brazier, R.E., Nyaupane, K., Mauritz, M., Vargas, S., and Anderson, K. (2023). Optical reflectance across spatial scales-an intercomparison of transect-based hyperspectral, drone, and satellite reflectance data for dry season rangeland. Drone Systems and Applications, 11: 1–20, https://doi.org/10.1139/dsa-2023-0003.
Tielidze, L.G., Svanadze, D., Gadrani, L., Asanidze, L., Wheate, R.D., and Hamilton, G.S. (2020). A 54-year record of changes at Chalaati and Zopkhito glaciers, Georgian Caucasus, observed from archival maps, satellite imagery, drone survey and ground-based investigation. Hungarian Geographical Bulletin, 69(2): 175–189, http://dx.doi.org/10.15201/hungeobull.69.2.6.
Torres, L.G., Nieukirk, S.L., Lemos, L., and Chandler, T.E. (2018). Drone up! Quantifying whale behavior from a new perspective improves observational capacity. Frontiers in Marine Science, 5, https://doi.org/10.3389/fmars.2018.00319.