Citations
[1] T. Dang, S. Khattak, F. Mascarich, and K. Alexis, “Explore locally, plan globally: A path planning framework for autonomous robotic exploration in subterranean environments,” in 2019 19th International Conference on Advanced Robotics (ICAR). IEEE, 2019, pp. 9–16.
[2] Y. Chang, K. Ebadi, C. E. Denniston, M. F. Ginting, A. Rosinol, A. Reinke, M. Palieri, J. Shi, A. Chatterjee, B. Morrell et al., “Lamp 2.0: A robust multi-robot slam system for operation in challenging large-scale underground environments,” IEEE Robotics and Automation Letters, vol. 7, no. 4, pp. 9175–9182, 2022.
[3] F. Vanegas, K. J. Gaston, J. Roberts, and F. Gonzalez, “A framework for uav navigation and exploration in gps-denied environments,” in 2019 IEEE Aerospace Conference. IEEE, 2019, pp. 1–6.
[4] C. Tatsch, J. A. B. Jnr, D. Covell, I. B. Tulu, and Y. Gu, “Rhino: An autonomous robot for mapping underground mine environments,” arXiv preprint arXiv:2305.06958, 2023.
[5] M. Á. Muñoz-Bañón, F. A. Candelas, and F. Torres, “Targetless camera-lidar calibration in unstructured environments,” IEEE Access, vol. 8, pp. 143692–143705, 2020.
[6] B. Lindqvist, A. A. Agha-Mohammadi, and G. Nikolakopoulos, “Exploration-RRT: A multi-objective path planning and exploration framework for unknown and unstructured environments,” in 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2021, pp. 3429–3435. [Online]. Available: https://ieeexplore.ieee.org/abstract/document/9636243/
[7] D. Trivun, E. Šalaka, D. Osmanković, J. Velagić, and N. Osmić, “Active slam-based algorithm for autonomous exploration with mobile robot,” in 2015 IEEE International Conference on Industrial Technology (ICIT). IEEE, 2015, pp. 74–79.
[8] T. Dang, F. Mascarich, S. Khattak, H. Nguyen, H. Nguyen, S. Hirsh, R. Reinhart, C. Papachristos, and K. Alexis, “Autonomous search for underground mine rescue using aerial robots,” in 2020 IEEE Aerospace Conference. IEEE, 2020, pp. 1–8.
[9] R. Almadhoun, T. Taha, L. Seneviratne, J. Dias, and G. Cai, “A survey on inspecting structures using robotic systems,” International Journal of Advanced Robotic Systems, vol. 13, no. 6, p. 1729881416663664, 2016.
[23] F. Amigoni, “Experimental evaluation of some exploration strategies for mobile robots,” in 2008 IEEE International Conference on Robotics and Automation. IEEE, 2008, pp. 2818–2823.
[24] A. G. Bachrach, “Trajectory bundle estimation for perception-driven planning,” Ph.D. dissertation, Massachusetts Institute of Technology, 2013.
[25] F. Furrer, M. Burri, M. Achtelik, and R. Siegwart, “Rotors—a modular gazebo mav simulator framework,” Robot Operating System (ROS) The Complete Reference (Volume 1), pp. 595–625, 2016.
[26] A. Santamaria-Navarro, R. Thakker, D. D. Fan, B. Morrell, and A.-a. Agha-mohammadi, “Towards resilient autonomous navigation of drones,” in The International Symposium of Robotics Research. Springer, 2019, pp. 922–937.
[27] A. Reinke, M. Palieri, B. Morrell, Y. Chang, K. Ebadi, L. Carlone, and A.-A. Agha-Mohammadi, “Locus 2.0: Robust and computationally efficient lidar odometry for real-time 3d mapping,” IEEE Robotics and Automation Letters, vol. 7, no. 4, pp. 9043–9050, 2022.
[28] C. Cadena, L. Carlone, H. Carrillo, Y. Latif, D. Scaramuzza, J. Neira, I. Reid, and J. J. Leonard, “Past, present, and future of simultaneous localization and mapping: Toward the robust-perception age,” IEEE Transactions on robotics, vol. 32, no. 6, pp. 1309–1332, 2016.
[29] J. Zhou, G. Cui, S. Hu, Z. Zhang, C. Yang, Z. Liu, L. Wang, C. Li, and M. Sun, “Graph neural networks: A review of methods and applications,” AI open, vol. 1, pp. 57–81, 2020.
[30] H. Yang, J. Shi, and L. Carlone, “Teaser: Fast and certifiable point cloud registration,” IEEE Transactions on Robotics, vol. 37, no. 2, pp. 314–333, 2020.
[31] R. B. Rusu, N. Blodow, and M. Beetz, “Fast point feature histograms (fpfh) for 3d registration,” in 2009 IEEE international conference on robotics and automation. IEEE, 2009, pp. 3212–3217.
[32] A. Segal, D. Haehnel, and S. Thrun, “Generalized-icp.” in Robotics: science and systems, vol. 2, no. 4. Seattle, WA, 2009, p. 435.
[33] E. Jones, J. Sofonia, C. Canales, S. Hrabar, and F. Kendoul, “Applications for the hovermap autonomous drone system in underground mining operations,” Journal of the Southern African Institute of Mining and Metallurgy, vol. 120, no. 1, pp. 49–56, 2020.
[34] A. A. Ravankar, A. Ravankar, Y. Kobayashi, and T. Emaru, “Autonomous mapping and exploration with unmanned aerial vehicles using low cost sensors,” Multidisciplinary digital publishing institute proceedings, vol. 4, no. 1, p. 44, 2018.
[35] S. P. Yamaguchi, M. Sakuma, T. Ueno, F. Karolonek, T. Uhl, A. A. Ravankar, T. Emaru, and Y. Kobayashi, “Aerial robot model based design and verification of the single and multi-agent inspection application development,” arXiv preprint arXiv:1812.05296, 2018.
[36] M. Sakuma, Y. Kobayashi, T. Emaru, and A. A. Ravankar, “Mapping of pier substructure using uav,” in 2016 IEEE/SICE International Symposium on System Integration (SII). IEEE, 2016, pp. 361–366.
[37] A. Ravankar, A. A. Ravankar, Y. Kobayashi, and T. Emaru, “On a bio-inspired hybrid pheromone signalling for efficient map exploration of multiple mobile service robots,” Artificial life and robotics, vol. 21, pp. 221–231, 2016.
[38] A. Ravankar, A. A. Ravankar, Y. Kobayashi, Y. Hoshino, C.-C. Peng, and M. Watanabe, “Hitchhiking based symbiotic multi-robot navigation in sensor networks,” Robotics, vol. 7, no. 3, p. 37, 2018.
[39] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics (Intelligent Robotics and Autonomous Agents). MIT Press, 2005.
[40] V. Kumar and N. Michael, “Opportunities and challenges with autonomous micro aerial vehicles,” The International Journal of Robotics Research, vol. 31, no. 11, pp. 1279–1291, 2012.
[41] S. Grzonka, G. Grisetti, and W. Burgard, “A fully autonomous indoor quadrotor,” IEEE Transactions on Robotics, vol. 28, no. 1, pp. 90–100, 2011.
[42] A. Ravankar, A. A. Ravankar, Y. Kobayashi, and T. Emaru, “Symbiotic navigation in multi-robot systems with remote obstacle knowledge sharing,” Sensors, vol. 17, no. 7, p. 1581, 2017.
[43] B. Yamauchi, “A frontier-based approach for autonomous exploration,” in Proceedings 1997 IEEE International Symposium on Computational Intelligence in Robotics and Automation CIRA’97.’Towards New Computational Principles for Robotics and Automation’. IEEE, 1997, pp. 146–151.
[44] D. Fox, W. Burgard, and S. Thrun, “The dynamic window approach to collision avoidance,” IEEE Robotics & Automation Magazine, vol. 4, no. 1, pp. 23–33, 1997.
[45] A. A. Ravankar, Y. Kobayashi, and T. Emaru, “Clustering based loop closure technique for 2d robot mapping based on ekf-slam,” in 2013 7th Asia Modelling Symposium. IEEE, 2013, pp. 72–77.
[46] F. Vanegas and F. Gonzalez, “Enabling uav navigation with sensor and environmental uncertainty in cluttered and gps-denied environments,” Sensors, vol. 16, no. 5, p. 666, 2016.
[47] F. Vanegas, D. Campbell, N. Roy, K. J. Gaston, and F. Gonzalez, “Uav tracking and following a ground target under motion and localisation uncertainty,” in 2017 IEEE Aerospace Conference. IEEE, 2017, pp. 1–10.
[48] R. Mur-Artal and J. D. Tardós, “Orb-slam2: An open-source slam system for monocular, stereo, and rgb-d cameras,” IEEE transactions on robotics, vol. 33, no. 5, pp. 1255–1262, 2017.
[49] M. Labbe and F. Michaud, “Online global loop closure detection for large-scale multi-session graph-based slam,” in 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2014, pp. 2661–2666.
[50] A. S. Huang, A. Bachrach, P. Henry, M. Krainin, D. Maturana, D. Fox, and N. Roy, “Visual odometry and mapping for autonomous flight using an rgb-d camera,” in Robotics Research: The 15th International Symposium ISRR. Springer, 2017, pp. 235–252.
[51] G. Zhou, L. Fang, K. Tang, H. Zhang, K. Wang, and K. Yang, “Guidance: A visual sensing platform for robotic applications,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, 2015, pp. 9–14.
[52] J. Pineau, G. Gordon, and S. Thrun, “Anytime point-based approximations for large pomdps,” Journal of Artificial Intelligence Research, vol. 27, pp. 335–380, 2006.
[53] A. Guez and J. Pineau, “Multi-tasking slam,” in 2010 IEEE International Conference on Robotics and Automation. IEEE, 2010, pp. 377–384.
[54] A. Hornung, K. M. Wurm, M. Bennewitz, C. Stachniss, and W. Burgard, “Octomap: An efficient probabilistic 3d mapping framework based on octrees,” Autonomous robots, vol. 34, pp. 189–206, 2013.
[55] D. Klimenko, J. Song, and H. Kurniawati, “Tapir: A software toolkit for approximating and adapting pomdp solutions online,” in Proceedings of the Australasian Conference on Robotics and Automation, Melbourne, Australia, vol. 24, 2014.
[56] H. Kurniawati and V. Yadav, “An online pomdp solver for uncertainty planning in dynamic environment,” in Robotics Research: The 16th International Symposium ISRR. Springer, 2016, pp. 611–629.
[57] S. Khattak, H. Nguyen, F. Mascarich, T. Dang, and K. Alexis, “Complementary multi–modal sensor fusion for resilient robot pose estimation in subterranean environments,” in 2020 International Conference on Unmanned Aircraft Systems (ICUAS). IEEE, 2020, pp. 1024–1029.
[58] M. Ramezani, K. Khosoussi, G. Catt, P. Moghadam, J. Williams, P. Borges, F. Pauling, and N. Kottege, “Wildcat: Online continuous-time 3d lidar-inertial slam,” arXiv preprint arXiv:2205.12595, 2022.
[59] A. Agha, K. Otsu, B. Morrell, D. D. Fan, R. Thakker, A. Santamaria-Navarro, S.-K. Kim, A. Bouman, X. Lei, J. Edlund et al., “Nebula: Quest for robotic autonomy in challenging environments; team costar at the darpa subterranean challenge,” arXiv preprint arXiv:2103.11470, 2021.
[60] T. Shan, B. Englot, D. Meyers, W. Wang, C. Ratti, and D. Rus, “Lio-sam: Tightly-coupled lidar inertial odometry via smoothing and mapping,” in 2020 IEEE/RSJ international conference on intelligent robots and systems (IROS). IEEE, 2020, pp. 5135–5142.
[61] N. El-Sheimy, H. Hou, and X. Niu, “Analysis and modeling of inertial sensors using allan variance,” IEEE Transactions on instrumentation and measurement, vol. 57, no. 1, pp. 140–149, 2007.
[62] R. G. Valenti, I. Dryanovski, and J. Xiao, “Keeping a good attitude: A quaternion-based orientation filter for imus and margs,” Sensors, vol. 15, no. 8, pp. 19302–19330, 2015.
[63] A. Geiger, F. Moosmann, Ö. Car, and B. Schuster, “Automatic camera and range sensor calibration using a single shot,” in 2012 IEEE international conference on robotics and automation. IEEE, 2012, pp. 3936–3943.
[64] S. Verma, J. S. Berrio, S. Worrall, and E. Nebot, “Automatic extrinsic calibration between a camera and a 3d lidar using 3d point and plane correspondences,” in 2019 IEEE Intelligent Transportation Systems Conference (ITSC). IEEE, 2019, pp. 3906–3912.
[65] L. Zhou, Z. Li, and M. Kaess, “Automatic extrinsic calibration of a camera and a 3d lidar using line and plane correspondences,” in 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2018, pp. 5562–5569.
[66] J. Kümmerle, T. Kühner, and M. Lauer, “Automatic calibration of multiple cameras and depth sensors with a spherical target,” in 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2018, pp. 1–8.
[67] J.-E. Ha, “Extrinsic calibration of a camera and laser range finder using a new calibration structure of a plane with a triangular hole,” International Journal of Control, Automation and Systems, vol. 10, no. 6, pp. 1240–1244, 2012.
[68] Y. Park, S. Yun, C. S. Won, K. Cho, K. Um, and S. Sim, “Calibration between color camera and 3d lidar instruments with a polygonal planar board,” Sensors, vol. 14, no. 3, pp. 5333–5353, 2014.
[69] X. Gong, Y. Lin, and J. Liu, “3d lidar-camera extrinsic calibration using an arbitrary trihedron,” Sensors, vol. 13, no. 2, pp. 1902–1918, 2013.
[70] K. H. Strobl and G. Hirzinger, “Optimal hand-eye calibration,” in 2006 IEEE/RSJ international conference on intelligent robots and systems. IEEE, 2006, pp. 4647–4653.
[71] A. Geiger, P. Lenz, C. Stiller, and R. Urtasun, “Vision meets robotics: The kitti dataset,” The International Journal of Robotics Research, vol. 32, no. 11, pp. 1231–1237, 2013.
[72] J. Castorena, U. S. Kamilov, and P. T. Boufounos, “Autocalibration of lidar and optical cameras via edge alignment,” in 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2016, pp. 2862–2866.
[73] J. Jiang, P. Xue, S. Chen, Z. Liu, X. Zhang, and N. Zheng, “Line feature based extrinsic calibration of lidar and camera,” in 2018 IEEE International Conference on Vehicular Electronics and Safety (ICVES). IEEE, 2018, pp. 1–6.
[74] S. Thrun, W. Burgard, and D. Fox, “A real-time algorithm for mobile robot mapping with applications to multi-robot and 3d mapping,” in Proceedings 2000 ICRA. Millennium Conference. IEEE International Conference on Robotics and Automation. Symposia Proceedings (Cat. No. 00CH37065), vol. 1. IEEE, 2000, pp. 321–328.
[75] F. Amigoni, S. Gasparini, and M. Gini, “Building segment-based maps without pose information,” Proceedings of the IEEE, vol. 94, no. 7, pp. 1340–1359, 2006.
[76] F. Lu and E. Milios, “Robot pose estimation in unknown environments by matching 2d range scans,” Journal of Intelligent and Robotic systems, vol. 18, pp. 249–275, 1997.
[77] T. Dang, F. Mascarich, S. Khattak, C. Papachristos, and K. Alexis, “Graph-based path planning for autonomous robotic exploration in subterranean environments,” in 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2019, pp. 3105–3112.
[78] S. Khattak, C. Papachristos, and K. Alexis, “Keyframe-based direct thermal–inertial odometry,” in 2019 International Conference on Robotics and Automation (ICRA). IEEE, 2019, pp. 3563–3569.
[79] T. Dang, F. Mascarich, S. Khattak, H. Nguyen, N. Khedekar, C. Papachristos, and K. Alexis, “Field-hardened robotic autonomy for subterranean exploration,” Field and Service Robotics (FSR), 2019.
[80] K. Verbiest, S. A. Berrabah, and E. Colon, “Autonomous frontier based exploration for mobile robots,” in Intelligent Robotics and Applications: 9th International Conference, ICIRA 2015, Portsmouth, UK, August 24–27, 2015, Proceedings, Part III. Springer, 2015, pp. 3–13.
[81] C. Balaguer, R. Montero, J. Victores, S. Martínez, and A. Jardón, “Towards fully automated tunnel inspection: A survey and future trends,” in ISARC. Proceedings of the International Symposium on Automation and Robotics in Construction, vol. 31. IAARC Publications, 2014, p. 1. [Online]. Available: https://www.researchgate.net/profile/Juan-Victores/publication/269695712_Towards_Fully_Automated_Tunnel_Inspection_A_Survey_and_Future_Trends/links/54aa77cc0cf2bce6aa1d225d/Towards-Fully-Automated-Tunnel-Inspection-A-Survey-and-Future-Trends.pdf
[82] E. Gambao and C. Balaguer, “Robotics and automation in construction [guest editors],” IEEE Robotics & Automation Magazine, vol. 9, no. 1, pp. 4–6, 2002.
[83] C. Balaguer and M. Abderrahim, Robotics and automation in construction. BoD–Books on Demand, 2008.
[84] C. Balaguer and J. Victores, “Robotic tunnel inspection and repair,” in Technology Innovation in Underground Construction, 2010.
[85] W. Bergeson, S. L. Ernst et al., “Tunnel operations, maintenance, inspection, and evaluation (TOMIE) manual,” United States Federal Highway Administration, Tech. Rep., 2015.
[86] International Atomic Energy Agency, “Guidebook on non-destructive testing of concrete structures,” Training Course Series No. 17, 2002.
[87] M. J. Sansalone and W. B. Streett, “Impact-echo: Nondestructive evaluation of concrete and masonry,” 1997.
[88] N. J. Carino, “The impact-echo method: an overview,” Structures 2001: A Structural Engineering Odyssey, pp. 1–18, 2001.
[89] M. Sansalone and N. J. Carino, “Detecting delaminations in concrete slabs with and without overlays using the impact-echo method,” Materials Journal, vol. 86, no. 2, pp. 175–184, 1989.
[90] T. Powers, “Measuring young’s modulus of elasticity by means of sonic vibrations,” in Proc. ASTM, vol. 38, no. Part 2, 1938.
[91] L. Obert, “Sonic method of determining the modulus of elasticity of building materials under pressure,” in Proc. ASTM, vol. 39, 1939, p. 987.
[92] H. Toutanji, “Ultrasonic wave velocity signal interpretation of simulated concrete bridge decks,” Materials and Structures, vol. 33, pp. 207–215, 2000.
[93] R. B. Polder, “Test methods for on site measurement of resistivity of concrete—a rilem tc-154 technical recommendation,” Construction and building materials, vol. 15, no. 2-3, pp. 125–131, 2001.
[94] J.-F. Lataste, C. Sirieix, D. Breysse, and M. Frappa, “Electrical resistivity measurement applied to cracking assessment on reinforced concrete structures in civil engineering,” NDT & E International, vol. 36, no. 6, pp. 383–394, 2003.
[95] W. J. McCarter and Ø. Vennesland, “Sensor systems for use in reinforced concrete structures,” Construction and Building Materials, vol. 18, no. 6, pp. 351–358, 2004.
[96] E. Nakamura, H. Watanabe, H. Koga, M. Nakamura, and K. Ikawa, “Half-cell potential measurements to assess corrosion risk of reinforcement steels in a pc bridge,” in RILEM symposium on on site assessment of concrete, masonry and timber structures-SACoMaTiS, 2008, pp. 109–117.
[97] J. Richards, “Inspection, maintenance and repair of tunnels: international lessons and practice,” Tunnelling and Underground Space Technology, vol. 13, no. 4, pp. 369–375, 1998.
[98] D. Wu and G. Busse, “Lock-in thermography for nondestructive evaluation of materials,” Revue générale de thermique, vol. 37, no. 8, pp. 693–703, 1998.
[99] N. Avdelidis and A. Moropoulou, “Applications of infrared thermography for the investigation of historic structures,” Journal of Cultural Heritage, vol. 5, no. 1, pp. 119–127, 2004.
[100] O. Büyüköztürk, “Imaging of concrete structures,” NDT & E International, vol. 31, no. 4, pp. 233–243, 1998.
[101] D. McCann and M. Forde, “Review of ndt methods in the assessment of concrete and masonry structures,” NDT & E International, vol. 34, no. 2, pp. 71–84, 2001.
[102] C. Colla, M. Krause, C. Maierhofer, H.-J. Höhberger, and H. Sommer, “Combination of ndt techniques for site investigation of non-ballasted railway tracks,” NDT & E International, vol. 35, no. 2, pp. 95–105, 2002.
[103] Y.-K. Zhu, G.-Y. Tian, R.-S. Lu, and H. Zhang, “A review of optical ndt technologies,” Sensors, vol. 11, no. 8, pp. 7773–7798, 2011.
[104] K. Loupos, A. D. Doulamis, C. Stentoumis, E. Protopapadakis, K. Makantasis, N. D. Doulamis, A. Amditis, P. Chrobocinski, J. Victores, R. Montero et al., “Autonomous robotic system for tunnel structural inspection and assessment,” International Journal of Intelligent Robotics and Applications, vol. 2, pp. 43–66, 2018.
[105] R. Montero, J. G. Victores, S. Martínez, A. Jardón, and C. Balaguer, “Past, present and future of robotic tunnel inspection,” Automation in Construction, vol. 59, pp. 99–112, 2015.