Development of a user interface for the teleoperation of a robotic manipulator using virtual reality tools at the Nakama Robotics Lab of the University of Twente (original title: Desarrollo de una interfaz de usuario para la teleoperación de un robot manipulador usando herramientas de realidad virtual en el laboratorio de robótica Nakama de la Universidad de Twente)


Authors:
Rivera Arbeláez, Juan Pablo
Resource type:
Undergraduate degree project
Publication date:
2025
Institution:
Universidad Autónoma de Occidente
Repository:
RED: Repositorio Educativo Digital UAO
Language:
eng
OAI Identifier:
oai:red.uao.edu.co:10614/16179
Online access:
https://hdl.handle.net/10614/16179
https://red.uao.edu.co/
Keywords:
Ingeniería Mecatrónica
Interfaces de teleoperación
Interacción humano-robot
Realidad virtual
Manipulador robótico
Usabilidad del sistema
Teleoperation Interfaces
Human Robot Interaction
Virtual Reality
Robotic Manipulators
System Usability
Rights
openAccess
License
Universidad Autónoma de Occidente, 2025
dc.title.spa.fl_str_mv Desarrollo de una interfaz de usuario para la teleoperación de un robot manipulador usando herramientas de realidad virtual en el laboratorio de robótica Nakama de la Universidad de Twente
dc.creator.fl_str_mv Rivera Arbeláez, Juan Pablo
dc.contributor.advisor.none.fl_str_mv Castillo García, Javier Ferney
dc.contributor.author.none.fl_str_mv Rivera Arbeláez, Juan Pablo
dc.contributor.corporatename.spa.fl_str_mv Universidad Autónoma de Occidente
dc.contributor.jury.none.fl_str_mv Llanos Neuta, Nicolas
dc.subject.proposal.spa.fl_str_mv Ingeniería Mecatrónica
Interfaces de teleoperación
Interacción humano-robot
Realidad virtual
Manipulador robótico
Usabilidad del sistema
dc.subject.proposal.eng.fl_str_mv Teleoperation Interfaces
Human Robot Interaction
Virtual Reality
Robotic Manipulators
System Usability
description This project presents the design, development, and evaluation of a virtual reality teleoperation system for controlling the Franka Research 3 robotic arm. To address the limitations of traditional 2D visualization interfaces, the proposed solution enables intuitive control through a distributed architecture that integrates the ROS2 middleware, the Unity engine, a ZED Mini stereo camera, and the Meta Quest 2 VR headset with Touch controllers. Following an iterative methodology grounded in the Rational Unified Process and concurrent design principles, the system was developed incrementally and tested with 14 participants. Core functionalities included real-time stereo image streaming and joystick-based manipulation. Performance metrics and user-centered evaluations, collected through the System Usability Scale (SUS), the NASA Task Load Index (NASA-TLX), the Virtual Reality Sickness Questionnaire (VRSQ), and the User Experience Questionnaire (UEQ), demonstrated the system’s overall learnability, usability, and operational feasibility. The SUS score of 61.25 suggests a usable system with room for improvement before reaching full user-friendliness. The NASA-TLX score of 32.44 reflects a low perceived workload, and the VRSQ score of 16.07 indicates a low incidence of cybersickness symptoms, supporting the system’s comfort during short-duration use. A clear improvement in task execution time and user confidence was observed across trials, confirming the interface's effectiveness despite minor limitations in synchronization and visual stability. The system architecture emphasizes modularity, scalability, and extensibility, making it suitable for advanced research and practical applications in teleoperation. Key contributions include the validation of a ROS2–Unity integration model for immersive control, a working force-feedback implementation, and insights into usability trade-offs in impedance-based motion control.
This work contributes to closing the gap in remote human–robot collaboration and lays a foundation for future innovations in teleoperation: digital twins, augmented reality, and adaptive haptic control.
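For context on how the questionnaire scores reported in the abstract are conventionally derived, the sketch below shows the standard SUS (Brooke) and weighted NASA-TLX formulas. This is illustrative only: the per-item responses are hypothetical and are not taken from the thesis data.

```python
def sus_score(responses):
    """System Usability Scale score (0-100) from ten Likert items
    rated 1-5. Odd items are positively worded (contribution =
    rating - 1); even items are negatively worded (contribution =
    5 - rating). The summed contributions are scaled by 2.5."""
    assert len(responses) == 10
    total = 0
    for i, rating in enumerate(responses, start=1):
        total += (rating - 1) if i % 2 == 1 else (5 - rating)
    return total * 2.5

def nasa_tlx(ratings, weights):
    """Weighted NASA-TLX workload (0-100): six subscale ratings
    (0-100) weighted by pairwise-comparison counts summing to 15."""
    assert len(ratings) == len(weights) == 6 and sum(weights) == 15
    return sum(r * w for r, w in zip(ratings, weights)) / 15

# Hypothetical single-participant responses:
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # → 75.0
```

A study-level SUS result such as the 61.25 reported above is simply the mean of the per-participant scores computed this way.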
publishDate 2025
dc.date.accessioned.none.fl_str_mv 2025-06-17T14:52:03Z
dc.date.available.none.fl_str_mv 2025-06-17T14:52:03Z
dc.date.issued.none.fl_str_mv 2025-06-03
dc.type.spa.fl_str_mv Trabajo de grado - Pregrado
dc.type.coarversion.fl_str_mv http://purl.org/coar/version/c_970fb48d4fbd8a85
dc.type.coar.eng.fl_str_mv http://purl.org/coar/resource_type/c_7a1f
dc.type.content.eng.fl_str_mv Text
dc.type.driver.eng.fl_str_mv info:eu-repo/semantics/bachelorThesis
dc.type.redcol.eng.fl_str_mv http://purl.org/redcol/resource_type/TP
dc.type.version.eng.fl_str_mv info:eu-repo/semantics/publishedVersion
dc.identifier.citation.spa.fl_str_mv Rivera Arbeláez, J. P. (2025). Desarrollo de una interfaz de usuario para la teleoperación de un robot manipulador usando herramientas de realidad virtual en el laboratorio de robótica Nakama de la Universidad de Twente (Pasantía de investigación). Universidad Autónoma de Occidente. Cali. Colombia. https://hdl.handle.net/10614/16179
dc.identifier.uri.none.fl_str_mv https://hdl.handle.net/10614/16179
dc.identifier.instname.spa.fl_str_mv Universidad Autónoma de Occidente
dc.identifier.reponame.spa.fl_str_mv Repositorio Educativo Digital UAO
dc.identifier.repourl.none.fl_str_mv https://red.uao.edu.co/
dc.language.iso.spa.fl_str_mv eng
dc.rights.spa.fl_str_mv Universidad Autónoma de Occidente, 2025
dc.rights.coar.fl_str_mv http://purl.org/coar/access_right/c_abf2
dc.rights.uri.eng.fl_str_mv https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.rights.accessrights.eng.fl_str_mv info:eu-repo/semantics/openAccess
dc.rights.creativecommons.spa.fl_str_mv Atribución-NoComercial-SinDerivadas 4.0 Internacional (CC BY-NC-ND 4.0)
dc.format.extent.spa.fl_str_mv 64 páginas
dc.format.mimetype.none.fl_str_mv application/pdf
dc.publisher.spa.fl_str_mv Universidad Autónoma de Occidente
dc.publisher.program.spa.fl_str_mv Ingeniería Mecatrónica
dc.publisher.faculty.spa.fl_str_mv Facultad de Ingeniería y Ciencias Básicas
dc.publisher.place.spa.fl_str_mv Cali
repository.name.fl_str_mv Repositorio Digital Universidad Autonoma de Occidente
repository.mail.fl_str_mv repositorio@uao.edu.co
Annexes (supplementary files):
ANEXO A. Systematic Literature Review
ANEXO B. RUP. Rational Unified Process Full Documentation
ANEXO C. Project GitHub Repository
ANEXO D. DockerHub ZED Image
DockerHub ZED Imageapplication/octet-stream67https://red.uao.edu.co/bitstreams/9fc25dac-0a7e-452e-8f69-43fa8be0b53f/downloade2917de5901352bb490601ba2fd9f565MD53T11420E_ANEXO E DockerHub Franka Image.urlT11420E_ANEXO E DockerHub Franka Image.urlANEXO E DockerHub Franka Imageapplication/octet-stream72https://red.uao.edu.co/bitstreams/9c38c0a7-3d48-40db-aae4-f36a10a08859/downloadc5dcacbfd7b6d3d039b0a226466c16bcMD54T11420F_ANNEX F. User Test Questionnaire.pdfT11420F_ANNEX F. User Test Questionnaire.pdfANNEX F User Test Questionnaireapplication/pdf854515https://red.uao.edu.co/bitstreams/1d73ac4b-2551-4a7a-b089-ca75d7b81968/download0143926f7c9b38a8cd3a5c8d32835c7dMD55T11420G_ANNEX G. Test Protocol.pdfT11420G_ANNEX G. Test Protocol.pdfANNEX G. Test Protocolapplication/pdf303051https://red.uao.edu.co/bitstreams/89d8bb85-e94d-47a9-8e69-731336c600e0/downloadd9e111ca253ed63a49872bc11b17210dMD56T11420H_ANNEX H. Informed Consent.pdfT11420H_ANNEX H. Informed Consent.pdfANEXO H. Informed Consentapplication/pdf197043https://red.uao.edu.co/bitstreams/1eda86cb-6e51-4377-9187-010fa9a44a9b/download9bedfecf7b540d75db48f29bd3502385MD57T11420I_ANEXO I. Results Data.zipT11420I_ANEXO I. Results Data.zipANEXO I. Results Data.zipapplication/zip187766https://red.uao.edu.co/bitstreams/702104b0-93bc-4d5c-a9dd-434a24d56913/downloadedfca0b38fd984b08856ee7dd6b3bdd3MD514T11420J_ANEXO J. Video Full System Working.mp4T11420J_ANEXO J. Video Full System Working.mp4ANEXO J. Video Full System Workingvideo/mp4102262466https://red.uao.edu.co/bitstreams/5ea93a4f-3254-4f04-832e-068798b43d56/download8c4ff23c5426b01a7261fcdd037060aaMD58T11420K_ANEXO K. Innovation Guide.zipT11420K_ANEXO K. Innovation Guide.zipANEXO K. Innovation Guideapplication/zip302452https://red.uao.edu.co/bitstreams/ab76d373-9314-4840-805c-03b9e5922015/downloada62cfc5bdd9d9098320477bb8d3a2d57MD515T11420L_ANEXO L. SP-Sophia-Labs GitHub Repository.urlT11420L_ANEXO L. SP-Sophia-Labs GitHub Repository.urlANEXO L. 
SP-Sophia-Labs GitHub Repositoryapplication/octet-stream155https://red.uao.edu.co/bitstreams/9839c0e6-d248-42cf-b41e-e2b46e958441/downloadad3189faace84fe8b2ec043e847d3f5aMD59T11420M_ANEXO M. Extend Robotics Webpage.urlT11420M_ANEXO M. Extend Robotics Webpage.urlANEXO M. Extend Robotics Webpageapplication/octet-stream55https://red.uao.edu.co/bitstreams/d822a641-8a11-450b-a859-6cd27a2f4532/downloade5e614e1c1f2b98afef00ee5571b5daeMD510T11420N_ANEXO N. Unity Robotics - ROS TCP Connector.urlT11420N_ANEXO N. Unity Robotics - ROS TCP Connector.urlANEXO N. Unity Robotics - ROS TCP Connectorapplication/octet-stream79https://red.uao.edu.co/bitstreams/ad54c85b-0dab-4f15-912d-57d4ef3962a2/download4a7704ca8df073e3d0aa710b401e933dMD511TA11420_Autorización trabajo de grado.pdfTA11420_Autorización trabajo de grado.pdfAutorización para publicación del trabajo de gradoapplication/pdf753045https://red.uao.edu.co/bitstreams/94bb5dc1-fdfd-409f-99a8-92f513e7c953/downloadae9912ec1717a1c3749dcfd3a3643796MD517LICENSElicense.txtlicense.txttext/plain; charset=utf-81672https://red.uao.edu.co/bitstreams/9d8e94b2-2c8f-416d-91c8-1dc09bed87bb/download6987b791264a2b5525252450f99b10d1MD513TEXTT11420_Desarrollo de una interfaz de usuario para la teleoperación de un robot manipulador usando herramientas de realidad virtual en el laboratorio de robótica Nakama de la Universidad de Twente.pdf.txtT11420_Desarrollo de una interfaz de usuario para la teleoperación de un robot manipulador usando herramientas de realidad virtual en el laboratorio de robótica Nakama de la Universidad de Twente.pdf.txtExtracted texttext/plain96072https://red.uao.edu.co/bitstreams/3a7fb950-4b19-4ba4-a876-32aca5e9dde2/download189ae7a21f770d5facbc7186ffc8d935MD518T11420F_ANNEX F. User Test Questionnaire.pdf.txtT11420F_ANNEX F. User Test Questionnaire.pdf.txtExtracted texttext/plain16https://red.uao.edu.co/bitstreams/e25d43f0-f7bd-4b19-8465-ea93a3bd182d/download24d68299ae13852334a122ea657bbaaaMD520T11420G_ANNEX G. 
Test Protocol.pdf.txtT11420G_ANNEX G. Test Protocol.pdf.txtExtracted texttext/plain7402https://red.uao.edu.co/bitstreams/ff90325d-df14-48c6-b6e7-3b1256fbc787/downloadd4db8392b77073594aaa0edaee1b31d2MD522T11420H_ANNEX H. Informed Consent.pdf.txtT11420H_ANNEX H. Informed Consent.pdf.txtExtracted texttext/plain5377https://red.uao.edu.co/bitstreams/cd467e6b-c875-4892-be09-fdcfc03e48ff/download8505994a5ef9174396ba65a2e700697dMD524TA11420_Autorización trabajo de grado.pdf.txtTA11420_Autorización trabajo de grado.pdf.txtExtracted texttext/plain5364https://red.uao.edu.co/bitstreams/eb1dca96-66d8-476f-bbdc-ef5929ac9797/download4eaf0bbddc8d84f3430d8de2447bc658MD526THUMBNAILT11420_Desarrollo de una interfaz de usuario para la teleoperación de un robot manipulador usando herramientas de realidad virtual en el laboratorio de robótica Nakama de la Universidad de Twente.pdf.jpgT11420_Desarrollo de una interfaz de usuario para la teleoperación de un robot manipulador usando herramientas de realidad virtual en el laboratorio de robótica Nakama de la Universidad de Twente.pdf.jpgGenerated Thumbnailimage/jpeg7996https://red.uao.edu.co/bitstreams/8f6bbfbe-6a08-4da1-b072-372d6a613cd6/download9113568f96d73b8b9425bb14b098db00MD519T11420F_ANNEX F. User Test Questionnaire.pdf.jpgT11420F_ANNEX F. User Test Questionnaire.pdf.jpgGenerated Thumbnailimage/jpeg6043https://red.uao.edu.co/bitstreams/5aec91de-9634-4171-9601-3ed5673e3245/downloadfcd8ac075e59bdc77a1ea27d2cd1767eMD521T11420G_ANNEX G. Test Protocol.pdf.jpgT11420G_ANNEX G. Test Protocol.pdf.jpgGenerated Thumbnailimage/jpeg12761https://red.uao.edu.co/bitstreams/06bfb751-8ad2-40a6-b5c3-3b0ad7c3ecdb/download07ca1afcce94059792c37140cfc7eeefMD523T11420H_ANNEX H. Informed Consent.pdf.jpgT11420H_ANNEX H. 
Informed Consent.pdf.jpgGenerated Thumbnailimage/jpeg14482https://red.uao.edu.co/bitstreams/ea6d591f-d461-43f7-b526-74571df93c5e/download4851c92072d435796d8de82f374ca23cMD525TA11420_Autorización trabajo de grado.pdf.jpgTA11420_Autorización trabajo de grado.pdf.jpgGenerated Thumbnailimage/jpeg13345https://red.uao.edu.co/bitstreams/a282dd34-c28a-4481-a549-dce3b71bd95a/downloadb9480f22b1be82aefdd180f6d599c1adMD52710614/16179oai:red.uao.edu.co:10614/161792025-06-21 03:02:23.643https://creativecommons.org/licenses/by-nc-nd/4.0/)-- Universidad Autónoma de Occidente, 2025open.accesshttps://red.uao.edu.coRepositorio Digital Universidad Autonoma de Occidenterepositorio@uao.edu.coPHA+RUwgQVVUT1IgYXV0b3JpemEgYSBsYSBVbml2ZXJzaWRhZCBBdXTDs25vbWEgZGUgT2NjaWRlbnRlLCBkZSBmb3JtYSBpbmRlZmluaWRhLCBwYXJhIHF1ZSBlbiBsb3MgdMOpcm1pbm9zIGVzdGFibGVjaWRvcyBlbiBsYSBMZXkgMjMgZGUgMTk4MiwgbGEgTGV5IDQ0IGRlIDE5OTMsIGxhIERlY2lzacOzbiBhbmRpbmEgMzUxIGRlIDE5OTMsIGVsIERlY3JldG8gNDYwIGRlIDE5OTUgeSBkZW3DoXMgbGV5ZXMgeSBqdXJpc3BydWRlbmNpYSB2aWdlbnRlIGFsIHJlc3BlY3RvLCBoYWdhIHB1YmxpY2FjacOzbiBkZSBlc3RlIGNvbiBmaW5lcyBlZHVjYXRpdm9zLiBQQVJBR1JBRk86IEVzdGEgYXV0b3JpemFjacOzbiBhZGVtw6FzIGRlIHNlciB2w6FsaWRhIHBhcmEgbGFzIGZhY3VsdGFkZXMgeSBkZXJlY2hvcyBkZSB1c28gc29icmUgbGEgb2JyYSBlbiBmb3JtYXRvIG8gc29wb3J0ZSBtYXRlcmlhbCwgdGFtYmnDqW4gcGFyYSBmb3JtYXRvIGRpZ2l0YWwsIGVsZWN0csOzbmljbywgdmlydHVhbCwgcGFyYSB1c29zIGVuIHJlZCwgSW50ZXJuZXQsIGV4dHJhbmV0LCBpbnRyYW5ldCwgYmlibGlvdGVjYSBkaWdpdGFsIHkgZGVtw6FzIHBhcmEgY3VhbHF1aWVyIGZvcm1hdG8gY29ub2NpZG8gbyBwb3IgY29ub2Nlci4gRUwgQVVUT1IsIGV4cHJlc2EgcXVlIGVsIGRvY3VtZW50byAodHJhYmFqbyBkZSBncmFkbywgcGFzYW50w61hLCBjYXNvcyBvIHRlc2lzKSBvYmpldG8gZGUgbGEgcHJlc2VudGUgYXV0b3JpemFjacOzbiBlcyBvcmlnaW5hbCB5IGxhIGVsYWJvcsOzIHNpbiBxdWVicmFudGFyIG5pIHN1cGxhbnRhciBsb3MgZGVyZWNob3MgZGUgYXV0b3IgZGUgdGVyY2Vyb3MsIHkgZGUgdGFsIGZvcm1hLCBlbCBkb2N1bWVudG8gKHRyYWJham8gZGUgZ3JhZG8sIHBhc2FudMOtYSwgY2Fzb3MgbyB0ZXNpcykgZXMgZGUgc3UgZXhjbHVzaXZhIGF1dG9yw61hIHkgdGllbmUgbGEgdGl0dWxhcmlkYWQgc29icmUgw6lzdGUuIFBBUkFHUkFGTzogZW4gY
2FzbyBkZSBwcmVzZW50YXJzZSBhbGd1bmEgcmVjbGFtYWNpw7NuIG8gYWNjacOzbiBwb3IgcGFydGUgZGUgdW4gdGVyY2VybywgcmVmZXJlbnRlIGEgbG9zIGRlcmVjaG9zIGRlIGF1dG9yIHNvYnJlIGVsIGRvY3VtZW50byAoVHJhYmFqbyBkZSBncmFkbywgUGFzYW50w61hLCBjYXNvcyBvIHRlc2lzKSBlbiBjdWVzdGnDs24sIEVMIEFVVE9SLCBhc3VtaXLDoSBsYSByZXNwb25zYWJpbGlkYWQgdG90YWwsIHkgc2FsZHLDoSBlbiBkZWZlbnNhIGRlIGxvcyBkZXJlY2hvcyBhcXXDrSBhdXRvcml6YWRvczsgcGFyYSB0b2RvcyBsb3MgZWZlY3RvcywgbGEgVW5pdmVyc2lkYWQgIEF1dMOzbm9tYSBkZSBPY2NpZGVudGUgYWN0w7phIGNvbW8gdW4gdGVyY2VybyBkZSBidWVuYSBmZS4gVG9kYSBwZXJzb25hIHF1ZSBjb25zdWx0ZSB5YSBzZWEgZW4gbGEgYmlibGlvdGVjYSBvIGVuIG1lZGlvIGVsZWN0csOzbmljbyBwb2Ryw6EgY29waWFyIGFwYXJ0ZXMgZGVsIHRleHRvIGNpdGFuZG8gc2llbXByZSBsYSBmdWVudGUsIGVzIGRlY2lyIGVsIHTDrXR1bG8gZGVsIHRyYWJham8geSBlbCBhdXRvci4gRXN0YSBhdXRvcml6YWNpw7NuIG5vIGltcGxpY2EgcmVudW5jaWEgYSBsYSBmYWN1bHRhZCBxdWUgdGllbmUgRUwgQVVUT1IgZGUgcHVibGljYXIgdG90YWwgbyBwYXJjaWFsbWVudGUgbGEgb2JyYS48L3A+Cg==