Facial emotion recognition through artificial intelligence

This paper introduces a study employing artificial intelligence (AI) to utilize computer vision algorithms for detecting human emotions in video content during user interactions with diverse visual stimuli. The research aims to unveil the creation of software capable of emotion detection by leveraging AI algorithms and image processing pipelines to identify users’ facial expressions. The process involves assessing users through images and facilitating the implementation of computer vision algorithms aligned with psychological theories defining emotions and their recognizable features. The study demonstrates the feasibility of emotion recognition through convolutional neural networks (CNN) and software development and training based on facial expressions. The results highlight successful emotion identification; however, precision improvement necessitates further training for contexts with more diverse images and additional algorithms to distinguish closely related emotional patterns. The discussion and conclusions emphasize the potential of A.I. and computer vision algorithms in emotion detection, providing insights into software development, ongoing training, and the evolving landscape of emotion recognition technology. Further training is necessary for contexts with more diverse images, alongside additional algorithms that can effectively distinguish between facial expressions depicting closely related emotional patterns, enhancing certainty and accuracy.
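
The approach the abstract outlines, detecting a face, cropping it, and classifying the expression with a convolutional neural network trained on a dataset such as FER-2013 (cited in the references below), can be sketched in a few lines of Python. The snippet is a minimal illustration only, not the architecture reported in the paper: the TensorFlow/Keras layers, the FER-2013 label names, and the random array standing in for a detected face crop are assumptions made for the sketch.

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# FER-2013 label set, assumed here for illustration.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def build_emotion_cnn(num_classes: int = len(EMOTIONS)) -> tf.keras.Model:
    # Small CNN for 48x48 grayscale face crops; depth and filter counts are illustrative.
    return tf.keras.Sequential([
        layers.Input(shape=(48, 48, 1)),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.GlobalAveragePooling2D(),
        layers.Dropout(0.3),
        layers.Dense(num_classes, activation="softmax"),
    ])

model = build_emotion_cnn()
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

# In a full pipeline the crop would come from a face detector such as the MTCNN
# pip package cited in the references; a random array stands in here so the
# sketch runs without any image files (the untrained output is meaningless).
face_crop = np.random.rand(1, 48, 48, 1).astype("float32")
probabilities = model.predict(face_crop, verbose=0)[0]
print(EMOTIONS[int(np.argmax(probabilities))])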

Authors:
Peláez Ayala, Carlos Alberto
Solano Alegría, Andrés Fernando
Ballesteros, Jesús A.
Ramírez V., Gabriel M.
Moreira, Fernando
Resource type:
Research article
Publication date:
2024
Institution:
Universidad Autónoma de Occidente
Repository:
RED: Repositorio Educativo Digital UAO
Language:
eng
OAI Identifier:
oai:red.uao.edu.co:10614/16233
Online access:
https://hdl.handle.net/10614/16233
https://red.uao.edu.co/
Keywords:
Facial emotion
Recognition
A.I.
Convolutional neural network
Images
Rights
openAccess
License
All rights reserved - Frontiers Media S.A., 2024
dc.title.eng.fl_str_mv Facial emotion recognition through artificial intelligence
dc.title.alternative.spa.fl_str_mv Reconocimiento de emociones faciales mediante inteligencia artificial
dc.creator.fl_str_mv Peláez Ayala, Carlos Alberto
Solano Alegría, Andrés Fernando
Ballesteros, Jesús A.
Ramírez V., Gabriel M.
Moreira, Fernando
dc.contributor.author.none.fl_str_mv Peláez Ayala, Carlos Alberto
Solano Alegría, Andrés Fernando
Ballesteros, Jesús A.
Ramírez V., Gabriel M.
Moreira, Fernando
dc.subject.proposal.eng.fl_str_mv Facial emotion
Recognition
A.I.
Convolutional neural network
Images
description This paper introduces a study employing artificial intelligence (AI) to utilize computer vision algorithms for detecting human emotions in video content during user interactions with diverse visual stimuli. The research aims to unveil the creation of software capable of emotion detection by leveraging AI algorithms and image processing pipelines to identify users’ facial expressions. The process involves assessing users through images and facilitating the implementation of computer vision algorithms aligned with psychological theories defining emotions and their recognizable features. The study demonstrates the feasibility of emotion recognition through convolutional neural networks (CNN) and software development and training based on facial expressions. The results highlight successful emotion identification; however, precision improvement necessitates further training for contexts with more diverse images and additional algorithms to distinguish closely related emotional patterns. The discussion and conclusions emphasize the potential of A.I. and computer vision algorithms in emotion detection, providing insights into software development, ongoing training, and the evolving landscape of emotion recognition technology. Further training is necessary for contexts with more diverse images, alongside additional algorithms that can effectively distinguish between facial expressions depicting closely related emotional patterns, enhancing certainty and accuracy.
dc.date.issued.none.fl_str_mv 2024
dc.date.accessioned.none.fl_str_mv 2025-07-29T14:03:44Z
dc.date.available.none.fl_str_mv 2025-07-29T14:03:44Z
dc.type.spa.fl_str_mv Artículo de revista
dc.type.coarversion.fl_str_mv http://purl.org/coar/version/c_970fb48d4fbd8a85
dc.type.coar.eng.fl_str_mv http://purl.org/coar/resource_type/c_2df8fbb1
dc.type.content.eng.fl_str_mv Text
dc.type.driver.eng.fl_str_mv info:eu-repo/semantics/article
dc.type.redcol.eng.fl_str_mv http://purl.org/redcol/resource_type/ART
dc.type.version.eng.fl_str_mv info:eu-repo/semantics/publishedVersion
dc.identifier.citation.eng.fl_str_mv Ballesteros, J. A.; Ramírez V., G. M.; Moreira, F.; Peláez Ayala, C. A. y Solano Alegría, A. F. (2024). Facial emotion recognition through artificial intelligence. Frontiers in Computer Science. Vol. 6. p.p. 1-14. DOI 10.3389/fcomp.2024.1359471
dc.identifier.issn.spa.fl_str_mv 26249898
dc.identifier.uri.none.fl_str_mv https://hdl.handle.net/10614/16233
dc.identifier.doi.spa.fl_str_mv DOI 10.3389/fcomp.2024.1359471
dc.identifier.instname.spa.fl_str_mv Universidad Autónoma de Occidente
dc.identifier.reponame.spa.fl_str_mv Repositorio Educativo Digital UAO
dc.identifier.repourl.none.fl_str_mv https://red.uao.edu.co/
dc.language.iso.eng.fl_str_mv eng
dc.relation.citationendpage.spa.fl_str_mv 14
dc.relation.citationstartpage.spa.fl_str_mv 1
dc.relation.citationvolume.spa.fl_str_mv 6
dc.relation.ispartofjournal.eng.fl_str_mv Frontiers in Computer Science
dc.relation.references.none.fl_str_mv Albaladejo, X., Díaz, J. R., Quesada, A. X., & Iglesias, J. (2021). Proyectosagiles.org. https://proyectosagiles.org/pm-partners
Banafa, A. (2016). ¿Qué es la computación afectiva? OpenMind BBVA. https://www.bbvaopenmind.com/tecnologia/mundo-digital/que-es-la-computacion-afectiva/
Bledsoe, W. W. (1966). Man-machine facial recognition: Report on a large-scale experiment (Technical Report PRI 22). Panoramic Research.
Centeno, I. D. P. (2021). MTCNN face detection implementation for TensorFlow, as a pip package. https://github.com/ipazc/mtcnn
Chollet, F. (2017). Xception: Deep learning with depthwise separable convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 1251–1258). https://doi.org/10.1109/CVPR.2017.195
Darwin, C., & Prodger, P. (1996). The expression of the emotions in man and animals. Oxford University Press.
Ekman, P. (1994). Strong evidence for universals in facial expressions: A reply to Russell’s mistaken critique. Psychological Bulletin, 115(2), 268–287. https://doi.org/10.1037/0033-2909.115.2.268
Ekman, P. (1999). Basic emotions. In Handbook of Cognition and Emotion (pp. 45–60). https://doi.org/10.1002/0470013494.ch3
Ekman, P., Sorenson, E., & Friesen, W. (1969). Pan-cultural elements in facial displays of emotion. Science, 164(3875), 86–88. https://doi.org/10.1126/science.164.3875.86
Frijda, N. H. (2017). The laws of emotion. Psychology Press.
García, A. R. (2013). La educación emocional, el autoconcepto, la autoestima y su importancia en la infancia. Estudios y propuestas socioeducativas, 44, 241–257.
Ghotbi, N. (2023). The ethics of emotional artificial intelligence: A mixed method analysis. Asian Bioethics Review, 15, 417–430. https://doi.org/10.1007/s41649-022-00237-y
Hernández Sampieri, R., Fernández, C., & Baptista, L. C. (2003). Metodología de la investigación. McGraw Hill.
Kaggle. (2019). FER-2013. https://www.kaggle.com/
Krizhevsky, A., Sutskever, I., & Hinton, G. E. (2017). ImageNet classification with deep convolutional neural networks. Communications of the ACM, 60(6), 84–90. https://doi.org/10.1145/3065386
Lee, Y. S., & Park, W. H. (2022). Diagnosis of depressive disorder model on facial expression based on fast R-CNN. Diagnostics, 12(2), 317. https://doi.org/10.3390/diagnostics12020317
Lu, X. (2022). Deep learning based emotion recognition and visualization of figural representation. Frontiers in Psychology, 12, 818833. https://doi.org/10.3389/fpsyg.2021.818833
Mathworks. (2023). Integral image. https://www.mathworks.com/help/images/integral-image.html
Monteith, S., Glenn, T., Geddes, J., Whybrow, P. C., & Bauer, M. (2022). Commercial use of emotion artificial intelligence (AI): Implications for psychiatry. Current Psychiatry Reports, 24, 203–211. https://doi.org/10.1007/s11920-022-01330-7
Plutchik, R. (2001). The nature of emotions. American Scientist, 89(4), 334–350. https://doi.org/10.1511/2001.28.334
Plutchik, R. E., & Conte, H. R. (1997). Circumplex models of personality and emotions. American Psychological Association.
Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39(6), 1161–1178. https://doi.org/10.1037/h0077714
Russell, J. A. (1997). Reading emotions from and into faces: Resurrecting a dimensional-contextual perspective. In J. A. Russell & J. M. Fernández-Dols (Eds.), The psychology of facial expression (pp. 295–320). Cambridge University Press.
Salovey, P., & Mayer, J. (1990). Emotional intelligence. Imagination, Cognition and Personality, 9(3), 185–211. https://doi.org/10.2190/DUGG-P24E-52WK-6CDG
Sambare, M. (2023). FER-2013: Learn facial expressions from an image. Kaggle. https://www.kaggle.com/datasets/msambare/fer2013
Schapire, R. E. (2013). Explaining AdaBoost. In Empirical inference: Festschrift in honor of Vladimir N. Vapnik (pp. 37–52). Springer. https://doi.org/10.1007/978-3-642-41136-6_5
Simonyan, K., & Zisserman, A. (2014). Very deep convolutional networks for large-scale image recognition. arXiv preprint, arXiv:1409.1556. https://doi.org/10.48550/arXiv.1409.1556
Sotil, D. A. (2022). RPubs. https://rpubs.com/
Tanabe, H., Shiraishi, T., Sato, H., Nihei, M., Inoue, T., & Kuwabara, C. (2023). A concept for emotion recognition systems for children with profound intellectual and multiple disabilities based on artificial intelligence using physiological and motion signals. Disability and Rehabilitation: Assistive Technology, 1–8. https://doi.org/10.1080/17483107.2023.2170478
Thomas, J. R., Nelson, J. K., & Silverman, J. (2005). Research methods in physical activity (5th ed.). Human Kinetics.
Wang, Y. Q. (2014). An analysis of the Viola-Jones face detection algorithm. Image Processing On Line, 4, 128–148. https://doi.org/10.5201/ipol.2014.104
Zhang, K., Zhang, Z., Li, Z., & Qiao, Y. (2016). Joint face detection and alignment using multitask cascaded convolutional networks. IEEE Signal Processing Letters, 23(10), 1499–1503. https://doi.org/10.1109/LSP.2016.2603342
Zhao, J., Wu, M., Zhou, L., Wang, X., & Jia, J. (2022). Cognitive psychology-based artificial intelligence review. Frontiers in Neuroscience, 16, 1024316. https://doi.org/10.3389/fnins.2022.1024316
dc.rights.spa.fl_str_mv Derechos reservados - Frontiers Media S.A., 2024
dc.rights.coar.fl_str_mv http://purl.org/coar/access_right/c_abf2
dc.rights.uri.eng.fl_str_mv https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.rights.accessrights.eng.fl_str_mv info:eu-repo/semantics/openAccess
dc.rights.creativecommons.spa.fl_str_mv Atribución-NoComercial-SinDerivadas 4.0 Internacional (CC BY-NC-ND 4.0)
dc.format.extent.spa.fl_str_mv 14 páginas
dc.format.mimetype.none.fl_str_mv application/pdf
dc.publisher.eng.fl_str_mv Frontiers Media SA
dc.publisher.place.spa.fl_str_mv Suiza
institution Universidad Autónoma de Occidente
bitstream.url.fl_str_mv https://red.uao.edu.co/bitstreams/6e4fa4b9-0b24-4df8-952f-02600ddadb50/download
https://red.uao.edu.co/bitstreams/39b4c299-52cd-4dd5-a7e5-3f158354d355/download
https://red.uao.edu.co/bitstreams/4edf9f2a-9d3e-4b2c-a642-5669736382e0/download
https://red.uao.edu.co/bitstreams/68626c1a-627c-4963-b315-f0a2d88883b2/download
bitstream.checksum.fl_str_mv 2238c388011dc8f927439bfbf9617801
6987b791264a2b5525252450f99b10d1
00e7160614ec050fa84fbd9f378d424c
ef700c2cb1e3e122c1f08eef54152741
bitstream.checksumAlgorithm.fl_str_mv MD5
MD5
MD5
MD5
repository.name.fl_str_mv Repositorio Digital Universidad Autónoma de Occidente
repository.mail.fl_str_mv repositorio@uao.edu.co