Data-set class-balancing and the convolutional vision transformer: An analysis of chest radiographs for SARS-CoV-2 detection

Introduction: The COVID-19 pandemic has impacted global public health and economies. Chest X-rays (CXRs) play a critical role in screening, especially in resource-constrained regions. This work focuses on the development of a deep learning model...

Full description

Authors:
Escobar Ortiz, Andres F.
Pardo-Cabrera, Josh
Hurtado López, Julián
Ramírez Moreno, David Fernando
Amezquita-Dussan, María A.
Galindo-Sánchez, Juan S.
Sua-Villegas, Luz F.
Fernández-Trujillo, Liliana
Resource type:
Research article
Publication date:
2024
Institution:
Universidad Autónoma de Occidente
Repository:
RED: Repositorio Educativo Digital UAO
Language:
eng
OAI Identifier:
oai:red.uao.edu.co:10614/16207
Online access:
https://hdl.handle.net/10614/16207
https://doi.org/10.1016/j.bspc.2024.106190
https://red.uao.edu.co/
Keywords:
COVID-19
Diagnóstico por imágenes
Radiografía
Tórax
Redes neuronales
Computadora
Aprendizaje profundo
Diagnostic imaging
Radiography
Thorax
Neural networks
Computer
Rights
openAccess
License
Derechos reservados - Elsevier, 2024
id REPOUAO2_a1b6585d89d99ad4322a5d2325619841
oai_identifier_str oai:red.uao.edu.co:10614/16207
network_acronym_str REPOUAO2
network_name_str RED: Repositorio Educativo Digital UAO
repository_id_str
dc.title.eng.fl_str_mv Data-set class-balancing and the convolutional vision transformer: An analysis of chest radiographs for SARS-CoV-2 detection
dc.title.translated.spa.fl_str_mv Balance de clases de conjuntos de datos y el transformador de visión convolucional: un análisis de radiografías de tórax para la detección del SARS-CoV-2
title Data-set class-balancing and the convolutional vision transformer: An analysis of chest radiographs for SARS-CoV-2 detection
spellingShingle Data-set class-balancing and the convolutional vision transformer: An analysis of chest radiographs for SARS-CoV-2 detection
COVID-19
Diagnóstico por imágenes
Radiografía
Tórax
Redes neuronales
Computadora
Aprendizaje profundo
Diagnostic imaging
Radiography
Thorax
Neural networks
Computer
title_short Data-set class-balancing and the convolutional vision transformer: An analysis of chest radiographs for SARS-CoV-2 detection
title_full Data-set class-balancing and the convolutional vision transformer: An analysis of chest radiographs for SARS-CoV-2 detection
title_fullStr Data-set class-balancing and the convolutional vision transformer: An analysis of chest radiographs for SARS-CoV-2 detection
title_full_unstemmed Data-set class-balancing and the convolutional vision transformer: An analysis of chest radiographs for SARS-CoV-2 detection
title_sort Data-set class-balancing and the convolutional vision transformer: An analysis of chest radiographs for SARS-CoV-2 detection
dc.creator.fl_str_mv Escobar Ortiz, Andres F.
Pardo-Cabrera, Josh
Hurtado López, Julián
Ramírez Moreno, David Fernando
Amezquita-Dussan, María A.
Galindo-Sánchez, Juan S.
Sua-Villegas, Luz F.
Fernández-Trujillo, Liliana
dc.contributor.author.none.fl_str_mv Escobar Ortiz, Andres F.
Pardo-Cabrera, Josh
Hurtado López, Julián
Ramírez Moreno, David Fernando
Amezquita-Dussan, María A.
Galindo-Sánchez, Juan S.
Sua-Villegas, Luz F.
Fernández-Trujillo, Liliana
dc.subject.proposal.spa.fl_str_mv COVID-19
Diagnóstico por imágenes
Radiografía
Tórax
Redes neuronales
Computadora
Aprendizaje profundo
topic COVID-19
Diagnóstico por imágenes
Radiografía
Tórax
Redes neuronales
Computadora
Aprendizaje profundo
Diagnostic imaging
Radiography
Thorax
Neural networks
Computer
dc.subject.proposal.eng.fl_str_mv Diagnostic imaging
Radiography
Thorax
Neural networks
Computer
description Introduction: The COVID-19 pandemic has had an impact on global public health and economies. Chest X-rays (CXRs) play a critical role in screening, especially in resource-constrained regions. This work focuses on the development of a deep learning model for detecting COVID-19 in CXR images, aiming to address the challenges associated with manual interpretation. Methods: Our approach integrates Convolutional Vision Transformers with traditional Convolutional Neural Networks for CXR analysis. Using a dataset comprising 5572 CXRs, including various COVID-19 severities and other pneumonia types, the model underwent training across multiple configurations and was evaluated using various metrics. Each Dataset Configuration (DSC) involved 10 training iterations. We employed an enhanced attention visualization technique to identify key areas in X-rays. Results: This work presents the top five models based on training and validation outcomes across different radiographic classes. The model's performance varied with dataset configuration, particularly in distinguishing COVID-19 from other pneumonia types. DSC 6 emerged as the most effective, demonstrating 97.1% sensitivity and a 2.9% false-negative rate during training, and maintaining high performance during validation with 87.2% sensitivity and a 12.8% false-negative rate. Conclusion: This fusion of engineering and medical expertise yields an efficient screening tool. The COVID-19 crisis underscores the importance of enhanced healthcare preparedness. Our proposed model achieves over 90% average recall, accuracy, and sensitivity, coupled with a low false-negative rate. It reliably identifies COVID-19 across different severities and effectively distinguishes it from other pneumonia types, establishing its utility as a robust COVID-19 detection tool.
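The abstract reports per-class sensitivity and false-negative rates measured across class-balanced dataset configurations. As a minimal illustrative sketch (not the authors' published code), the Python snippet below shows one common way to build a class-balanced subset by undersampling every class to the size of the smallest one, and how sensitivity and false-negative rate for the COVID-19 class could be computed from predictions; the label names and counts are hypothetical.

    import random
    from collections import Counter, defaultdict

    def balance_by_undersampling(labels, seed=0):
        """Return indices of a class-balanced subset.

        Each class is randomly undersampled to the size of the smallest
        class; this is one simple way to form a balanced dataset
        configuration (the paper's exact DSCs may be built differently).
        """
        rng = random.Random(seed)
        by_class = defaultdict(list)
        for idx, lab in enumerate(labels):
            by_class[lab].append(idx)
        target = min(len(v) for v in by_class.values())
        picked = []
        for idxs in by_class.values():
            picked.extend(rng.sample(idxs, target))
        rng.shuffle(picked)
        return picked

    def sensitivity_and_fnr(y_true, y_pred, positive="COVID-19"):
        """Sensitivity (recall on the positive class) and false-negative rate."""
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
        sens = tp / (tp + fn) if (tp + fn) else float("nan")
        return sens, 1.0 - sens

    # Hypothetical labels mimicking an imbalanced chest X-ray data set.
    labels = ["COVID-19"] * 1200 + ["other pneumonia"] * 2800 + ["normal"] * 1572
    subset = balance_by_undersampling(labels)
    print(Counter(labels[i] for i in subset))   # every class reduced to 1200

    y_true = ["COVID-19", "COVID-19", "normal", "other pneumonia"]
    y_pred = ["COVID-19", "normal", "normal", "other pneumonia"]
    print(sensitivity_and_fnr(y_true, y_pred))  # (0.5, 0.5)

This sketch only illustrates the balancing step and the metrics the evaluation relies on; in the paper each Dataset Configuration is additionally used for 10 training iterations of the combined Convolutional Vision Transformer and CNN model.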
publishDate 2024
dc.date.issued.none.fl_str_mv 2024
dc.date.accessioned.none.fl_str_mv 2025-07-08T17:16:01Z
dc.date.available.none.fl_str_mv 2025-07-08T17:16:01Z
dc.type.spa.fl_str_mv Artículo de revista
dc.type.coarversion.fl_str_mv http://purl.org/coar/version/c_970fb48d4fbd8a85
dc.type.coar.eng.fl_str_mv http://purl.org/coar/resource_type/c_2df8fbb1
dc.type.content.eng.fl_str_mv Text
dc.type.driver.eng.fl_str_mv info:eu-repo/semantics/article
dc.type.redcol.eng.fl_str_mv http://purl.org/redcol/resource_type/ART
dc.type.version.eng.fl_str_mv info:eu-repo/semantics/publishedVersion
format http://purl.org/coar/resource_type/c_2df8fbb1
status_str publishedVersion
dc.identifier.citation.eng.fl_str_mv Escobar Ortiz, A. F.; Pardo-Cabrera, J.; Hurtado López, J.; Ramirez-Moreno, D. F.; Amezquita-Dussan, M. A.; Galindo-Sánchez, J. S.; Sua-Villegas, L. F. y Fernández-Trujillo, L. (2024). Data-set class-balancing and the convolutional vision transformer: An analysis of chest radiographs for SARS-CoV-2 detection. Biomedical Signal Processing and Control. Vol. 93. p.p. 1-13. https://doi.org/10.1016/j.bspc.2024.106190
dc.identifier.uri.none.fl_str_mv https://hdl.handle.net/10614/16207
dc.identifier.doi.spa.fl_str_mv https://doi.org/10.1016/j.bspc.2024.106190
dc.identifier.eissn.spa.fl_str_mv 17468108
dc.identifier.instname.spa.fl_str_mv Universidad Autónoma de Occidente
dc.identifier.reponame.spa.fl_str_mv Repositorio Educativo Digital UAO
dc.identifier.repourl.none.fl_str_mv https://red.uao.edu.co/
identifier_str_mv Escobar Ortiz, A. F.; Pardo-Cabrera, J.; Hurtado López, J.; Ramirez-Moreno, D. F.; Amezquita-Dussan, M. A.; Galindo-Sánchez, J. S.; Sua-Villegas, L. F. y Fernández-Trujillo, L. (2024). Data-set class-balancing and the convolutional vision transformer: An analysis of chest radiographs for SARS-CoV-2 detection. Biomedical Signal Processing and Control. Vol. 93. p.p. 1-13. https://doi.org/10.1016/j.bspc.2024.106190
17468108
Universidad Autónoma de Occidente
Repositorio Educativo Digital UAO
url https://hdl.handle.net/10614/16207
https://doi.org/10.1016/j.bspc.2024.106190
https://red.uao.edu.co/
dc.language.iso.eng.fl_str_mv eng
language eng
dc.relation.citationendpage.spa.fl_str_mv 13
dc.relation.citationstartpage.spa.fl_str_mv 1
dc.relation.citationvolume.spa.fl_str_mv 93
dc.relation.ispartofjournal.eng.fl_str_mv Biomedical Signal Processing and Control
dc.relation.references.none.fl_str_mv [1] S. Zhao, Q. Lin, J. Ran, S.S. Musa, G. Yang, W. Wang, Y. Lou, D. Gao, L. Yang, D. He, M.H. Wang, Preliminary estimation of the basic reproduction number of novel coronavirus (2019-nCoV) in China, from 2019 to 2020: A data-driven analysis in the early phase of the outbreak, Int. J. Infect. Dis. 92 (2020) 214–217.
[2] K. Cao, T. Deng, C. Zhang, L. Lu, L. Li, A CNN-transformer fusion network for COVID-19 CXR image classification, PLoS ONE 17 (10) (2022) 1–16.
[3] J. Bonet-Morón, D. Ricciulli-Marín, G.J. Pérez-Valbuena, L.A. Galvis-Aponte, E.A. Haddad, I.F. Araújo, F.S. Perobelli, Regional economic impact of COVID-19 in Colombia: An input–output approach, Reg. Sci. Policy Pract. 12 (6) (2020) 1123–1150.
[4] WHO, WHO coronavirus (COVID-19) dashboard, 2022, https://covid19.who.int [Internet]. [Retrieved November 18, 2022].
[5] INS, Coronavirus Colombia, 2022, https://www.ins.gov.co/Noticias/paginas/coronavirus.aspx. [Internet]. [Retrieved November 21, 2022].
[6] A.I. Vecino-Ortiz, J. Villanueva Congote, S. Zapata Bedoya, Z.M. Cucunuba, Impact of contact tracing on COVID-19 mortality: An impact evaluation using surveillance data from Colombia, PLoS ONE 16 (3) (2021) 1–12.
[7] D. Fisher, A. Wilder-Smith, The global community needs to swiftly ramp up the response to contain COVID-19, Lancet 395 (10230) (2020) 1109–1110.
[8] J. Sánchez-Duque, L. Arce-Villalobos, A. Rodríguez-Morales, Enfermedad por coronavirus 2019 (COVID-19) en américa latina: papel de la atención primaria en la preparación y respuesta, Aten. Primaria 52 (6) (2020) 369–372.
[9] M. Cascella, M. Rajnik, A. Aleem, S.C. Dulebohn, R. Di Napoli, Features, evaluation, and treatment of coronavirus (COVID-19), Statpearls (2022) [internet].
[10] COVID-19 Treatment Guidelines, Clinical spectrum, 2022, https://www.covid19treatmentguidelines.nih.gov/overview/clinical-spectrum/. [Retrieved November 22, 2022].
[11] Z. Wu, J.M. McGoogan, Characteristics of and important lessons from the coronavirus disease 2019 (COVID-19) outbreak in China: Summary of a report of 72314 cases from the Chinese center for disease control and prevention, JAMA 323 (13) (2020) 1239–1242.
[12] N. Chen, M. Zhou, X. Dong, J. Qu, F. Gong, Y. Han, Y. Qiu, J. Wang, Y. Liu, Y. Wei, J. Xia, T. Yu, X. Zhang, L. Zhang, Epidemiological and clinical characteristics of 99 cases of 2019 novel coronavirus pneumonia in wuhan, China: a descriptive study, Lancet 395 (10223) (2020) 507–513.
[13] C. Huang, Y. Wang, X. Li, L. Ren, J. Zhao, Y. Hu, L. Zhang, G. Fan, J. Xu, X. Gu, Z. Cheng, T. Yu, J. Xia, Y. Wei, W. Wu, X. Xie, W. Yin, H. Li, M. Liu, Y. Xiao, H. Gao, L. Guo, J. Xie, G. Wang, R. Jiang, Z. Gao, Q. Jin, J. Wang, B. Cao, Clinical features of patients infected with 2019 novel coronavirus in wuhan, China, Lancet 395 (10223) (2020) 497–506.
[14] T. Struyf, J. Deeks, J. Dinnes, Y. Takwoingi, C. Davenport, M. Leeflang, R. Spijker, L. Hooft, D. Emperador, S. Dittrich, J. Domen, S.A. Horn, A. Van den Bruel, Signs and symptoms to determine if a patient presenting in primary care or hospital outpatient settings has COVID-19 disease, Cochrane Database Syst. Rev. (7) (2020).
[15] S. Asif, Y. Wenhui, K. Amjad, H. Jin, Y. Tao, S. Jinhai, Detection of COVID-19 from chest X-ray images: Boosting the performance with convolutional neural network and transfer learning, Expert Syst. 39 (7) (2022) e13099.
[16] M. Alhasan, M. Hasaneen, Digital imaging, technologies and artificial intelligence applications during COVID-19 pandemic, Comput. Med. Imaging Graph.: Off. J. Comput. Med. Imaging Soc. 91 (2021) 101933.
[17] J.P. Kanne, B.P. Little, J.H. Chung, B.M. Elicker, L.H. Ketai, Essentials for radiologists on COVID-19: An update—Radiology scientific expert panel, Radiology 296 (2) (2020) E113–E114.
[18] Cochrane, How accurate is chest imaging for diagnosing COVID-19?, 2022, https://www.cochrane.org/CD013639/INFECTN_how-accurate-chest-imaging-diagnosing-covid-19. [Internet]. [Accessed: 28-Nov-2022].
[19] X. Xie, Z. Zhong, W. Zhao, C. Zheng, F. Wang, J. Liu, Chest CT for typical coronavirus disease 2019 (COVID-19) pneumonia: Relationship to negative RT-PCR testing, Radiology 296 (2) (2020) E41–E45.
[20] L. Wang, Z.Q. Lin, A. Wong, COVID-net: a tailored deep convolutional neural network design for detection of COVID-19 cases from chest X-ray images, Sci. Rep. 10 (1) (2020) 19549.
[21] H.Y.F. Wong, H.Y.S. Lam, A.H.-T. Fong, S.T. Leung, T.W.-Y. Chin, C.S.Y. Lo, M.M.-S. Lui, J.C.Y. Lee, K.W.-H. Chiu, T.W.-H. Chung, E.Y.P. Lee, E.Y.F. Wan, I.F.N. Hung, T.P.W. Lam, M.D. Kuo, M.-Y. Ng, Frequency and distribution of chest radiographic findings in patients positive for COVID-19, Radiology 296 (2) (2020) E72–E78, PMID: 32216717.
[22] G.D. Rubin, C.J. Ryerson, L.B. Haramati, N. Sverzellati, J.P. Kanne, S. Raoof, N.W. Schluger, A. Volpi, J.-J. Yim, I.B. Martin, D.J. Anderson, C. Kong, T. Altes, A. Bush, S.R. Desai, J. Goldin, J.M. Goo, M. Humbert, Y. Inoue, H.-U. Kauczor, F. Luo, P.J. Mazzone, M. Prokop, M. Remy-Jardin, L. Richeldi, C.M. Schaefer- Prokop, N. Tomiyama, A.U. Wells, A.N. Leung, The role of chest imaging in patient management during the COVID-19 pandemic: A multinational consensus statement from the fleischner society, Chest 158 (1) (2020) 106–116.
[23] S. Stephanie, T. Shum, H. Cleveland, S.R. Challa, A. Herring, F.L. Jacobson, H. Hatabu, S.C. Byrne, K. Shashi, T. Araki, J.A. Hernandez, C.S. White, R. Hossain, A.R. Hunsaker, M.M. Hammer, Determinants of chest radiography sensitivity for COVID-19: A multi-institutional study in the United States, Radiol.: Cardiothorac. Imaging 2 (5) (2020) e200337.
[24] S. Wang, B. Kang, J. Ma, X. Zeng, M. Xiao, J. Guo, M. Cai, J. Yang, Y. Li, X. Meng, B. Xu, A deep learning algorithm using CT images to screen for corona virus disease (COVID-19), Eur. Radiol. 31 (8) (2021) 6096–6104.
[25] A. Alqahtani, M.M. Zahoor, R. Nasrullah, A. Fareed, A.A. Cheema, A. Shahrose, M. Irfan, A. Alqhatani, A.A. Alsulami, M. Zaffar, S. Rahman, Computer aided COVID-19 diagnosis in pandemic era using CNN in chest X-ray images, Life 12 (11) (2022).
[26] V. Gupta, N. Jain, J. Sachdeva, M. Gupta, S. Mohan, M.Y. Bajuri, A. Ahmadian, Improved COVID-19 detection with chest x-ray images using deep learning, Multimed. Tools Appl. 81 (26) (2022) 37657–37680.
[27] A. Parvaiz, M.A. Khalid, R. Zafar, H. Ameer, M. Ali, M.M. Fraz, Vision transformers in medical computer vision – a contemplative retrospection, 2022, arXiv abs/2203.15269.
[28] B. Pardamean, T.W. Cenggoro, R. Rahutomo, A. Budiarto, E.K. Karuppiah, Transfer learning from chest X-Ray pre-trained convolutional neural network for learning mammogram data, Procedia Comput. Sci. 135 (2018) 400–407.
[29] Y. Zhang, L. Deng, H. Zhu, W. Wang, Z. Ren, Q. Zhou, S. Lu, S. Sun, Z. Zhu, J.M. Gorriz, S. Wang, Deep learning in food category recognition, Inf. Fusion 98 (2023) 101859, Under a Creative Commons license.
[30] S.-Y. Lu, D.R. Nayak, S.-H. Wang, Y.-D. Zhang, A cerebral microbleed diagnosis method via FeatureNet and ensembled randomized neural networks, Appl. Soft Comput. 109 (2021) 107567, Available online.
[31] S. Lu, S.-H. Wang, Y.-D. Zhang, Detection of abnormal brain in MRI via improved AlexNet and ELM optimized by chaotic bat algorithm, Neural Comput. Appl. 33 (2021) 10799–10811, Published online 13 June 2020.
[32] H. Wu, B. Xiao, N. Codella, M. Liu, X. Dai, L. Yuan, L. Zhang, CvT: Introducing convolutions to vision transformers, 2021, arXiv:2103.15808.
[33] Y. Xie, B. Yang, Q. Guan, J. Zhang, Q. Wu, Y. Xia, Attention mechanisms in medical image segmentation: A survey, 2023, arXiv preprint arXiv:2305.17937.
[34] J.-L. He, L. Luo, Z.-D. Luo, J.-X. Lyu, M.-Y. Ng, X.-P. Shen, Z. Wen, Diagnostic performance between CT and initial real-time RT-PCR for clinically suspected 2019 coronavirus disease (COVID-19) patients outside wuhan, China, Respir. Med. 168 (2020) 105980.
[35] Y. Fang, H. Zhang, J. Xie, M. Lin, L. Ying, P. Pang, W. Ji, Sensitivity of chest CT for COVID-19: Comparison to RT-PCR, Radiology 296 (2) (2020) E115–E117, PMID: 32073353.
[36] Center for Devices and Radiological Health, False positive results with BD SARS-CoV-2 reagents for the BD max system - letter to clinical laboratory staff and health care providers. FDA, 2023, https://www.fda.gov/medical-devices/letters-health-care-providers/false-positive-results-bd-sars-cov-2-reagents-bd-max-system-letter-clinical-laboratory-staff-and. Retrieved Jun 3, 2023. [Internet]. [Accessed: 2021-Jul-29].
[37] L.M. Kucirka, S.A. Lauer, O. Laeyendecker, D. Boon, J. Lessler, Variation in false-negative rate of reverse transcriptase polymerase chain reaction–based SARS-CoV-2 tests by time since exposure, Ann. Internal Med. 173 (4) (2020) 262–267, PMID: 32422057.
[38] I.W. Pray, L. Ford, D. Cole, C. Lee, J.P. Bigouette, G.R. Abedi, D. Bushman, M.J. Delahoy, D. Currie, B. Cherney, M. Kirby, G. Fajardo, M. Caudill, K. Langolf, J. Kahrs, P. Kelly, C. Pitts, A. Lim, N. Aulik, A. Tamin, J.L. Harcourt, K. Queen, J. Zhang, B. Whitaker, H. Browne, M. Medrzycki, P. Shewmaker, J. Folster, B. Bankamp, M.D. Bowen, N.J. Thornburg, K. Goffard, B. Limbago, A. Bateman, J.E. Tate, D. Gieryn, H.L. Kirking, R. Westergaard, M. Killerby, CDC COVID-19 Surge Laboratory Group, Performance of an Antigen-Based test for asymptomatic and symptomatic SARS-CoV-2 testing at two university campuses, Wisconsin, September-October 2020, MMWR Morb. Mortal. Wkly. Rep. 69 (5152) (2021) 1642–1647.
[39] J. Dinnes, J. Deeks, A. Adriano, S. Berhane, C. Davenport, S. Dittrich, D. Emperador, Y. Takwoingi, J. Cunningham, S. Beese, J. Dretzke, L. Ferrante di Ruffano, I. Harris, M. Price, S. Taylor-Phillips, L. Hooft, M. Leeflang, R. Spijker, A. Van den Bruel, Rapid, point-of-care antigen and molecular-based tests for diagnosis of SARS-CoV-2 infection, Cochrane Database Syst. Rev. (8) (2020).
[40] V.T. Chu, N.G. Schwartz, M.A.P. Donnelly, M.R. Chuey, R. Soto, A.R. Yousaf, E.N. Schmitt-Matzen, S. Sleweon, J. Ruffin, N. Thornburg, J.L. Harcourt, A. Tamin, G. Kim, J.M. Folster, L.J. Hughes, S. Tong, G. Stringer, B.A. Albanese, S.E. Totten, M.M. Hudziec, S.R. Matzinger, E.A. Dietrich, S.W. Sheldon, S. Stous, E.C. McDonald, B. Austin, M.E. Beatty, J.E. Staples, M.E. Killerby, C.H. Hsu, J.E. Tate, H.L. Kirking, A. Matanock, COVID-19 Household Transmission Team, Comparison of home antigen testing with RT-PCR and viral culture during the course of SARS-CoV-2 infection, JAMA Internal Med. 182 (7) (2022) 701–709.
[41] F.C. Fang, S.N. Naccache, A.L. Greninger, The laboratory diagnosis of coronavirus disease 2019— frequently asked questions, Clin. Infect. Dis. 71 (11) (2020) 2996–3001.
[42] G. Developers, Imbalanced data, 2022, Retrieved from https://developers.google.com/machine-learning/data-prep/construct/sampling-splitting/imbalanced-data.
[43] A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A.N. Gomez, L. Kaiser, I. Polosukhin, Attention is all you need, 2017, CoRR abs/1706.03762.
[44] P. Ratan, What is the convolutional neural network architecture?, 2020, https://www.analyticsvidhya.com/blog/2020/10/what-is-the-convolutional-neural-network-architecture/. Accessed: 2023-01-30.
[45] O. Khare, S. Gandhi, A. Rahalkar, S. Mane, YOLOv8-based visual detection of road hazards: Potholes, sewer covers, and manholes, in: 2023 IEEE Pune Section International Conference (PuneCon), 2023, pp. 1–6.
[46] O. Dogan, S. Tiwari, M.A. Jabbar, S. Guggari, A systematic review on AI/ML approaches against COVID-19 outbreak, Complex Intell. Syst. 7 (5) (2021) 2655–2678.
[47] I.D. Apostolopoulos, T.A. Mpesiana, Covid-19: automatic detection from X-ray images utilizing transfer learning with convolutional neural networks, Phys. Eng. Sci. Med. 43 (2) (2020) 635–640.
[48] M.E.H. Chowdhury, T. Rahman, A. Khandakar, R. Mazhar, M.A. Kadir, Z.B. Mahbub, K.R. Islam, M.S. Khan, A. Iqbal, N.A. Emadi, M.B.I. Reaz, M.T. Islam, Can AI help in screening viral and COVID-19 pneumonia? IEEE Access 8 (2020) 132665–132676.
[49] M.A. Khan, M. Azhar, K. Ibrar, A. Alqahtani, S. Alsubai, A. Binbusayyis, Y.J. Kim, B. Chang, COVID-19 classification from chest X-Ray images: A framework of deep explainable artificial intelligence, Comput. Intell. Neurosci. (2022).
[50] J. Civit-Masot, F. Luna-Perejón, M. Domínguez Morales, A. Civit, Deep learning system for COVID-19 diagnosis aid using X-ray pulmonary images, Appl. Sci. 10 (13) (2020).
[51] T. Ozturk, M. Talo, E.A. Yildirim, U.B. Baloglu, O. Yildirim, U. Rajendra Acharya, Automated detection of COVID-19 cases using deep neural networks with X-ray images, Comput. Biol. Med. 121 (2020) 103792.
[52] S.H. Yoo, H. Geng, T.L. Chiu, S.K. Yu, D.C. Cho, J. Heo, M.S. Choi, I.H. Choi, C. Cung Van, N.V. Nhung, B.J. Min, H. Lee, Deep learning-based decision-tree classifier for COVID-19 diagnosis from chest X-ray imaging, Front. Med. 7 (2020).
[53] A. Chaddad, L. Hassan, C. Desrosiers, Deep CNN models for predicting COVID-19 in CT and x-ray images, J. Med. Imaging 8 (S1) (2021) 014502.
[54] P.K. Sethy, S.K. Behera, P.K. Ratha, P. Biswas, Detection of coronavirus disease (COVID-19) based on deep features and support vector machine, Int. J. Math. Eng. Manag. Sci. 5 (4) (2020) 643–651.
[55] E.E.-D. Hemdan, M.A. Shouman, M.E. Karar, COVIDX-net: A framework of deep learning classifiers to diagnose COVID-19 in X-Ray images, 2020, arXiv abs/2003.11055.
[56] A. Narin, C. Kaya, Z. Pamuk, Automatic detection of coronavirus disease (COVID- 19) using X-ray images and deep convolutional neural networks, PAA Pattern Anal. Appl. 24 (3) (2021) 1207–1220.
[57] J.E. Lee, M. Hwang, Y.-H. Kim, M.J. Chung, B.H. Sim, K.J. Chae, J.Y. Yoo, Y.J. Jeong, Imaging and clinical features of COVID-19 breakthrough infections: A multicenter study, Radiology 303 (3) (2022) 682–692, PMID: 35103535.
[58] A. El-Fiky, M.A. Shouman, S. Hamada, A. El-Sayed, M.E. Karar, Multi-label transfer learning for identifying lung diseases using chest X-Rays, in: 2021 International Conference on Electronic Engineering, ICEEM, 2021, pp. 1–6.
[59] R.-K. Sheu, L.-C. Chen, C.-L. Wu, M.S. Pardeshi, K.-C. Pai, C.-C. Huang, C.-Y. Chen, W.-C. Chen, Multi-modal data analysis for pneumonia status prediction using deep learning (MDA-PSP), Diagnostics 12 (7) (2022).
dc.rights.spa.fl_str_mv Derechos reservados - Elsevier, 2024
dc.rights.coar.fl_str_mv http://purl.org/coar/access_right/c_abf2
dc.rights.uri.eng.fl_str_mv https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.rights.accessrights.eng.fl_str_mv info:eu-repo/semantics/openAccess
dc.rights.creativecommons.spa.fl_str_mv Atribución-NoComercial-SinDerivadas 4.0 Internacional (CC BY-NC-ND 4.0)
rights_invalid_str_mv Derechos reservados - Elsevier, 2024
https://creativecommons.org/licenses/by-nc-nd/4.0/
Atribución-NoComercial-SinDerivadas 4.0 Internacional (CC BY-NC-ND 4.0)
http://purl.org/coar/access_right/c_abf2
eu_rights_str_mv openAccess
dc.format.extent.spa.fl_str_mv 13 páginas
dc.format.mimetype.none.fl_str_mv application/pdf
dc.publisher.eng.fl_str_mv ScienceDirect
dc.publisher.place.spa.fl_str_mv Países Bajos
institution Universidad Autónoma de Occidente
bitstream.url.fl_str_mv https://red.uao.edu.co/bitstreams/18a09c71-17d4-4647-853d-9663d1c33592/download
https://red.uao.edu.co/bitstreams/e030871d-d081-4e10-8e2f-018f0b91a8ca/download
https://red.uao.edu.co/bitstreams/b2d85ed8-fe5a-4c3d-877c-c19d4ef97100/download
https://red.uao.edu.co/bitstreams/c4f1f161-b905-4bbb-a623-0213672b72d8/download
bitstream.checksum.fl_str_mv ceabaf84bbf327048d239d1e1ac37984
6987b791264a2b5525252450f99b10d1
4a8911323ba038f648394230e1e56690
044c2e348389f542e7e9763b1a27c538
bitstream.checksumAlgorithm.fl_str_mv MD5
MD5
MD5
MD5
repository.name.fl_str_mv Repositorio Digital Universidad Autonoma de Occidente
repository.mail.fl_str_mv repositorio@uao.edu.co
_version_ 1851053186575499264