Integrating Artificial Intelligence in Neonatal Care: Clinical Uses and Socioeconomic Factors
DOI: https://doi.org/10.63682/jns.v14i31S.8739

Keywords: Artificial Intelligence (AI), Machine Learning, Neonatal Intensive Care Unit (NICU), Pediatric Intensive Care Unit (PICU), Socioeconomic Determinants

Abstract
Background: Artificial intelligence (AI) is gradually transforming neonatal and pediatric intensive care units (NICUs and PICUs) by enhancing diagnostic accuracy, risk evaluation, and clinical decision support. However, integrating AI into these vital care settings faces challenges related to data limitations, clinician acceptance, and socioeconomic disparities.
Objective: This review examines the clinical potential of AI, especially machine learning (ML) and deep learning (DL), in NICUs and PICUs, while evaluating the socioeconomic factors that influence AI deployment, effectiveness, and equity.
Methods: A comprehensive literature review was conducted, focusing on applications of AI in early diagnosis, patient surveillance, imaging assessment, and transport logistics in neonatal and pediatric ICUs. Socioeconomic factors affecting AI deployment, including provider demographics, healthcare system characteristics, and geographic inequalities, were also examined.
Findings: AI models improve early identification of urgent conditions such as sepsis and respiratory distress, streamline clinical processes, and improve resource management. Nevertheless, disparities in access to AI and in its performance persist, especially in low-resource environments, owing to inadequate infrastructure, biased data, and differing levels of clinician preparedness. Approaches such as federated learning and explainable AI could address some of these challenges.
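To make the type of model discussed above concrete, the sketch below shows a minimal ML risk classifier of the kind used for early neonatal sepsis detection. It is purely illustrative: the feature names, synthetic data, and coefficients are assumptions for demonstration and do not correspond to any model evaluated in the reviewed studies.

```python
# Illustrative sketch only: a simple vital-signs-based risk classifier,
# trained on synthetic data, to show the general shape of the ML models
# the review discusses for early sepsis identification.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 1000

# Hypothetical NICU monitoring features (all synthetic): heart rate
# variability, respiratory rate, temperature instability, perfusion index.
X = rng.normal(size=(n, 4))

# Synthetic outcome: risk rises with lower HRV and higher respiratory rate
# (an assumption made only to generate labeled example data).
logits = -1.0 - 1.5 * X[:, 0] + 1.2 * X[:, 1]
y = rng.binomial(1, 1 / (1 + np.exp(-logits)))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Discrimination on held-out data; real-world evaluation would also need
# calibration checks and external validation across sites and populations.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Held-out AUROC: {auc:.2f}")
```

In practice, such a model would be trained on multi-site clinical data, and approaches like federated learning (training across hospitals without centralizing patient records) are among the strategies the review identifies for mitigating data and equity constraints.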
License

This work is licensed under a Creative Commons Attribution 4.0 International License.
You are free to:
- Share — copy and redistribute the material in any medium or format
- Adapt — remix, transform, and build upon the material for any purpose, even commercially.
Terms:
- Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
- No additional restrictions — You may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.