Artificial intelligence in healthcare: Tailoring education to meet EU AI-Act standards

Bignami E.; Darhour L. J.; Buhre W.; Cecconi M.; Bellini V.
2025-01-01

Abstract

The integration of Artificial Intelligence (AI) in Intensive Care Units (ICUs) has the potential to transform critical care by enhancing diagnosis, management, and clinical decision-making. Generative and Predictive AI technologies offer new opportunities for personalized care and risk stratification, but their implementation must prioritize ethical standards, patient safety, and the sustainability of care delivery. With the EU AI-Act entering into force in February 2025, a structured and responsible adoption of AI is now imperative. This article outlines a strategic framework for ICU AI integration, emphasizing the importance of a formal declaration of intent by each unit, detailing current AI use, implementation plans, and governance strategies. Central to this approach is the development of tailored AI education programs adapted to four distinct professional profiles, ranging from experienced clinicians with limited AI knowledge to new intensivists with strong AI backgrounds but limited clinical experience. Training must foster critical thinking, contextual interpretation, and a balanced relationship between AI tools and human judgment. A multidisciplinary support team should oversee ethical AI use and continuous performance monitoring. Ultimately, aligning regulatory compliance with targeted education and practical implementation could enable a safe, effective, and ethically grounded use of AI in intensive care. This balanced approach would support a culture of transparency and accountability, while preserving the central role of human clinical reasoning and improving the overall quality of ICU care.
2025
Artificial intelligence in healthcare: Tailoring education to meet EU AI-Act standards / Bignami, E.; Darhour, L. J.; Buhre, W.; Cecconi, M.; Bellini, V.. - In: HEALTH POLICY AND TECHNOLOGY. - ISSN 2211-8837. - 14:6(2025). [10.1016/j.hlpt.2025.101078]
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11381/3033195
Citations
  • PMC: not available
  • Scopus: 2
  • Web of Science: not available