Chapter 3: Suggested Educational Approach
3.2 Foundational AI education for all healthcare workers
This section outlines key educational and training requirements to develop artificial intelligence (AI)-related knowledge across the whole healthcare workforce.
This foundational AI education will aim to develop awareness and familiarity, as opposed to understanding, skills and capabilities, in healthcare workers. Education and training at this foundational level will need to be delivered to large numbers of existing staff and incorporated into educational programmes for future professionals.
Workers who may take up any of the archetype roles will require this foundational education prior to any advanced educational offerings.
Table 1 lists the educational requirements for foundational AI education, organised according to the factors that influence confidence in AI (see section 1.2 and the first report1).
It includes requirements for AI-specific literacy as an initial area of focus. As noted in the suggested pathways outlined in section 3.1, broader efforts to advance digital literacy amongst healthcare workers will be essential groundwork for AI education and training. AI-specific literacy can be supported by these broader efforts, including data literacy initiatives in development, like ‘Dataversity’ within the AnalystX platform.43
The requirements in Table 1 can support the design of education and training programmes for various settings. These approaches could include incorporating the suggested requirements into undergraduate and postgraduate curricula, providing educational offerings tailored to specific roles and specialities, and offering online open access courses for continuing professional development that can be tailored to individual need. The requirements can also inform considerations for specialist sections in existing provision, such as the NHS Digital Academy offering and related content in the NHS Learning Hub.
These educational efforts will need to extend beyond simply providing access to existing external education resources. In the context of educational material for data analysts, the Goldacre Review4 highlighted the current availability of ‘an almost limitless array of self-directed online teaching through services such as Coursera, or some MOOCs (Massive Open Online Courses), but no clear signposting or curation of “journeys” through these courses, or guidance on which to choose.’ This is equally applicable to AI technologies, where a plethora of technical educational material is available. However, there is a dearth of material tackling some important healthcare-specific challenges, such as good practice in AI-assisted clinical decision making. A clear, structured educational strategy will be required to guide learners to the appropriate information to meet their AI learning needs.
Table 1: Requirements for foundational AI education
AI literacy - knowledge taxonomy
Awareness of examples of AI applications and algorithms used in health and care.
Awareness of types of healthcare problems best suited to AI.
Awareness of how data-driven algorithms learn.
Familiarity with the definitions of algorithms, AI, machine learning and deep learning and how they relate to each other.
Governance - knowledge taxonomy
Regulation and standards
Awareness of the importance of compliance with medical product regulation for AI (including UK Conformity Assessed (UKCA)/CE marking overseen by the Medicines and Healthcare products Regulatory Agency (MHRA)) and the General Data Protection Regulation (GDPR), including applicable standards (for example, NHS Digital and ISO standards) for product development and risk management.
Familiarity with guidance from regulators of healthcare workers (including the General Medical Council (GMC), Nursing and Midwifery Council (NMC) and Health and Care Professions Council (HCPC)) on the development and use of technology, including AI.
Familiarity with the regulation of healthcare settings and how this applies to the use of technology (including relevant guidance from care inspectorates like the Care Quality Commission (CQC)).
Evaluation and validation
Awareness of how AI algorithms are tested and validated.
Awareness of the difference between internal validation, external validation, and prospective clinical studies.
Awareness of the importance of reproducibility and generalisability of AI models, and the risk of data and model bias in AI, which may disadvantage specific groups or reinforce existing health inequalities.
Guidelines
Familiarity with where to find clinical guidelines that apply to using AI technologies.
Familiarity with good practice for use of AI when no product-specific clinical guidelines exist (for example, applying the principles of Good Medical Practice15 or Good Scientific Practice16).
Liability
Awareness of issues relating to personal and organisational liability for AI technologies.
Implementation - knowledge taxonomy
Strategy and culture
Awareness of the potential value of AI technologies for healthcare systems and for patients.
Awareness of the main risks of deploying AI technologies into clinical settings.
Awareness of examples of successful implementation of AI technologies in healthcare settings.
Awareness of the importance of a multi-disciplinary approach to AI implementation that involves clinical, technical and managerial roles.
Awareness that AI technologies may lead to inequitable distributions of patient outcomes or disadvantage certain patients.
Technical implementation
Awareness of the need for interoperability and seamless integration of AI systems.
Awareness of the importance of ongoing monitoring to ensure continued safe, ethical and effective use.
Familiarity with information governance principles and how these apply to patient data.
Clinical use - knowledge taxonomy
AI model and product design
Awareness of the limitations of ‘black-box’ AI, and of attempts to address these, including transparency initiatives and explainable AI.
Awareness of the potential risk of deskilling of the clinical workforce as a result of deploying and using AI technologies, and the importance of considering ways of mitigating this during product implementation.
Familiarity with the difference between autonomous AI and human-in-the-loop systems.
Familiarity with the role and responsibilities of clinicians when using AI for clinical reasoning and decision making (CRDM).
Cognitive biases
Awareness of the risk of users being under- or over-confident in information derived from AI.
Awareness that cognitive biases (including automation bias and rejection bias) can affect decision making with AI.
Interface with patients
Awareness of the importance of fairness, transparency, and accountability when deploying AI technologies.
Awareness of the challenges of empowering patient choice when involving AI in CRDM and care pathways.
Clinical use - skill taxonomy
Capable of weighing a prediction from the AI against other forms of clinical and demographic information during clinical decision making.
Capable of counselling patients about the use of their data by the AI technology, and how this data is accessed, processed and stored.
Capable of explaining the AI tool and its impact on CRDM to the patient, including providing access to patient communication materials.
Capable of communicating the risks and limitations of the AI technology to guide the patient through decisions about their health.
4 Goldacre B, Morley J. Better, Broader, Safer: Using health data for research and analysis. A review commissioned by the Secretary of State for Health and Social Care. Department of Health and Social Care. 2022. https://www.goldacrereview.org/ Accessed May 24, 2022.