Advanced artificial intelligence (AI) education will enable healthcare workers to lead the deployment of AI technologies.

The broad aim of advanced AI education will be to develop in-depth understanding of, and skills related to, a range of subject areas. This will enable healthcare workers to lead different aspects of the deployment of AI technologies in health settings and to advise others.

Interviewees for this research identified several key principles that can guide the education and training for each archetype, as well as non-educational factors that can enable such education and training. These are listed in Box 3.

Appendix A provides further related analysis and lists the advanced educational requirements across the workforce archetypes and the factors that influence confidence in AI.

These advanced requirements will be additional to the foundational requirements (outlined in section 3.2) and the requirements for product-specific education (section 3.4).

Box 3: Guiding principles and enabling factors for advanced AI education

Shaper
Guiding principles

Educating Shapers is a priority, as their decisions will have downstream effects on all other archetypes through governance, guidance, and system transformation.

The safe, effective and ethical use of AI in healthcare should be at the heart of Shaper education.

Shapers across different organisations should be encouraged to work collaboratively to share knowledge, align messaging and create complementary frameworks in relation to AI technologies.

Awareness and appreciation of developments outside the expertise of the Shaper and their organisation is key to joined-up governance and regulation.

Engagement of Shapers with the Creator and Embedder archetypes is vital to ensure practical frameworks that enable rather than constrain digital transformation with AI.
Driver
Guiding principles

Drivers need to be equipped to ask the relevant questions of an AI technology prior to procurement or commissioning (see Table A3). They should be able to critically appraise AI to make evidence-based strategic commissioning decisions.

Drivers should promote a workplace culture that embraces innovation, entrepreneurship, continuous learning and multidisciplinary working.

Drivers should champion a culture of transparency and diversity to promote fairness and inclusivity in the development and use of AI.

Drivers should understand the value of AI specialists and champion AI multi-disciplinary teams (MDTs).

Educational resources for Drivers should be flexible, efficient, and accessible.
Creator
Guiding principles

Creators should understand both the technical and clinical aspects of the problem addressed, and the AI approach employed.

Creators should understand and appreciate user design and workflow integration.

Knowledge of the potential clinical consequences of using AI, and of the legal positions of creators, providers and users of AI technologies, is essential for Creators.

Fundamental statistical and data science literacy is crucial for Creators, enabling them to detect and mitigate risks from bias in algorithms.

The development of diverse and inclusive AI multi-disciplinary teams (MDTs) can encourage co-creation of AI technologies and enable Creators to share their knowledge and expertise with others.

Expansion of training for specialist data scientists and informaticians could equip more NHS professionals for Creator roles.

Accreditation and recognition for AI co-creators and informatics specialists can professionalise this archetype and enable upskilling.
Embedder
Guiding principles

Embedders bring a range of specialised skill sets and include IT and information governance (IG) specialists, data scientists, software engineers, safety teams and specialist clinicians.

Embedders should understand a broad range of topics in detail, from governance requirements and evidence evaluation to technical knowledge of AI algorithms, algorithmic biases and the importance of AI workflow integration for clinical confidence.

Workforce transformation will be needed to equip the healthcare system with sufficient Embedders of AI technology. This will require:

- professionalisation of specialist Embedder roles

- expansion of training for Digital, Data and Technology (DDaT) data professionals and clinical informaticians

- upskilling of existing clinical and scientific trainees in AI-related education, with flexible training schemes and career opportunities, funded time and incentives for digital health training
User
Guiding principles

Advanced education for Users should focus on human-AI interaction and the impact of AI technologies on clinical reasoning and decision making (CRDM).

Users should learn how to communicate with patients about AI technologies, acting as ‘AI counsellors’ who help patients interpret AI results and guide them on issues such as data security.

Education for Users should be tailored by professional group, guided by the clinical scenarios for AI in that area and the settings in which it is used (for example, emergency versus planned care).

User education should reach clinicians in training as well as those who are fully qualified.

Foundational and advanced AI education for Users should be incorporated within existing undergraduate and postgraduate curricula.

Equitable access to training and support for existing clinicians will be required, at both foundational and advanced levels, including special efforts to engage and support the digitally unengaged or unconvinced.

Support for existing trainees’ education will be needed, including study leave, funding and protected time for digital and AI skills training.
