Digital, Artificial Intelligence and Robotics (DART-Ed) webinar series
We are running a series of webinars looking at recent exciting developments in artificial intelligence (AI), robotics and digital technologies.
On this page
Webinar 1 - Digital, Artificial Intelligence and Robotics across the system - an introduction
The first webinar took place on 18 November 2021, providing an introduction to the work of Health Education England's DART-Ed programme. We looked at the wider landscape of activity across the system, particularly the work of our partners, including NHSX (now the NHS Transformation Directorate), and examined the main issues, barriers and benefits in implementing these technologies, with an opportunity for participants to ask our expert panel questions and find out more.
On the panel were:
- Patrick Mitchell (Chair) - Director of Innovation, Digital and Transformation, Health Education England
- Dr Hatim Abdulhussein - Clinical Lead of DART-Ed Programme, Health Education England
- Richard Price - Project Manager, Technology Enhanced Learning, Health Education England
- Brhmie Balaram - Head of AI Research & Ethics, NHS AI Lab, NHSX
Watch a recording of this session below.
Watch webinar 1
0:08
Good to see so many people joining us this afternoon, my name is Patrick Mitchell.
0:13
I'm the Director of Innovation, Digital and transformation at Health Education England and the Senior
0:19
Responsible Officer for the DART-Ed programme, as well as the Digital Readiness Programme in HEE.
0:25
Delighted, we've got so many people joining us. Can I ask people to turn their cameras off and to put themselves on mute, please, during the session?
0:36
And obviously as we come to Q&A, we can bring people back in,
0:42
but it would be hugely helpful if you could keep cameras off and mics off. Thank you.
0:49
Absolutely delighted that HEE's able to work across
0:54
the digital skills agenda, particularly looking at the implications of AI, digital healthcare and robotics on the NHS workforce,
1:06
following the Topol review, which was published in February 2019.
1:11
This is one of a number of webinars we're going to be running, and I'm now going to ask the team to introduce themselves and then get into the presentation, so we'll kick off with Hatim.
1:22
Thanks, Patrick. My name is Hatim Abdulhussein, and I'm a GP Registrar in North West London, and I've been at Health Education England, working as part of our new Digital, AI and Robotics in Education and Training programme.
1:32
Thanks Hatim, Brhmie? Hi, everyone. It's good to be here. I'm Brhmie Balaram, I'm the Head of AI Research and Ethics in the NHS AI Lab and I lead our AI ethics initiative.
1:44
Fantastic. And Richard. Good afternoon. I'm Richard Price. I'm the Learning Technologies Adviser here at Health Education
1:50
England, part of the Technology Enhanced Learning team, and we look at all of the different touch points where potentially technology might be able to support learners with their education and training.
2:01
Thank you, Richard. So we'll kick straight off. Hatim is going to give an introduction, Brhmie's
2:07
then going to do an introduction to the AI Lab and the collaboration with our programme.
2:12
And Richard is going to be talking about our technology enhanced learning programme and how it links to this piece of work. So over to you, Hatim.
2:21
Thank you. Right Patrick, I'm just going to bring my screen up. I've got some slides to share with you today
Overview
2:26
covering the work we've done over the year. So this really all started with the Topol review, published in 2019, which made quite clear
2:33
recommendations around education and training of our current and future workforce around increasing the numbers of clinicians with knowledge and skills around AI and Robotic technologies
2:42
and other digital health care technologies, as well as some specialist skills in our healthcare scientists, our technologies and analytics specialists as well.
2:49
And it's quite clear from our point of view that Health Education England needs to continue looking at our current workforce and how we can work in a digitally enabled system,
2:56
and at our future workforce, to ensure that they will be able to work in a system that's going to continuously adapt and transform digitally.
3:04
Some of the definitions are really important. The Topol review defined AI as a broad field of science encompassing not just computer science but psychology, philosophy, linguistics and other areas,
3:13
and is essentially concerned with getting computers to do tasks that would normally require human intelligence. The Topol review also defined
3:19
digital medicine as digital products and services that are intended for use in the diagnosis, prevention, monitoring and treatment of a disease, condition or syndrome, and this can encompass a whole range of technologies.
3:29
Many of these technologies we are now using daily in practice, like telemedicine. There are many patients or consumers that are using some of these technologies themselves, in terms of smartphone
3:37
applications and wearable devices, for example, and coming to have conversations with us about those. And there is emerging software that we use, such as e-prescribing, electronic healthcare records,
3:46
point of care testing, and emerging technologies like extended reality, such as virtual reality and
3:51
augmented reality, which could have quite interesting healthcare applications. So our programme has three key projects.
3:57
The first project is looking at the impact of these AI and digital healthcare technologies on education and training, on which we've been working with a wide range of partners.
4:04
The second project is with the Royal College of Surgeons, looking at robotics literacy as part of their RADAR initiative.
4:10
And we've also been working quite closely with the British Association of Dermatologists, looking at developing digital leaders in dermatology.
4:16
And we'd love to continue that sort of relationship, not just with the British Association of Dermatologists but with other faculties and other colleges, thinking about that journey.
4:23
All of this work is overseen by our digital education steering group. So we're looking at this in three ways really. One is horizon scanning: knowing what's coming.
4:31
And so we've been looking at AI technologies in particular and creating an AI roadmap to see how many technologies are out there currently in the system,
4:38
how far away they are from scalability or deployment, what kind of workforce groups are involved and what kind of workforce impact they can create,
4:46
and also taking some deep dives into a couple of these technologies to understand them a little bit further. We also need to be really proactive and really assess what the learning needs are.
4:54
So we're continuing some of the work I've done previously at HEE to develop a digital literacy framework, to understand what the learning needs
4:59
are going to be and the capabilities that our workforce is going to need for AI and digital healthcare technologies, with a view that actually this could provide a basis for a curriculum.
5:07
You know, you can look at this, you can pick up the capabilities and map it to your professional workforce group and build it into an undergraduate and postgraduate curriculum.
5:15
And it also can help us to direct where we're going to deliver some of our education and training and opportunities in this space.
5:21
So if you go to the Learning Hub now, you'll be able to see one of our catalogues, which is Digital, AI and Robotic Technologies in Education,
5:27
and there will be some learning materials that are already signposted within that catalogue. But what we hope to do is continue
5:32
to build that catalogue so it becomes the go-to space if you want to look for extra learning in this area.
5:38
And the reason why you're all here today is because you are interested. I want to quickly run through what the plan is for the future,
5:45
so this is the start of the DART-Ed Webinar Series. All webinars will be free to attend, all webinars will be recorded and added to the DART-Ed catalogue on the NHS Learning Hub.
5:53
And if you keep an eye on the DART-Ed website and the HEE digital readiness social media platforms,
5:58
that will give you an idea of when the webinars will be coming, in terms of exact dates and what they'll be covering.
6:03
This is our first webinar, talking about Digital, AI and Robotics across the system. In future webinars, we may explore themes such as ethics and confidence in AI, and early next year,
6:11
in the spring of next year, we're going to be looking at nursing and midwifery and digital and AI in that space.
6:16
We're going to have a spotlight on AI in healthcare in late spring of next year, followed by a webinar over
6:22
the summer around dentistry, culminating in a webinar on robotic-assisted surgery in the autumn of next year.
6:29
So that's the current plan going forward. We hope to continue this webinar series, and we're really pleased to see the engagement that we've had today.
6:35
If anyone wants to get in touch, feel free to reach out, and I'm now going to pass over to my colleague Richard, sorry to my colleague Brhmie.
6:44
Thanks, Hatim. OK, so the NHS AI Lab was established in 2019
NHS AI Lab
6:50
with the mission to enable the development and adoption of safe, ethical and effective AI-driven technologies in the UK health and care system.
6:57
So one of the main ways that we're doing that is through supporting innovators to carefully scale their technologies for use across the NHS.
7:05
And we primarily do this through the AI in Health and Care Awards. We have a number of technology companies that are CE marked, meaning that they have
7:12
regulatory approval and that we are now supporting with evaluation of their products.
7:18
So alongside the AI Award programme, we also have Skunkworks, which develops proofs of concept that are often proposed by healthcare practitioners who are identifying a particular need.
7:29
We have the AI imaging programme, which is currently standing up a national medical imaging platform to support
7:35
training and testing of imaging-based AI products. And we also have the regulatory programme, which is working with organisations like the MHRA
7:44
and NICE to streamline the regulatory pathway. So, for example, they're collaborating on the creation of a multi-agency advice service,
7:52
which would bring together regulatory information and make it accessible through a single portal.
7:58
I lead our AI ethics programme, which was launched earlier this year in February. We support research and practical interventions that can complement
8:06
and strengthen existing efforts to validate, evaluate and regulate AI technologies in health and care.
8:12
And we have a particular focus on countering the inequalities that could emerge from the way that AI
8:19
is developed or deployed. And one of our main objectives is to support a key goal of the lab, which is to build trust and confidence in AI systems.
8:28
So we're therefore investing in a number of practical interventions, such as working with the Ada Lovelace Institute to develop a model for algorithmic
8:36
impact assessments that can be used as a condition of accessing data
8:42
from the National Medical Imaging Platform. The project that I wanted to highlight, however, is a partnership with Health Education England.
8:50
It's being led by Dr Michael Nix and Dr Annabel Painter with support from one of our colleagues in the lab, George,
8:56
and the project is specifically exploring how to build appropriate confidence in AI amongst the health and care workforce.
9:03
So we want staff to feel like they have enough confidence to use these technologies, but we don't want them to be overconfident in these technologies to the point where they are overriding
9:13
their own judgement or placing more trust than is warranted in a certain product.
9:18
So this is obviously a very difficult balance to strike, but that's what we're trying to aim for.
9:23
And I think part of how we will get there is by being clear about the expectations we have of these health care staff
9:30
when using these technologies. So, for example, in relation to their role in post-market surveillance,
9:35
there's a certain level of knowledge and understanding that health care staff need and that can be gained through a training course,
9:42
but they will also need to be supported at national and local levels. So this project also looks at the roles of senior leadership in driving
9:50
and shaping the use of these technologies in the workforce. Thank you, and we're now going to be handing over to Richard Price.
Technology Enhanced Learning
10:00
Richard, thank you. Thank you Patrick, thank you Brhmie. Good to be here. My name's Richard Price, as I said and I work in a team called Technology Enhanced Learning.
10:09
So that's a team within Health Education England. It's a specialist team that's responsible for looking at all of the different points where potentially we might
10:17
come across education and training, where technology might be able to support that education and training.
10:23
And that's all the way through from pre-16, when someone has that first thought about a career in healthcare, through their undergraduate and postgraduate education,
10:30
all the way into the workforce and potentially beyond as well. So, like I say, a huge area to cover. We look at
10:39
training for 1.2 to 1.3 million people and how technology can potentially support all of those things, and we do that in a number of ways.
10:46
So the main delivery mechanism that everyone's familiar with is our delivery platforms. Hatim mentioned briefly the Learning Hub, which is our new platform, and it provides free access
10:55
to a wide range of informal resources that are shared and contributed by the community.
11:01
So they're potentially things like e-learning courses, but also things like video, audio, images, podcasts,
11:07
web links, all of those different things that you might come across in an education and training context. We then have a second platform called e-Learning for Healthcare, which is very much our formal offer.
11:17
There are 450 or so programmes on there, including our flagship programme, which covers coronavirus and the vaccination programme.
11:25
And we've had millions of users that have accessed that and gone through that training. So we're really proud of that and that was produced in response to the pandemic.
11:33
So that again is freely available and we're starting to see some content on AI starting to make its way into all of these platforms.
11:41
The third platform that we offer is called Digital Learning Solutions, and that is very much about the IT
11:46
side of training. So it hosts the national digital capability and digital literacy content, as well as details
11:53
of clinical systems and the more familiar office tools and things.
11:58
So quite a broad range of different systems that we offer as part of that. And alongside that, we have programmes that support simulation and immersive technologies.
12:08
So you might have seen some of our work around HoloLens, for example, which is a head-worn display that provides virtual and augmented reality content.
12:18
We also have virtual reality offers and things like that, as well as part of that. Again, all available via these three platforms.
12:25
The third area I just wanted to focus on was our regional networks. So these are a new set of partnership managers, essentially, that we've created at regional level
12:35
that give us access to individuals working in hospitals in local care offices and things like that.
12:42
And that is a way of supporting those individuals and those organisations with access to IT, for example.
12:47
So making sure they've got access to sufficient Wi-Fi and things like that to be able to deliver some of these AI enabled features that we're talking about today.
12:56
So it's a really exciting team. Lots happening, and that's a very, very brief overview.
13:04
Brilliant. Thank you very much, all three of our speakers. I hope that's given people a taste of what we're up to.
Capability Framework
13:12
I've got three questions that came in from people earlier,
13:17
which I'm going to pose to you all and then we'll have time for Q&A
13:22
from the audience, too. But the first question: what new skills do healthcare professionals need to develop, and how do you plan to help
13:29
patients and staff adapt to technological changes? So Hatim, do you want to take that first?
13:37
Thanks, Patrick, this is where I believe our capability framework will come in. So this is why I mentioned a learning needs analysis beforehand.
13:44
We need to spend some time understanding what those skills are going to be. So how have we done this? We worked with the University of Manchester.
13:50
We did a systematic review to look at the existing literature out there. We held a series of focus groups and workshops with key
13:58
subject matter experts: those that are working directly with technologies on the front line, some leaders in education, as well as industry experts who are developing and innovating in these technologies,
14:07
to identify, based on these emerging technologies, what healthcare professionals are going to need to understand or need to know,
14:15
and this framework will provide guidance across various factors. Some of the skills that it talks about are the tools that were mentioned in the Topol review around data
14:23
curation, integration, provenance, security and safety around that data.
14:28
It also goes into some basic skills around digital transformation. There are skills around some of the human factors.
14:35
So when interacting with these technologies, what kind of impact does it have on the relationship with your patient?
14:40
And what kind of impact does it have on the relationship with your team that you work within? And there are also factors around ethics, and I'm sure Brhmie will provide a lot of insight into this.
14:49
And these core themes will essentially form a curriculum in AI and Digital Healthcare that can then be lifted and used by educators
14:59
to be put into curricula and training, but also can be used by an individual learner who wants to learn in this space to say, actually, this is...
15:11
You've gone on mute, Hatim. Oh sorry! Where did I cut off? So it can be used by educators to lift and put into education and training,
15:19
but it can also be used by an individual learner to say, 'This is what I now need to know. How can I then use this to go out and find the resources and the knowledge that I need based on these factors?'
15:30
I think it empowers the individual, because we know from the evidence that we are all seeking knowledge around these things in our own time and space.
15:38
Thank you, Brhmie? I think, on this question, healthcare practitioners will need a foundation in digital literacy
15:47
to build from, but they also need to acquire AI-specific skills and knowledge. So, for example, they will need to learn to understand AI outputs and their limitations,
15:57
and they will need to be reasonably critical of the information presented by AI to avoid automation bias. And
16:05
I also think that we should emphasise the need for soft skills to support changes in the relationship between patients and clinicians.
16:12
So as AI is increasingly used for diagnosis, for example, healthcare practitioners
16:18
will need to take on more of a health counselling role as opposed to just relaying information
16:23
as AI takes on more physical, repetitive and basic cognitive tasks, it will be more important than ever
16:30
that clinicians are able to demonstrate empathy and emotional intelligence in their communication with patients.
16:36
And as for preparing the public, I think that there will need to be greater engagement
16:41
and education in order to ensure a holistic approach to deploying AI technology safely and effectively.
16:48
So, for example, there are some instances of healthcare practitioners and industry exploring how to engage patients on issues of consent.
16:56
And I think it would be useful if there could also be more engagement to demystify AI,
17:02
also to address fears and clarify the limitations of these technologies as well.
17:10
Brilliant, thank you. And, Richard. Thanks, Patrick. So I think what we're seeing in the education space is that people are starting to move into this hybrid working model.
17:21
So we're seeing a lot of people with portfolio careers sharing their skills and competencies across different organisational boundaries and things like that.
17:28
So I think what we're seeing is learners starting to manage their own learning, their own portfolios and their own
17:34
competencies as a result of that. So for me, and I know both Brhmie and Hatim have talked about this, I think the digital capabilities are going to be key.
17:43
So that is things like, to introduce another framework I suppose, our own digital literacy framework of digital capabilities that are part of that.
17:51
We're going to need people that are flexible and adaptable to that different culture that's going to happen
17:56
as a result of these different ways of working and the introduction of these AI and robotic surgery technologies.
18:03
And I think to some extent, we're preparing a workforce for a future that we don't even know what it looks like yet.
18:08
The Topol review talked about some of those different changes and the challenges that are going to be presented, but actually, we're not
18:14
entirely clear on what that's going to look like ourselves yet. So a lot of this is pre-emptive work and, like I say, getting people ready for those changes.
18:21
And I think digital literacy really underpins that and is key to getting people in the right space, the right frame of mind for that.
18:28
Thanks, Richard. Second question: how are, or how should, HEIs equip pre-registration students
18:35
in the use of AI and Robotics in health care? So Brhmie, do you want to kick that one off?
Teaching AI
18:41
Yeah. So I think that higher education institutions can start to embed a basic level of AI-specific
18:48
knowledge within clinical education. So, for example, they could teach students to appraise evidence from AI-derived information,
18:56
which would be similar to the usual criteria for assessing academic papers. They could teach students about how AI technologies are developed and trained for use in health care.
19:06
They could relay what's required to develop and maintain optimal AI solutions, introducing concepts of data quality.
19:15
I think we could also teach students about how AI is validated,
19:21
and I think it's really important that they reflect on critical points of integration with the health system and how poor implementation could actually weaken the performance of AI.
19:29
So I think it'd be really important for students to understand that algorithms can perform differently
19:34
in different settings because of the way that they're introduced or operationalised in a clinical pathway.
19:40
I think that there probably isn't a need to teach code or advanced statistics, as basic statistical concepts
19:46
are already being taught in medical school like diagnostic accuracy, sensitivity and specificity.
19:52
So I think that that should be sufficient. But again, I think that there needs to be an emphasis on soft skills.
19:59
So kind of what I mentioned earlier in terms of thinking about how the relationships between patients and clinicians are going to change
20:07
as a result of the introduction of some of these technologies. Brilliant, thank you, I'm glad.
20:13
I'm glad you said we don't need to teach code for this, not for this group of professions anyway.
20:18
We certainly need people who can do code, but that's in a different group of the digital workforce.
20:23
Richard. Thanks, Patrick. So as I said, I think we're preparing the workforce for a future that we don't even know
Personalized and adaptive learning
20:31
what that looks like yet. But I think what we are seeing is a trend towards, as Brhmie touched upon, sort of e-patients, for want of a better word: patients starting to manage their own care at home,
20:40
using wearable devices and things like that, and AI is really key to being able to analyse that and notify a clinician when they need to make an intervention and things like that.
20:49
So I think we're going to need a workforce where clinicians are relying much more on that technology.
20:54
Potentially using more telehealth and telemedicine. So these are going to be skills that we need to teach our workforce, to be able to deal with that different way of liaising with patients.
21:04
But still with care being at the heart of that, still that bedside manner, as we used to call it, being
21:10
at the heart of that. So from an education point of view, I think where we're going to see AI coming in is much more
21:18
personalised and adaptive learning. So if you think about somebody actually doing a ward round, for example, they might have a tablet in front of them
21:24
that has a list of all of the patients that they're about to see. If we can map that information on the screen to their own training record,
21:31
we can say, okay, this clinician is about to see a patient with sepsis, for example. We know from their training record that they've not seen a patient with sepsis or they've not done
21:42
any training on sepsis for a number of years. So actually, we can intervene at that point, give them some personalised just in time learning,
21:49
which might be a video or something to give them that confidence to be able to treat that patient and give them the information they need.
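The just-in-time logic Richard describes could be sketched as follows, purely as an illustration: the data structures, field names and two-year refresh window here are assumptions for the example, not details of any HEE system.

```python
from datetime import date, timedelta

# Assumed refresh window: flag training older than two years.
REFRESH_INTERVAL = timedelta(days=365 * 2)

def suggest_learning(ward_list, training_record, today=None):
    """Return conditions on the ward list that warrant just-in-time learning.

    ward_list: list of dicts with a "condition" key (hypothetical patient list).
    training_record: condition -> date the clinician last trained on it.
    """
    today = today or date.today()
    suggestions = []
    for patient in ward_list:
        condition = patient["condition"]
        last_trained = training_record.get(condition)
        # Flag conditions never trained on, or not refreshed recently.
        if last_trained is None or today - last_trained > REFRESH_INTERVAL:
            suggestions.append(condition)
    return suggestions

# Example: a clinician about to do a ward round.
ward = [{"name": "Patient A", "condition": "sepsis"},
        {"name": "Patient B", "condition": "asthma"}]
record = {"asthma": date(2021, 6, 1)}
print(suggest_learning(ward, record, today=date(2021, 11, 18)))  # ['sepsis']
```

A real system would of course draw on live clinical and training data; the point is simply that matching a patient list against a training record is enough to trigger the kind of targeted prompt described above.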
21:56
So that's that kind of predictive learning, anticipating what they need. We can also start to see AI being used to
22:03
adapt the way that we deliver training. So if somebody's struggling with a particular subject, we can perhaps focus a little bit more on that, and we can start
22:09
to look at outliers and support people before they start to struggle.
22:14
So actually, just in time interventions that are actually supporting those learners
22:20
before they even get to a point where they're potentially going to fail or struggle with their course. So there's some real opportunities, I think, in how we can use AI
22:27
in the education space. Thank you. I think that one in particular around being able to track people and help them where
Automation and roles
22:35
they're particularly struggling with particular subject areas, is going to be hugely helpful. When you think of the cost of attrition from healthcare programmes, there could be a real opportunity
22:45
there to gain and to make sure that we don't lose those people from the training pipeline.
22:52
Final question and this is one, I think for Hatim. What medical specialities and roles can we anticipate being heavily impacted
23:01
by automation, resulting in greater or fewer posts and vacancies? So Hatim, do you want to take that one?
23:08
Thanks Patrick. I mean, the first thing to say is this is a really interesting topic and one that we can really share a little bit of insight from the work
23:15
that we're doing around the AI roadmap and the dashboard that we're actually developing and are soon to publish. So do keep your eyes out for that.
23:22
And that is really important because it gives us an idea of currently what kind of technologies are...
23:27
what kind of AI and data-driven technologies are in the system and how many of them there are. And I mentioned before the clinical workforce groups that are going to be affected by these technologies.
23:36
And now the first thing I want to caveat with is actually regardless of the technology, or the workforce groups that we know are going to be affected by AI,
23:44
we have to recognise that actually to work effectively with AI, you still need a human in the loop. You need someone in that process.
23:51
So it's not about changing the number of people we necessarily need in all these areas and specialities.
23:57
It's more about how we support those working in those areas to work effectively with the technology that they're going to be using, to enhance patient care
24:06
and optimise the patient pathways that we have in our current system. That's the most important thing to recognise.
24:12
We know that at the moment, there's a clear shortage of, you know, certain workforce groups
24:18
so actually there's a big role here that digital transformation can play to enhance and support us
24:23
in delivering care and making us more efficient in the way we deliver care as well.
24:28
So the AI roadmap tells us that there are quite obvious clinical workforce groups that are most likely to be affected by AI and data driven technologies in the short term.
24:38
And these are groups such as Clinical Radiologists and Radiographers.
24:45
It's also workforce groups such as General Practice, Cardiology and Adult Nursing.
24:50
These are some of the workforce groups that are quite clearly coming up in the workforce insights that we do have.
24:56
Now the important thing to recognise, when I say a workforce group, there's a massive team that works around that workforce group.
25:03
So if we talk about general practice: on a day-to-day basis, I work in a surgery with a practice nurse,
25:08
an advanced nurse practitioner and sometimes paramedics and physician associates, plus significant administrative and reception support.
25:16
And so if I'm going to be using an AI and data-driven technology in my practice, what impact does that have on my colleagues?
25:23
So if we say it affects the general practitioners, it's also going to affect that wider team. What does that mean in terms of changing workloads?
25:30
So if I'm using something to triage patients through automation, the type of work that eConsult are doing and what they've been awarded an AI Award for:
25:37
When they triage that patient it's going to go to a doctor, it might go to a pharmacist.
25:43
And who's the point of contact when that patient is triaged, is it the receptionist? This is what we need to delve deeper into and understand,
25:50
so the dashboard is really interesting because it gives us that initial insight. But now we need to look at specific case studies.
25:56
We need to look at specific technologies and what impact they're having at a patient level. So in the AI roadmap report
26:01
that you'll soon hopefully have eyes on in the near future, there will be two case studies within that. And one of those is Oxehealth,
26:07
which is the technology that is used on mental health wards and essentially allows patients to be monitored on the ward
26:13
by the nursing staff, by the clinical staff on the ward more effectively.
26:18
And in that case study, we're really able to dive deep and say, 'Okay, what did it actually mean
26:24
for the way we deliver that care on that ward and the way we monitor those patients on the ward? And what impact did it have on the patients themselves?' And you find that actually
26:31
the patients found it really helpful, because they didn't need to be woken up to have their observations taken every hour or so, every two to three hours.
26:40
This was done autonomously, it was sent back to the nursing staff, and it created a better relationship between the patient and the staff delivering the care.
26:48
Another technology that we're going to take a deeper dive into is something called Alturnun Health, which looks at diagnostic radiography.
26:54
And again, if we're improving diagnostic radiography, what impact will that have on the radiographers and radiologists within that workflow?
27:01
And also the medics who are requesting the scan in the first place. And ultimately, what is the patient impact?
27:06
So as we dive deeper into this understanding, we can actually use these technologies effectively, but also understand what skills
27:12
we need to support, in terms of our training, for the workforce that is going to be using this technology. And I think that's the most important thing to recognise: rather than worrying about vacancies
27:21
and jobs, it's how we can make sure that we have the right skills to use the technology.
27:30
Thanks very much, Hatim. Really helpful.
27:35
I want to open up to questions, and Chris Munch asks the first question, Chris do you want to...
27:44
ask it openly to the audience? It's just a word of caution, really.
Data and AI
27:51
We know that about 80% of health care data is unstructured, it's in a mess basically and is not easily available
28:00
to AI to create the algorithms. Is it not...
28:06
should we not be getting our data in order and into shape before we get too carried away with the potential for AI?
28:13
Thanks Chris, good challenge. Who wants to take that one first?
28:21
I'm happy to take it. I mean, I think it's very important. Chris, this is a really relevant point, and we have to establish that from an education
28:28
and training point of view it matters irrespective of how good the technology is yet, because we still need to understand
28:34
it. And actually, if we're going to get to where we want to get, if we want that data to be more structured and used in practice, you need both data scientists.
28:41
You need understanding amongst healthcare professionals in terms of putting that data into the system, coding that data appropriately.
28:47
And I'll give you a very real-life example of something that happened to me last week in practice. So I was going through lab results.
28:55
I was looking at the COVID results that came back and I was just filing the ones that are positive and negative.
29:02
And my trainer came into the room, had a conversation with me and said, 'Actually, you know, the ones that come back COVID positive, we need to make sure that we code them as COVID positive',
29:10
something that I hadn't done or been made aware of, but which actually has a significant impact in terms of the day-to-day data that NHS Digital, for example, collects,
29:17
and a significant impact on any potential technologies that are going to be using this data. You know, if we don't code things properly,
29:22
and we don't understand why that's important, then we're not going to get to where we want to get to. So that's really important to understand.
29:28
Our role, really, from our NHS point of view, is to ensure that we get the right
29:33
skills to actually get to the point where we want to get to, and we need to be doing that right now. So we're not necessarily running before we walk;
29:39
we're putting everything in place that we can to help us walk in the first place.
29:45
Can I also comment on this question? Please do. Yeah, so I agree with what Hatim is saying in terms
Public data sets
29:53
of us being able to do a lot of things simultaneously in terms of
29:59
making sure that the adoption of these technologies is effective and safe and ethical.
30:04
But just to highlight the importance of the question that was asked. I think that there's a really interesting analysis that was carried out
30:12
by the University of Cambridge relating to algorithms that were developed during the pandemic.
30:18
So they had reviewed a number of academic publications related to these algorithms, and they found that, out of the thousands of papers,
30:28
there were only around 29 algorithms that had sufficient documentation to be reproducible,
30:35
and that a number of them still had severely biased datasets, weren't externally validated, and didn't have proper sensitivity or robustness analyses.
30:47
And what the authors of this paper talk about is this idea of 'Frankenstein' public datasets.
30:53
So public datasets that are combined and are used by researchers and developers
31:01
who are unknowingly training and testing their models on datasets that are overlapping.
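The train/test overlap problem described here can be sketched in a few lines of Python. This is an illustrative, hypothetical example only; the record contents and the hashing scheme are assumptions, not anything from the Cambridge analysis:

```python
# Hypothetical sketch: detecting overlap between two "public" datasets
# before using one for training and the other for testing.
import hashlib


def fingerprint(record: bytes) -> str:
    """Hash a record (e.g. raw image bytes) so duplicates can be matched."""
    return hashlib.sha256(record).hexdigest()


def overlap(train_records, test_records):
    """Return fingerprints that appear in both the train and test sets."""
    train_hashes = {fingerprint(r) for r in train_records}
    return [fingerprint(r) for r in test_records if fingerprint(r) in train_hashes]


# Toy example: two combined datasets that unknowingly share one record.
train = [b"scan-001", b"scan-002", b"scan-003"]
test = [b"scan-003", b"scan-004"]

shared = overlap(train, test)
print(len(shared))  # 1 -> reported test performance would be inflated
```

Any shared fingerprints mean the model is partly being tested on its own training data, which is exactly why the 'Frankenstein' combinations are a problem.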
31:06
So I think that we recognised in the AI Lab that there's a lot of
31:13
demand for something like the national medical imaging database, which is essentially curating
31:20
a publicly available dataset that will hopefully be interoperable, but will also enable proper validation
31:27
of these types of algorithms so that they can be used on their intended populations in the UK.
31:34
So this is something that we are currently, actively working on.
31:41
And if I could just add to that, in terms of thinking about the education side of things again.
31:47
I would go a step further than what Chris is asking in this question: I think in some cases the data doesn't exist, so we need to start capturing it.
31:57
And actually, I think if you think about simulation, for example, simulation mannequins are really sophisticated now.
32:03
They can produce huge amounts of data, and that tends not to be stored or captured anywhere.
32:08
So a lot of us are working behind the scenes now to look at AI-ready data standards, the Experience API (xAPI, or Project Tin Can as you might recall it from the past) being an example of that;
32:20
it's all about capturing all of that really rich data in a format that's interoperable and allows us to do an awful lot with it.
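The Experience API mentioned here records learning events as simple JSON 'statements' with an actor, a verb and an object. As an illustrative sketch of how a simulation event might be captured (the names, email and activity identifiers below are made up for the example, not NHS identifiers):

```python
# Hypothetical xAPI ("Tin Can") statement recording a simulation event.
import json

statement = {
    "actor": {
        "name": "Trainee A",
        "mbox": "mailto:trainee.a@example.nhs.uk",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-GB": "completed"},
    },
    "object": {
        "id": "http://example.org/sim/cpr-scenario-3",
        "definition": {"name": {"en-GB": "CPR simulation scenario 3"}},
    },
    "result": {"success": True, "score": {"scaled": 0.85}},
}

# Statements are plain JSON, so any learning record store (LRS)
# that speaks xAPI can ingest them.
print(json.dumps(statement, indent=2))
```

Because every statement shares this structure, data from different mannequins and platforms can land in one interoperable record store.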
32:28
I think the other thing I just wanted to briefly touch upon is the power of taxonomy in all of this.
32:33
I know those of you that know me know that I bore everybody to death with my love of taxonomies. But you know what? They're really important. They're our superpower.
32:41
If you don't get your clinical coding right, if you don't get the richness of your data right, it's impossible to match anything up.
32:50
If you're thinking about trying to map, as I was talking about earlier, clinical datasets with education datasets, the only way you're going to be able to do that is to have
32:57
a bunch of synonyms behind the scenes that link different clinical codes together. So there's some huge fundamental work to do there with taxonomy.
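The 'synonyms behind the scenes' idea can be sketched as a small linking table. The structure and helper below are hypothetical illustrations (the SNOMED CT and ICD-10 codes shown are real published codes, but how any given system stores such links will differ):

```python
# Hypothetical concept table linking codes from different coding schemes,
# so a clinical dataset (SNOMED CT) and an education dataset (ICD-10)
# can be matched up via a shared concept.
concept_links = {
    "myocardial infarction": {
        "snomed_ct": "22298006",
        "icd10": "I21",
    },
    "type 2 diabetes": {
        "snomed_ct": "44054006",
        "icd10": "E11",
    },
}


def same_concept(code_a, scheme_a, code_b, scheme_b):
    """True if two codes from different schemes map to one concept."""
    for links in concept_links.values():
        if links.get(scheme_a) == code_a and links.get(scheme_b) == code_b:
            return True
    return False


print(same_concept("22298006", "snomed_ct", "I21", "icd10"))  # True
```

At scale this is what mapping tables like the published SNOMED-to-ICD maps do; the point is that without some such linking layer, datasets coded in different schemes simply cannot be joined.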
33:05
I'll get off my soapbox now! Thank you. Thank you, Richard. Anybody who knows Richard and goes to meetings with Richard
33:11
knows that taxonomy comes up at least once in the meeting, which is great, because it really stresses and reminds us of the importance of getting it right.
33:22
We've got a couple of questions coming in. So Dermot, first and then Rosie. Dermot, welcome.
33:28
Well, thank you very much. I kind of regret asking the question, but it's actually on taxonomies because
33:36
we have an AI which actually manages the mappings between clinical codes,
33:41
and it's just come to market this year and we're just seeking the right area of the NHS to bring it to.
33:47
We are trialling it with one of the large companies in the UK at the moment, but we really want to get it into a real clinical setting to find out how to make it ergonomic.
33:56
It's actually quite simple to operate, but we want to make it even simpler. So any advice on angles there would be really appreciated.
34:03
Who wants to take that one? I think it had best be me if it's taxonomies. Go on, Richard.
Taxonomies
34:11
I guess from our point of view, I think we've learnt a lot of lessons along the way when we've been trying to implement taxonomies.
34:18
I think we had an issue of reinventing the wheel, if I'm honest. When we first started we said, 'Yeah, we will build our own taxonomy', and that was a really stupid idea.
34:26
There are some fantastic open-source taxonomies out there that we can use, and I guess you end up just doing the bit that you specialise in.
34:34
So use SNOMED, use MeSH, use the World Health Organisation's ICD taxonomy.
34:39
It's all there for the taking. And what I would say is those are really specialist taxonomies and they tend not to cover
34:46
some of the nuances of the UK landscape. So that's where we've had to build our own taxonomy to fill those gaps.
34:54
And I think, again, I would focus on the area that you're a specialist in. There's no point in me building a taxonomy for, say, radiology; I don't know anything about radiology.
35:02
So why would I build a radiology taxonomy? I would build one about the area I know about, which is education and training,
35:08
and leave the rest to areas of the NHS that know those areas really well. And then you end up with specialist taxonomies being developed and maintained by specialists across the system
35:19
and shared as one big happy NHS family, as it were. Thank you, Richard.
35:25
Rosie, your question? Yeah, hi, everybody.
Simulation
35:30
Sorry, just to introduce myself: I'm Rosie, and I work for HEE South East as Programme Lead for Simulation,
35:36
TEL and Patient Safety, and I've now got a programme board with three associate deans.
35:43
And so we've got the governance structure. So we're in a really good set up. And I've just had funding for simulation fellows in each of our counties.
35:52
So actually I've got eight, hopefully starting in January and then another five next August, and they'll be doing a PG Cert in Simulation with this.
35:59
I just wanted to. It's not a question. I just want to say that this is what I've got, so I think it'd be great if we could link up.
36:04
Or, you know, when I get these fellows, how could I steer them in their projects to bring, you know, a really coordinated kind of output at the end of it?
36:13
So just to just say hello really and hopefully invite you at some point to come and talk to my sim fellow network
36:20
would be brilliant. Thanks, Rosie.
36:26
Can I just say a few words about this? I mean, there's two aspects to this. One is the use of digital in education:
36:31
technology-enhanced learning and simulation are obviously core to that. And actually, if you build on simulation
36:36
and you use immersive technologies on top of that, and maybe in the future start to use, you know,
36:41
AI and data principles on top of that as well, to create a really interesting educational offer, that can be fascinating.
36:48
And then of course, the other side of that is around the literacy and the capabilities that we want to get into training.
36:53
And some of that is getting hands-on experience of these technologies in practice. And some of that is just understanding what you need to know at a base level.
37:00
So from our point of view, this is exactly what we're trying to do. We're trying to build that network, you know, so we're trying to understand what's happening regionally,
37:07
really understand where good evidence and good practice is happening, and be able to scale it up and share it as well.
37:13
So I would love to connect, Rosie. Thanks, Hatim. Can I put another question that's come in separately from the audience: how do you plan
37:20
to help patients and staff adapt to technological change? Who wants to take that in the first instance?
Adapting to technological change
37:32
I can have a go. So I think the first thing is we have to recognise that everyone has different levels of digital literacy and digital capability,
37:42
and we have to recognise that because it's really important: it gives us a base to work from.
37:47
Another reason why it's really important to recognise that is because what we want to avoid here is digital exclusion. We don't want patients missing out on good healthcare because of not being able to access care due to technology.
37:57
What we want to do is we want to use technology to enhance patient care. So with that context, I think it's really important that we're going to need
38:04
to be able to educate, support and increase skills amongst patients. And I think that's something that the NHS needs to really start thinking about.
38:12
And it's really important because as health care professionals, we can do that to a certain extent because we always have patient contact.
38:18
But at the same time, we also have lots of other responsibilities, and the core part of patient contact is providing care, providing, you know, healthcare.
38:27
So it shouldn't necessarily be built into our role as a must, something that we have to do,
38:32
but we should be able to help a patient through a process. We still need some sort of digital champions
38:37
who are going to help patients become more digitally literate and use some of the tools that they might be using to access care. That's really important.
38:43
And in terms of staff, again, I think organisations need to take responsibility for this to some extent and recognise that this is something that we need to work on.
38:51
We need to make sure that our organisational members are digitally literate to a certain extent to use the tools that they need to use day in, day out.
38:58
I've had many an occasion, as many on the call will have had, where you've gone into a department where a technology has been brought in and you've not had any education and training around it.
39:07
And that is a concern for me, because you're then expected to use a new system without the support to use that system.
39:14
And we're always going to have new systems, because this transformation is happening rapidly. So we need to ensure that anything that is brought in, anything that is procured, anything that's implemented,
39:22
has a quite clear education and training strategy around it to support staff to use it.
39:28
Very important to me. Can I, yeah, can I come in on this, just thinking about patients specifically?
39:34
So I think that really early patient engagement is very important here, and we can kind of see that in relation to the rollout of GPDPR, for example.
39:47
So there were around a million people that opted out, which pushed back the rollout, and we're now reconsidering how we actually engage patients on this matter.
39:57
But I think that, you know, the lessons that can be learnt from something like that can also be applied within the context of how we roll out AI technology.
40:04
So for example, with our work with the Ada Lovelace Institute that I mentioned, we're trialling these impact assessments, and there is going to be a public and patient engagement aspect to that.
40:15
So for example, when these technology companies or research institutions, apply for access
40:23
to data through the national medical imaging platform, we will ask them to fill out an assessment that prompts them to consider the legal, social and ethical implications
40:33
of their technologies, but we'll bring in patients to also help them consider some of the potential impacts of these technologies on different patients and communities,
40:42
helping them to think through potential unintended consequences, or indeed the positive benefits, of these technologies.
40:50
So I think that hopefully, by involving patients at a really early stage, that will also positively influence how these technologies are developed,
41:02
and then also for patients to better understand how these technologies
41:07
can be applied for their benefit. Thank you.
41:12
I've got a final question coming in. Do panellists feel that we are placing enough focus on DART-Ed in the curricula
41:19
for the composition of teams required to push forward development?
Challenges
41:25
I mean, I'm going to say no. I want to say we are, but the system perhaps is not quite where we would like it to be.
41:33
And that's fine, because I think it's recognising that there are always lots of other challenges that we are dealing with, and we are also still in the middle of the pandemic.
41:40
So we have to recognise that. But there is, I think there is some meaningful change happening. So the Royal College
41:46
of Radiologists, for example, have adapted their curriculum to ensure that there are aspects of AI within it, and the same goes for radiographers, and for pharmacists in their postgraduate curriculum.
41:55
So these early examples are really useful, because you need early implementers to be able to create further ripples of effect and change elsewhere.
42:04
And I understand that at an undergraduate level, there are many universities that are starting to look at this in their curricula, as many people are interested in this.
42:11
The challenge we do have is educators that understand this space. That was highlighted in the Topol Review: we need a cadre of educators that understand digital technology and AI
42:19
to really get the changes that we want and need in practice. And I think that's starting to happen.
42:24
And so, really, if we can recognise those talented individuals that are keen to progress this agenda forward and keen to implement things locally in curricula,
42:33
in teaching and education, then we need to enable them and give them the space to be able to do that.
42:38
And the challenge we will always have is: is there actually space in the curriculum for this? Well, I think there is, because digital is integrated into everything we do.
42:45
It's not something that you will only do in a certain placement, for example, or a certain piece of work, or with a certain professional group.
42:52
All of us will be interacting with technology. All of us will be affected by emerging technologies like AI and data driven technologies.
42:59
So it's something that we need to recognise. It needs to be integrated into curricula; it doesn't need to be something that we add on.
43:07
It needs to be integrated into what we already do. But I'd add to that: if you look at the work that HEE's doing on its Digital Readiness programme, we're now broadening
43:17
the offer out for a whole range of learning materials, particularly that people can do in bite sized chunks
43:24
so that they can slowly develop the skills they require, and the information they require, across the whole digital spectrum,
43:33
and particularly now focussed on individual professions, not just the masses.
43:38
So although we're doing a self-assessment tool for digital skills for everybody in the NHS, which will be available after Christmas,
43:46
we're looking now to try and address really niche digital skills needs for the individual professions, working with the Chief
43:54
Professional Officers and their teams to work out what is needed. So we're still very much in the foothills, but look out for more and this will only be available
44:02
through the newly formed Digital Academy as we move that forward next year.
44:09
I'm conscious of time; we had 45 minutes and we're due to finish at
44:15
1:15pm. So can I thank our panel members for their time today and for sharing their thoughts?
44:23
And can I thank the audience for joining us for what I hope will be one of a number of webinars on this topic.
44:33
So please do give us feedback: did this tick the box as a starter for you? Please let us know
44:39
other topics you'd like us to cover, and we'd be very happy to do so as we roll these webinars out in the new year.
44:48
So thank you very much and have a great afternoon.
Media last reviewed: 3 May 2023
Next review due: 3 May 2024
Webinar 2 - Building appropriate confidence in AI to enable safe and ethical adoption
The second instalment of the DART-Ed webinar series took place on 1 February 2022 looking at how safe, effective, and ethical adoption of Artificial Intelligence (AI) technology in healthcare relies on confidence in using AI products. Participants are introduced to the findings of a report, produced by HEE and NHSX, which will be published in Spring 2022, outlining the educational needs of NHS healthcare professionals, through a detailed understanding of how we build confidence in the use of AI algorithms.
On the panel were:
- Dr Maxine Mackintosh (Chair) - Programme Lead - Diversity, Genomics England
- Dr Hatim Abdulhussein - Clinical Lead of DART-Ed Programme, Health Education England
- Dr Annabelle Painter - Clinical fellow - AI and Workforce, Health Education England and NHSX
- Dr Michael Nix - Clinical Fellow - AI and workforce, Health Education England and NHSX
- George Onisiforou - Research Manager, NHS AI Lab, NHSX
Watch a recording of this session below.
Watch webinar 2
0:06
Cool, so it feels like not many more people are coming in, so we'll get going. Panellists, put your
0:12
cameras on, and welcome, everyone. So my name is Maxine Mackintosh. I'm the programme lead for
0:17
an initiative at Genomics England called Diverse Data, which is all about, unsurprisingly, making the genetic data
0:22
we have more diverse, so this session is all about building appropriate confidence in AI to
0:28
enable safe and ethical adoption. It's a collaboration between the NHS AI Lab, the NHS Transformation
0:33
Directorate and Health Education England. You'll hear a lot more background on DART-Ed,
0:38
this programme and this research very shortly, but the primary aim is to identify
0:44
requirements for knowledge, skills and capabilities to develop appropriate confidence in health care professionals to implement
0:50
and use AI safely and effectively. So this is one of three reports. The first one is a more kind of conceptual piece,
0:55
and then the second one is about educational pathways. And then there's gonna be a third output, which is really around the broader
1:02
skills and capabilities required for adopting AI in healthcare settings. So we have a truly amazing panel.
1:07
We've got, at the top of the list, me. I'm not on the panel, I'm just here to be the social
1:13
lubricant, and I'm not amazing, so I've already introduced myself. Next up we've got Hatim Abdulhussein, who is the Clinical Lead of the DART-
1:20
Ed programme at Health Education England. We also have Annabelle Painter, who is Clinical Fellow for AI and Workforce
1:26
at Health Education England and NHSX, and we've got Mike Nix, who is Clinical Fellow for AI and Workforce at
1:31
Health Education England and NHSX, and we've got George Onisiforou, who is a Research Manager at the NHS AI Lab
1:38
at NHSX. So a good mix of Health Education England, the AI Lab and NHSX,
1:44
and sufficiently complex names; I've probably mispronounced some of them. So this is a 45-minute session,
1:49
as I said at the beginning, though a few people have joined since: this will be recorded,
1:55
so anything you put in the chat or you say out loud will be there in perpetuity.
2:01
Much to your dismay, maybe. Just a webinar reminder: cameras off, please, because of the scrambling that
2:07
happens on the screen when we record it. And obviously if you want to get involved on social media, the handle is HEE underscore DigiReady.
2:15
So you've heard enough from me, I will hand over to the panel and 1st off I'm going to hand over to Hatim.
2:22
Thanks, Maxine. So I'm just going to give a brief introduction into where we are
2:29
with some of our DART-Ed work. In the previous webinar I gave an introduction into what the
2:34
DART-Ed programme involves and how it was developed from its conception around delivering on some
2:39
of the recommendations made in the Topol Review. Today I'm just going to give you a brief update on where we are before I pass on to my colleagues
2:45
who are going to do a lot of interesting talking around the core topic of today's webinar.
2:50
So I just want to tell you now that we published our AI Roadmap last week, and it
2:56
gives us a clear oversight of the types and taxonomies of AI technologies
3:01
in our current healthcare system. And you can see from this diagram here,
3:06
we found that 34% of current AI technologies are around diagnostics, 29% around automation and service efficiency, and there is a
3:13
group of emerging technologies around P4 medicine, remote monitoring and therapeutics.
3:20
The report will also give a bit of a picture of how these technologies will impact our
3:25
workforce and our different workforce groups. Very interestingly, you won't be surprised to some extent that radiologists and radiographers are
3:31
top of the list of workforce groups that are likely to be affected by these technologies, but on top of that, general practice,
3:37
cardiologists and nurses are other workforce groups that we feel will be impacted by AI.
3:42
And it has a couple of case studies in there which really dug deeper into that to help understand how
3:48
a couple of technologies that have been implemented in practice have affected the team that have been involved in using these technologies.
3:54
So it's a nice way to bring some of this data to life, and it
4:00
helps us to think about how we're going to tackle the challenge in terms of developing the skills and
4:05
capabilities in our workforce to work effectively with AI, and in effect that's what the discussion
4:10
today is going to be about. Just a reminder that all the webinars are free to attend. They're all recorded and available
4:16
on the HEE YouTube channel as well as on the DART-Ed website, and we have a whole host of future
4:22
topics that we will be covering through the webinar series. We're looking to have a webinar
4:28
specifically around nursing and midwifery. We're going to have a spotlight on AI healthcare applications, and hopefully a few of our Topol fellows will
4:35
come and share some of the projects that they've been working on in that webinar. We are also going to have a spotlight on
4:41
dentistry itself and how dentistry is getting digitally and AI ready. And we want to talk about the work
4:46
that we're doing around robotic-assisted surgery and robotics literacy with the Royal College of Surgeons. So keep a lookout on our social
4:53
media channels keep a lookout on the DART-Ed website for further information about these future webinars and if there's anything you want to feedback to us,
4:59
feel free to get in touch with me directly. Thanks. Amazing, thanks.
5:05
So now we're going to have about a 15 minute presentation from Mike and Annabelle.
5:12
And so whilst they're presenting, do think about some questions you have. I know that a number of you pre-
5:17
submitted them; the thing with pre-submission is that it's very easy for me as the chair to take them as my own.
5:22
So by all means re-post them or re-ask them and put them in the chat, and then, depending
5:28
on how the conversation evolves, we'll ask you to unmute, reveal yourself and
5:33
ask your question in person. So whilst Mike and Annabelle are presenting, please do think about
5:40
questions that you have for the speakers. I'm noticing some technical problems,
5:46
Mike? Yeah, we seem to have a permissions problem on Teams. I've just sent my slides to Hatim
5:52
by email so hopefully Hatim will be able to share them because I can't. So give us a few seconds
5:58
and hopefully we'll be up to speed with that. I'm really looking forward to saying 'next slide, please',
6:05
and I'm bringing them up now. I'll give you the full Chris Whitty, Hatim! Thank you very much.
6:11
I can probably just start off and give a little bit of background until we see the slides,
6:19
if that's OK. So yeah, hi everyone, my name is George Onisiforou and I'm a Research
6:25
Manager at the NHS AI Lab, and we conducted this research with Health Education England.
6:32
I worked with Annabelle and Mike on the coming reports. As a little bit of background,
6:38
this research involved a literature review. And we also interviewed over 50 individuals
6:46
in healthcare settings, regulatory bodies, people in the private sector,
6:52
developers of AI and also academics in this field. And we also tried to speak with
6:59
professionals with different levels of experience with AI technologies. And what we aimed to do with this
7:06
research is to get an understanding of the requirements for knowledge, skills and capabilities that will
7:14
be needed to develop confidence in healthcare workers when using
7:19
AI technologies. The word 'confidence' is important: from early on in our discussions, and also
7:25
in what we saw in the literature review, the terms 'trust' and 'confidence' can be
7:31
used interchangeably, but we did feel that they need to be distinguished, and we felt that
7:39
'confidence' is the most appropriate term for the context of using AI technologies
7:44
in healthcare settings, as it allows for a little bit more of a dynamic exploration of what is involved.
7:50
Particularly in the different circumstances during clinical decision making that Mike will explain later.
7:58
So with this in mind, we went about trying to understand what influences levels of confidence and
8:07
developed this conceptual framework. If you can go to the next slide, please. And I should say that we will be focusing
8:16
on this framework in our initial report, which is a little bit more conceptual
8:22
in nature, and it will set the scene for a second report that will
8:27
outline suggested educational pathways to develop this
8:33
confidence. And what we're presenting today is just like a sneak preview.
8:38
So please wait for the final information and visuals in
8:43
the report that will come out. So what we're saying with this framework
8:48
is that professionals can develop confidence in the technology through two initial layers, the baseline
8:55
layer and the implementation layer. There are several factors under these that we will go into in more detail.
9:02
And then these two layers of confidence in an AI technology can enable clinicians to
9:08
assess what is the appropriate level of confidence that they should have
9:14
in AI-derived information during their clinical decision-making.
9:21
And that's the third layer. So we'll talk now a little bit more in detail about each of these layers.
9:27
First Annabelle will take us through the baseline layer. Thanks George.
9:33
So yes. So starting with this baseline layer so if you could go on to the next slide that would be great.
9:38
Thank you. So the baseline layer is really about the foundations of AI confidence.
9:44
So it's the key components that underpin confidence in the implementation
9:49
and clinical use layer. So what we're really saying here is that each of these components needs to have a really strong and robust
9:55
foundation so that we can build from that with any kind of AI implementation or use in a clinical setting.
10:01
So there are five components within the baseline layer, which are product design, regulation and standards, evidence and
10:09
validation, guidance and liability. So I'm just going to go through each of those components in a bit more detail.
10:15
So starting off with product design with this component, what we're really talking about is
10:21
how do we create AI products in a way that inherently and fundamentally
10:27
improves end user confidence? And there are several facets to this, so some of them are fundamental things,
10:34
for example, what is the task AI is doing and what's the level of clinical
10:39
risk associated with that task? And also, is the AI making a decision autonomously or is it part of a joint decision
10:47
making process with a human in the loop? There's also factors here about usability of the AI products,
10:53
so how intuitive is that AI product to use and how seamlessly does it integrate with existing healthcare systems?
11:01
And then there are also some technical considerations. So for example, the type of algorithm that's used
11:07
can influence confidence, and also how much we can tell about how the AI has made a decision.
11:12
So this moves into the territory of things like explainability, and Mike is going to talk more
11:18
about that a little bit later. But another important thing to think about is transparency. So this is more about getting
11:24
information from those who develop the AI about the type of algorithm that's being used, how it's been trained,
11:29
the kind of data sets that have been used. And any potential limitations or weaknesses in the model.
11:35
And there have been several transparency standards that have been released that could be helpful with this.
11:40
So moving now onto regulation, so having strong, robust
11:45
regulation really is key to building AI confidence, and what we've
11:51
learned from our research is that health care professionals generally equate achieving regulatory approval
11:57
for medical devices with proof that that AI product is safe to use.
12:04
But in reality, the current regulatory standards often don't actually meet that,
12:09
and in addition, there isn't any specific AI regulation at the moment, and we feel like
12:15
these are the two things that need to be addressed during regulatory reform. And that's exactly what the
12:22
MHRA are currently looking at. So they have a Software and AI as a Medical Device Change Programme
12:28
that's recently been announced, and they're looking at several ways of addressing these things.
12:33
And within regulation there's also professional regulation, which is an important thing to think about.
12:38
So as health care professionals, we generally look to our regulators for advice on how we should behave
12:45
and that extends to how we should interact with things like artificial intelligence. And that not only applies to
12:50
the clinicians who are using AI to make clinical decisions,
12:55
but also to those who are actually creating these AI products and also involved in their
13:01
validation and testing. There may also be an argument for some kind of professional regulation of the non-clinical
13:07
healthcare professionals who are involved in making these products. So by that I mean software
13:13
engineers or data scientists who are working on healthcare AI products. So next,
13:19
moving on to evidence and validation. So it's essential that we know that AI products that are being
13:24
released into the healthcare system work as they say they do.
13:30
And for that it's important that we have good guidance on what kind of evidence we may
13:36
expect. At the moment, in terms of the regulatory requirements, there's no explicit requirement for
13:42
any external validation of AI products by a third party, or any prospective
13:48
clinical trials of AI products, and our research suggests that
13:54
there's definitely an argument for having that as a requirement for any AI product that
13:59
carries significant clinical risk. This is something that's being looked at at the moment by NICE as
14:05
part of their digital health evidence frameworks that are being reviewed at the moment.
14:11
And then moving on to guidance. So guidance is important for steering how AI is procured and how it's used,
14:19
and there are several different types of guidance. So, you know, with guidance there might be
14:27
clinical guidelines. Sorry, if anyone has got their mic on, do you mind?
14:37
OK, so just honing in on clinical guidelines. So what we've heard from our
14:45
research is that clinicians expect to be given clinical guidelines about how to use AI technology in the same way
14:51
they currently are for say, medication. But the slight issue at the moment is that the processes involved in
14:58
getting specific product-level guidance for AI technology are not really scalable and not
15:04
able to meet the demand of the volume of products that are coming onto the market. So again, this is something that's
15:10
being looked at by NICE at the moment, and ultimately it may be that a more agile guideline process is required
15:16
and potentially a move away from product-specific guidance towards maybe a class-based
15:22
guidance, in the same way we sometimes see that for medication as well. And finally,
15:27
moving on to liability. At the moment it's unclear from a liability point of view
15:33
who would be held to account in a situation in which an AI was to make a clinical decision that led to patient harm.
15:40
So for example, it could be a clinician who's using that AI product to make a decision.
15:45
It could be the person or the company that made the product. It could be those involved
15:51
in commissioning it, and it could be those involved in say testing or regulating or validating it.
15:57
This becomes even more complex when we think about autonomous AI, where a human is actually removed
16:02
from the clinical decision making process entirely. So some kind of guidance and steering on this will be important
16:08
for building confidence in AI. So that concludes our baseline layer, and now I'll hand back over
16:14
to George to talk about implementation. Thanks, Annabelle. Can we please go to the next slide?
16:21
Thank you. So the implementation layer basically reflects one of the most consistent pieces of feedback that we've heard
16:28
during the interviews for this research, and that being that the safe, effective and ethical implementation
16:35
of AI in local settings contributes significantly to building confidence in
16:42
these technologies within the workforce. And the comments focused on four main areas.
16:48
As you can see here, the first one is around strategy and culture, and what we have heard is that
16:56
establishing AI as a strategic and organizational asset can enhance confidence,
17:02
including through developing relevant business cases and maintaining a
17:08
culture that nurtures innovation, collaboration, and engagement with the public.
17:14
So these conditions allow for a confidence that the right decisions are
17:20
being made, and also that each setting can sustain these types of innovations.
17:27
The second factor is technical implementation, and that refers to arrangements
17:33
around information technology, data governance and issues
17:39
on interoperability. We heard that a lot of the current challenges for deploying
17:45
AI relate to these arrangements, and particularly that agreement on
17:51
information governance settings and data management strategies to handle the
17:57
data associated with AI technologies are highly important at this stage. We got the impression
18:04
that unless these are clarified, many clinicians would hesitate to use AI.
18:12
Annabelle talked about evidence and validation and local validation
18:17
is an extension of that, which is the third factor here. Local validation may be needed
18:25
to ensure that the performance of an AI technology can be reproduced in
18:32
the local context so that we don't assume that an AI system that may
18:39
have good published performance data will generalize well to local situations.
18:47
And the last factor in this layer is system impact essentially being confident
18:53
that AI is properly integrated in clinical workflows and pathways.
18:59
And what we heard here is about the importance of seamless integration with
19:06
existing systems, of clear safety reporting pathways, and of ethical
19:13
practices and that all of these build confidence and address
19:18
inhibitions to adopt AI. Now, the way that AI is integrated into
19:24
clinical decision making is particularly important as it may impact these kinds of decisions,
19:31
and this is something that we explore further in the third layer
19:36
that Mike will explain. Great, thanks very much George. Next slide please.
19:41
Hatim thank you. The third layer is the point at which
19:48
the clinical decision making process and the AI technology interact,
19:53
and this is the first point in our pyramid where we are really moving away from trying
20:00
to increase confidence to assessing confidence. And the idea here is that
20:06
an individual AI prediction which is used for an individual clinical decision for an individual patient
20:13
may or may not be trustworthy, so it's not necessarily appropriate
20:18
always to increase our confidence in AI predictions at the level of
20:24
the individual clinical decision. And really, what we're trying to do here is to retain a degree of critical appraisal,
20:31
which, as clinicians we would apply to any information involved in a clinical decision making process
20:36
and to avoid having either under-confidence leading to inconsistency,
20:42
a lack of benefit being realized by rejecting AI information,
20:48
or overconfidence, obviously with the risks of clinical error and potentially patient harm.
20:56
So this is a nuanced problem, and if we go to the next slide, please.
21:03
Thank you. There are five factors in this clinical use layer which really drives this interaction
21:09
between the AI and the human decision making, the first of which is underpinned by
21:16
our clinicians' attitudes to AI, and from our research what we heard is that this varies a lot.
21:23
There are some clinicians who are digital leaders who are very excited about AI, are very knowledgeable,
21:30
very confident in it, and have a preference to drive it forward and include these types
21:37
of technologies in as many clinical contexts as possible. There are also people who are more sceptical,
21:43
either through their own experience or potentially, through their lack of experience actually.
21:49
So there's a great variation there which we need to be aware of and take
21:54
account of as underpinning this confidence assessment. The other thing that really underpins
22:00
this is the clinical context. Obviously there's a huge variety of situations in healthcare from
22:05
primary care services and GPs, all the way through to emergency medicine
22:11
in tertiary referral centers. And that has a great impact because
22:17
not only of the potential risks and benefits associated with employing AI in those different contexts,
22:24
but also the timescales for decision making. Some decisions are made over many
22:30
weeks with involvement of patients and families and are very discursive and other decisions are made in the
22:37
instant in emergency situations. And obviously that will impact the way that we assess our confidence
22:43
in AI and what we do with that confidence assessment for clinical decision making.
22:49
As Annabelle pointed out earlier, there are technical features of the AI system itself which will
22:55
impact on our confidence and our confidence assessment. AI can
23:01
make various types of predictions, it can be diagnostic or prognostic.
23:06
It can be used to recommend a treatment strategy for some sort of stratification,
23:12
or it can in fact be, if you like, a preprocessing of some images or
23:17
some other clinical data which adds a layer of information into a clinical decision making pathway
23:23
that already exists. So the type of information, the type of prediction which the AI makes,
23:29
the way in which it makes it, and the way in which that information is presented, whether it's presented as categorical
23:36
or probabilistic, whether uncertainty is included. All of these things are features
23:42
which will affect the way that we value that information as clinicians and the way that we assess our
23:48
confidence in that information when making clinical decisions. There's another factor here,
23:54
which is separated out, although it is a technical feature, which is explainability. And explainability,
24:00
I think, is an area that's worthy of some examination at the moment, because it promises a lot,
24:07
and there's been quite a lot of interest in the potential for using explainable AI to get decision reasoning
24:14
out of neural networks particularly, and to be able to see the reasons
24:21
for individual clinical decisions. What we found through the literature survey and also talking to experts
24:28
was that it's not yet ready for real time. So we believe that AI explainability
24:35
has potential, and we believe that at a model validation level it has value.
24:40
But at this stage it does not appear to have value for individual clinical decisions.
24:46
So I think we need to be quite cautious using that as a way of assessing confidence in AI for
24:52
clinical reasoning and decision making. And really, what underpins this confidence assessment
24:57
is this fifth factor of cognitive biases. All of us as humans are subject to
25:04
cognitive biases and we may or may not be familiar with what those are, but it's important to acknowledge
25:10
them and to understand the way in which AI-presented information may change those cognitive biases which,
25:17
whether or not we are consciously aware of them, impact on our clinical decision making.
25:22
So just to give you some examples there, there's things like confirmation bias, automation bias,
25:30
all these kinds of things are unavoidable. I think we have a tendency to assume that we are less susceptible
25:35
than the general population, but I think the research would suggest that that's not true. And therefore it's important to understand
25:42
how that factors into the AI assisted clinical decision making process.
25:48
OK, so next slide please Hatim, and then over to Maxine for a
25:54
couple of questions from the chair. Amazing, thank you so much Mike, George and Annabelle for a whistle
26:01
stop tour. So I'm gonna fill the gap with a couple of questions
26:06
so please whilst the time passes do come up with your own and post them
26:11
in the chat and depending on how the conversation pans out I also plan to invite you to ask a question yourself.
26:17
So please do ask any questions and there's no question too stupid or too intelligent! Probably there is one too
26:23
intelligent but it's probably not, so please ask the full range of questions.
26:28
So this one is definitely planned, but what is your top priority for improving
26:34
clinical confidence in AI over the next one to two years that drives towards the
26:40
center of that double-ended arrow you presented, Mike? What are you working on for the next
26:46
12 to 24 months? Thanks, Maxine. I'm not sure it's what we're working on so much as what the whole
26:54
community needs to be working on and I think really the challenges over the next period are at the baseline
27:02
and implementation layers as we presented in the pyramid, so there's some
27:08
work that is going on nationally at the moment, around regulatory clarity,
27:14
and that will definitely help with the baseline confidence and evidence standards.
27:20
As Annabelle pointed out, these are currently being developed and are changing all the time in this space.
27:27
So we will see increased guidance. We will see increased definition of what levels of evidence are
27:34
appropriate for AI and healthcare, and that's definitely going to be a positive thing.
27:40
I think the other challenge is, moving to a place where,
27:45
again as Annabelle pointed out, we have some class guidance because a
27:50
lot of the products that are currently being produced or becoming available occupy relatively small niches, and
27:57
I think expecting product specific guidance from bodies like NICE for every individual AI product,
28:03
which is going to enter the health care arena is not a sustainable way to work in the longer term.
28:10
So I think having some more general guidance and standards around how to evaluate and implement AI once
28:17
it's achieved regulatory approval, I think will be very helpful. And then the second part of this
28:24
answer really is at the local level. So considering a healthcare setting whatever that might be, I think,
28:31
really the challenges are around people. We need to have the right people to
28:36
drive adoption of these technologies forward and to do it with appropriate levels of knowledge and critique,
28:43
but also with sufficient motivation and positivity that we actually do get these
28:50
things translated into clinical practice. And I think around that there's a need to define some roles.
28:57
There may be roles which aren't common currently in healthcare organizations,
29:02
particularly around the task of implementing AI and I think we need
29:08
to take a multidisciplinary approach. So I think we need clinicians,
29:14
I think we need users. I think we need drivers of policy,
29:19
policy people and people who are kind of holding the purse strings and we also need technical people
29:25
who understand the evidence and the challenges associated with doing robust and ethical implementation.
29:33
So hopefully that ties in as an answer to what we were discussing in the framework.
29:41
I will hand back over to Maxine. A busy two years? Yes, a busy two years. Also,
29:46
the work is never done. And you're waiting on all these dependencies, and everyone is sort of working them out as we go,
29:51
so it is a bit of a juggling act I think for everyone in the community. So two things one is.
29:57
Amanda asked the question, be more Amanda, keep them coming. The second one is that
30:05
you talked about the need to define roles and support with multidisciplinary teams and
30:10
diverse groups and decision making. And I know that this is kind of a bit of a sneak preview because you
30:16
know future reports are going to be on education, but you definitely need a pipeline
30:21
to start filling those roles, and that's going to take quite a period of time. So whilst not creeping
30:27
into future webinar topics, can you give a little bit of a hint about how you're thinking about the education piece on this one?
30:33
I think maybe for Annabelle? Yeah, sure. So as George mentioned, we are
30:39
releasing two reports, so the first one is coming out in a couple of weeks, which will be focusing on what we
30:44
just talked about and the second one will be following in the next few months. And that's really focusing down on what
30:50
this means for educating the NHS workforce. So we have had a little think about this, and we can give a bit of
30:55
information now. So Hatim, would you mind just going on to the next slide? Yeah, great, so we
31:01
are thinking about this across the whole NHS workforce, so this is not just about clinician end users
31:07
it's about everyone from like the most senior NHS management through to
31:14
the people who are commissioning products and people who are embedding them within the NHS. So the way we think about this is by
31:21
splitting that workforce into five archetypes, and these archetypes are based on the role that individuals will
31:26
have in relation to AI technology, and these archetypes are not exclusive so you can as an individual
31:32
sit in multiple buckets at the same time, but the reason that these archetypes we feel are helpful is
31:38
because these different individuals have different educational and training needs, and so it can
31:44
help us focus on what we're going to need to do to prepare these different aspects of the workforce.
31:49
So just to explain them in a bit more detail, starting with the shapers here. Shapers are really the people who
31:55
are setting the agenda for AI, so these are the people who are coming up with kind of regulation, policy and guidelines.
32:00
All of those things to do with AI, and examples of
32:05
them might be NHS leaders, the regulators of AI and other people who work within arms length
32:13
bodies and the drivers really are the people who are leading
32:18
digital transformation, so they are involved in commissioning AI. They're also involved in building
32:24
up the teams and infrastructure within NHS organizations that are going to be needed to implement AI,
32:30
so, for example, they might be an ICS leadership board, or it might be CIOs within
32:37
ICSs. The next bucket are creators, so these are people who are actually
32:42
making AI. So when it comes to the NHS workforce, it may be that these
32:48
individuals are co-creating this AI with, for example, a commercial partner or something like that,
32:54
and the kind of people who would be doing this would be like data scientists. They might be software engineers,
33:00
or they might be specialist clinicians or researchers and academics who are working on AI. Then we have our embedders, and
33:07
the embedders' role is essentially to integrate and implement AI
33:12
technologies within the NHS. So these individuals they might
33:18
also overlap with creators in being kind of data scientists, and they might also be clinical
33:29
scientists, specialist clinicians, or IT and IG teams.
33:29
Yeah, so that's who we really mean by the embedders. And then finally the users, and the users
33:35
are anyone within health care who is obviously using an AI product, so this might be clinicians. It might be allied health professionals.
33:41
It might also be non-clinical staff, and what's really important with all of these archetypes is that we
33:47
need to make sure we capture everyone so we don't just want to capture the workforce in training but also we need to make sure that we're
33:54
targeting those who are fully trained and who are already working. And we're going to need to be giving slightly
34:00
different expert advice to each one of these different archetypes. Now the reason there's a box around creators
34:05
and embedders is just because we feel like at the moment, this is probably the area where we have
34:11
the least skill within the NHS at the moment, so it's one of the areas we're going to have to think carefully about.
34:17
How do we bring people with these kinds of data science and clinical informatics skills into the NHS?
34:23
And how do we train up existing people within the NHS to become specialists in that area?
34:29
So moving on a little bit from this kind of baseline information and education
34:35
these people are going to need, we are also going to have to think about product-specific training. So this is about giving people
34:41
knowledge and information about a specific product that they're using, and this really affects three main archetypes,
34:47
so we're talking about the drivers, the embedders, and the users. So the drivers need to know specific
34:53
information about products so that they can make the right commissioning decisions about those products. Then
34:58
embedders need to know the technical information about products so they can make sure that they're integrated in
35:04
a way that's both safe and effective, and finally users. So users are going to need training on specific products
35:10
that they're using to make sure that they understand clearly what the indications are for that product,
35:15
what the limitations are of that product, and really importantly, how to communicate about that product
35:22
with patients, and how to facilitate joint decision making amongst clinicians and patients with AI
35:29
in the mix. So that's everything for me. Maxine, back to you. Amazing, thanks
35:35
Thanks, Annabelle. So I am conscious of time and there's a couple of great questions,
35:42
especially for, I guess, this kind of conceptual piece about, you know, what is confidence?
35:47
What is appropriate? So I think I might look to smush
35:52
maybe some of Amanda's question with some of James' so as you're
35:57
like thinking about what does it mean to be confident and what does it mean to be appropriate? How does ethics cut across as well?
36:04
Kind of ethics has its own, you know, like, let's redefine the question type of discussion that happens,
36:10
and so I'd love to, yeah, pick up Amanda's question about, you know, where does ethics intersect
36:16
with appropriate confidence? And then I'm going to bundle that with the bottom of James' question, which is,
36:23
as this conceptual work, which I think is incredibly important to underpin some of the harder,
36:29
or the kind of base layer, how does that intersect with already existing standards?
36:35
And I know that was kind of touched on a little bit. But, linking the conceptual with the hard would be a good thing to touch
36:43
on for the last couple of minutes. Perfect, shall I take that one?
36:48
Or those ones I think is probably a better description. So yes, let's start by thinking about ethics,
36:56
and I think our starting point for this work was, the idea that if we
37:01
want to do AI ethically, then we have to do it robustly and we have to do it safely and we have
37:07
to do it with appropriate confidence. Anything else is not ethical, essentially because what we're
37:13
doing is we're ensuring that we achieve patient benefit, we're minimizing risk,
37:19
and we're maximizing impact. And that includes ethical considerations like maximizing impact for different groups,
37:26
different demographic groups, and ensuring that we know what the performance of our AI is
37:32
for different demographic groups. So it cuts through evidence and
37:38
standards and regulation. I'll come back to that in a second, and in response to James' question.
37:43
It also cuts through what George was talking about in terms of local validation.
37:49
Local validation is absolutely key to ensuring that we have generalizability, that we understand the limitations
37:54
of the algorithms that we use, and that allows us to use them ethically. And then when you get to the clinical
38:02
decision making layer that really is where individual critical
38:07
thinking comes into understanding what the ethical implications might be and when it might be appropriate to
38:14
disregard potentially an AI prediction and whether that disadvantages
38:21
people is something that needs to be considered at the workflow integration and the implementation
38:27
stage, so that we can try and be as even-handed as possible with technologies that are not
38:32
necessarily inherently even-handed in their performance, and that really is a big ethical
38:38
challenge with AI. I think it's very important to always have alternate pathways that can be
38:43
used in the case where we don't have confidence in the AI, and we need to make sure that those don't result
38:49
in detriment to certain patient groups. And so I think that would be my response to the ethical question.
38:56
I think in terms of regulation and the MDR, hi James, I'm a clinical scientist,
39:02
which is probably why clinical scientists have got some representation here to some extent.
39:09
So I'm familiar with the MDR and the ISO standards.
39:15
There are going to be some new standards, we expect, in the relatively near future looking at
39:21
AI specifically as a medical device. I'm sure you're aware of some of the discussion around that,
39:26
and I think our hope is that that will not replace what's in the MDR and ISO 13485, but rather it
39:34
will clarify it and extend it. I think the other thing that's really
39:39
important to think about in terms of CE marking, as it used to be called, and now UKCA marking, and the MDR, is
39:47
that medical device approval does not tell you anything about performance,
39:53
so it's not necessarily sufficient. It's necessary, but it's not necessarily sufficient,
40:00
and I think where NICE and bodies like that come in in terms of providing these evidence standards
40:05
so what should our expectations be in an AI product so that we can have clinical confidence?
40:11
Yes, of course it has to be regulatory approved. But that to me is a first step,
40:17
not a final step. Yeah, I think that's a great answer, and for me,
40:22
not that my opinion matters in this, but the appropriate confidence for me is a really nice way of knitting together,
40:28
you know, things like ethics, which can sometimes feel a bit intangible and impractical, and, you know, the MDR, which has some shortfalls.
40:34
So for me, this felt like a really nice way to turn some kind of floating themes,
40:39
or, you know, siloed tools into something a bit more holistic and practical. So I know we've run out of time. And it's not a competition
40:45
of who's the most popular, but if it was, Tracy would be winning! A number of your questions
40:50
in advance of this came about shared decision making. So in 45 seconds can one of you take
40:58
the question around a) shared decision making, or b) how do we make sure that patients truly
41:03
sit alongside the creators that had a little bit of a dotted line around them? So who's gonna take that swiftly?
41:10
I can try and do that very quickly and so just to say in terms of the archetypes, patients are not in there.
41:15
They're not an archetype. That is actually completely intentional, because this report is about the workforce and how we prepare the NHS workforce now.
41:22
It's also intentional because we think that conversation about patient involvement is really, really important and deserves
41:28
its own attention. So it is intentionally excluded from here. However, what is really important about
41:33
how we include patients is, first of all in that bit about how we design products in a
41:39
way that enhances confidence. We need to make sure we have users involved and patients involved at that design stage from very early on,
41:46
'cause they're the ones ultimately who these products are gonna be used on. So it's really important that we get their
41:51
input all the way through. The second thing is to say, when we're talking about preparing the users,
41:57
so the clinician users, a huge part of their preparation is about
42:02
how they can communicate about these products with patients, so making sure they have conversations that
42:07
bring patients in early, that make it clear about the limitations and the risks
42:12
and the benefits of using AI. What it means for their data and their patient information, and how they can make decisions together.
42:19
And as a group, so you know, clinician, patient, but also potentially AI moving in there as, you know, a third
42:27
agent in that mix. Amazing, thanks Annabelle, and thanks for
42:32
doing this so succinctly. So I'm sorry we haven't had time to hit some of the other questions, but
42:37
thank you so much for posing them, and there's some good ones that I'm sure the individuals on the panel will be
42:43
happy to follow up and answer, and obviously keep the conversation going. But here endeth the first webinar
42:48
of the series. The next one is happening at the end of March. As Hatim says it's on nursing
42:54
and the recordings of this will also be made available in case your child came in halfway
42:59
through demanding lunch or something catastrophic like that, but otherwise thank you so much for
43:05
tuning in, and do follow Health Education England on Twitter @HEE_digiready, and I'm sure
43:11
everyone would love to keep the conversation going, but thank you very much for your attention and your questions and for coming and
43:17
hanging out this lunchtime with us. Thanks, bye bye, thank you.
Media last reviewed: 3 May 2023
Next review due: 3 May 2024
Webinar 3 - Learning from the application of AI in Health
The third instalment of the DART-Ed webinar series took place on 11 August 2022 and looked at how artificial intelligence has the potential to transform healthcare, and in many cases is starting to do so. However, implementation of AI requires specialist skills from those clinicians working on these projects. The webinar focussed on experiences from clinicians who are actively working on AI-related projects in the NHS. The webinar was guest chaired by Dr. Haris Shuaib, the AI Transformation Lead at the London AI Centre for Value Based Healthcare, and Fellowship Director for the recently launched London AI Fellowship Programme.
On the panel were:
- Haris Shuaib (Chair) - Consultant Physicist, Head of Clinical Scientific Computing at Guy's and St Thomas' NHS Foundation Trust
- Dr. Hatim Abdulhussein - National Clinical Lead - AI and Digital Medical Workforce at Health Education England
- Dr. Amar Patel - GP, Digital Lead and IIF Clinical Lead at Southport and Formby PCN and Topol Fellow
- Dr. Kavitha Vimalesvaran - Cardiology Registrar and Clinical AI Fellow
- Christina Sothinathan - Digital Health NHS Navigator, DigitalHealth.London and Advanced Practice Physiotherapist, St George’s Hospital
Watch a recording of this session below.
Watch webinar 3
0:06
Fantastic. Okay. There you go. That's the last face I was waiting for, Hatim's. Okay, fantastic.
0:11
Let's get started. I've given people two minutes' grace. That's good enough. So, welcome to the third of the webinar series hosted by Health Education England.
0:19
This is learning from the application of AI in health. So I hope you found yourself in the right place.
0:25
As it's just been put into the chat box, we are gonna be recording this webinar and it's gonna be published on the website in the next few weeks.
0:32
So you can let people who couldn't make it know, that they can still catch up later.
0:37
As you'll find, your microphones and cameras have been disabled. And hopefully, once we get to the Q&A session, we'll get people to put their
0:46
hands up and then they'll be able to come off mute and ask the panelists or myself
0:51
the questions that they want to ask. And then we can take the session from there in terms of the format of the session today.
0:57
We'll have an overview from Hatim of the DART-Ed programme, which hopefully some of you at least will be aware of, and then each of the other three
1:06
panelists, Amar, Christina and Kavitha, will introduce themselves and spend a couple of minutes talking about their work.
1:13
And then the remaining half of the session hopefully will be a sort of panel discussion and Q&A on the things that we've raised.
1:22
Fantastic. Hopefully that's okay with everyone. I'll start by introducing myself. So I'm Haris Shuaib.
1:27
I'm a Consultant Clinical Scientist at Guy's and St Thomas' and I'm head of clinical scientific computing there, which is a team which is charged with
1:35
developing people, platforms and policy for digital health. So we essentially help translate cutting edge digital health
1:42
technologies into routine care, which is quite an exciting field to be in. And as you can imagine, a lot of our work involves deploying artificial intelligence
1:52
algorithms into the front line. With my other hat, I'm also the AI transformation lead for the London AI Centre, which is a
2:00
large, Innovate UK funded collaboration. Initially it was across London; now it's expanded across the Southeast, of NHS
2:08
trusts, academic institutions and industry partners, where we're collaborating around
2:13
developing AI prototypes along a whole range of patient pathways, primarily in secondary care to improve patient outcomes and operational efficiency.
2:22
But today we won't be talking that much about my work, and more about the work of my fellow panelists.
2:27
And having said that, I think we'll get started and I'll hand over to Hatim in the first instance, to give us an intro of the DART-Ed programme.
2:35
Over to you Hatim. Thanks, Haris. So what I wanna try and do is just give a brief overview of what we've been doing
2:40
so far, but not spend too much time, 'cause actually who you really wanna hear from is Amar, Kavitha and Christina.
2:46
So from my point of view, I'm a GP in Northwest London and I am a national clinical lead for Health Education England for AI and digital medical workforce.
2:54
That involves looking at two areas: one is the emerging technologies around digital health, AI and robotics,
3:00
and what that means for education in the future, and then also being the overall lead for digital readiness
3:05
for the medical profession. What I'm going to do today is just very briefly go over some of the kind of
3:12
key outputs that we've worked on so far. You can go back and look at the previous webinars to see our approach to this problem, and some of the background in terms of the policy that has driven what we're doing.
3:21
But what I'm gonna do is give a bit of an overview of what the outputs have been so far, and then just highlight some educational
3:27
opportunities for people that wanna develop some learning in this space. So in January we published the AI roadmap, and that was an opportunity to start
3:34
to understand how many AI technologies currently exist in the NHS and what kind of taxonomies they sit within. So the majority of the technologies, you won't be surprised to hear, are
3:41
diagnostics, around 34%, but there's a whole host of technologies around automation and service efficiency as well,
3:46
and a growing number of technologies in remote monitoring and P4 medicine. So P4 medicine is around participatory, preventative, personalized
3:54
healthcare technologies that work at population health level. And within that, we're then able to look at what kind of workforce groups
4:00
they're going to affect, how they might affect those workforce groups. And also delve deeper into a couple of those technologies to really think
4:06
about what the impact might be on the way that we will work in the future. So have a read of that report, 'cause I think it's really interesting to put into
4:13
your mind how things might change in the future. And then in May we published a report which helped us to start to create a
4:19
framework for how we can understand how we're gonna build confidence in AI amongst the healthcare workforce, and what key factors are gonna drive that confidence.
4:26
And at national level, that's really about governance: making sure that the regulation and standards are appropriate, the
4:32
evaluation and validation of these technologies, and a national approach from organizations like NICE and the MHRA.
4:37
And clear guidelines. And then at local level, in terms of an organization, what we're gonna talk a little bit about is the strategy and the culture of the organization.
4:44
Do you actually have the technical skills to be able to implement this technology? And then, looking more closely at local level, whether the validation
4:51
has occurred at that level, 'cause we recognize that every population is different. And then finally, what does that mean as a clinician?
4:56
So me as a frontline GP, how will it affect the way I work? What kind of cognitive biases do I need to be aware of when I'm working with
5:03
AI, or what do I do when it's perhaps not working as effectively as it should? And how do I then raise that?
5:08
And actually we go into this in a little bit more detail in webinar two, so if you want to go and learn a bit about the approach and some of the
5:13
findings of this piece of research in more detail, rather than look at the report, then have a look at webinar two.
5:19
So, to highlight some educational opportunities: we've partnered with the University of Manchester, who are developing a clinical
5:24
data engineering CPD for healthcare professionals to pick and learn specific
5:30
modules that will help contribute to their learning, to becoming someone who's a clinician, but also will have some learning or some knowledge of
5:36
skills around data engineering and data science. And at the moment we're piloting the first module of this, and we are offering
5:42
15 funded places on this first module. And I will post a link in the chat on how to access and apply to
5:48
this and what to do to apply. Another opportunity to outline, and this is not something we're directly involved with, but
5:54
we are aware of and support the application of: this is an EU funded programme on explainable artificial
5:59
intelligence and healthcare management. This is going to be a master's programme, and it's gonna be delivered by the University of Pavia in Italy.
6:06
And one of the partners in the UK is Keele University. So whilst being EU funded, it is open to applications from the UK.
6:15
The programme costs 2,500 euros, but 80% of students that graduate from the programme, i.e. the top 80%, will have their thesis award paid back to them.
6:24
So that's because of the funding that's attributed to the programme. And what I think is really interesting about this course is it covers two areas.
6:31
There's the core skills around AI, so there's modules on introduction to data science and
6:37
ethical AI, but it also covers areas around healthcare management as well.
6:42
So there's a module on transforming healthcare. There's a module on looking at AI and the workforce and the impact that
6:47
they might have on the workforce. So it should be a really fascinating master's programme for anyone that wants to apply. And again, I'll post details in the chat for where you can express your
6:54
interest, and then someone from Keele, or from one of the university partners, will reach out to see if you want more information.
7:01
Thank you for listening and I'll pass over to Amar. Thanks Hatim. Yeah, so my name's Amar, I'm a GP and digital lead and an investment and
7:10
impact fund lead at Southport and Formby primary care network, and a Topol Fellow.
7:16
So just to give people a bit of context, in the last couple of years, practices in primary care have been organizing into primary care networks.
7:23
And essentially these are groups of practices given some additional funding, and then they provide additional
7:29
services as a result of this. So in our area, we were already organized 'cause we had quite a large federation, but the legal basis of the primary care network allowed us to organize a
7:36
bit more and develop the right kind of infrastructure that we needed. So a large part of my work is looking at strategic
7:42
and organizational level work, looking at quite open questions really, one in particular being simply: what is going
7:49
on here, what's happening in our area? To give us an idea of how we can develop, and essentially that creates a data question.
7:58
So a lot of the work is developing the requisite infrastructure that allows us to kind of utilize all that information that we have, but is
8:06
largely locked up in various systems. So my work, this is an AI kind of machine learning type webinar, but, um,
8:14
my work, you know, is machine learning, but I would say the unsung hero is just very simple data visualization, actually, that can be utilized
8:23
with more advanced tools and advanced infrastructure. So the work I'm doing is largely exploratory data analysis, looking at
8:30
general clinical activities, so consultations that we're seeing, prescriptions, lab reports
8:36
that we're having to review, documents that we're reviewing, and gaining an idea about how we can apply machine learning type principles,
8:42
like clustering, regression and forecasting, to get an idea of what is going on now and what could be going
8:48
on, you know, in five, ten years' time. And then when you combine this with your other streams, like financial data, staff data, patient survey and preference data, then you
8:56
get a really rich picture of what we should be building. So the work at the moment relates to allowing us to have extra
9:03
functionality, where we move past using archaic search and report systems to querying databases at source using structured query language.
9:12
And then we're able to provide that further, richer analysis using things like Python or R to do that.
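[Editor's note: the shift described here, from packaged search-and-report tools to querying a database at source with SQL and then analysing the results in a general-purpose language, can be sketched roughly as follows. This is a toy illustration using Python's built-in sqlite3 in place of a real clinical system; the table and column names are invented, not taken from any NHS product.]

```python
import sqlite3

# Toy stand-in for a practice database; a real clinical system would
# sit behind proper information governance controls.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE consultations (patient_id INTEGER, month TEXT)")
conn.executemany(
    "INSERT INTO consultations VALUES (?, ?)",
    [(1, "2022-01"), (2, "2022-01"), (1, "2022-02"), (3, "2022-02"), (2, "2022-02")],
)

# Query at source with SQL rather than relying on a canned report...
rows = conn.execute(
    "SELECT month, COUNT(*) FROM consultations GROUP BY month ORDER BY month"
).fetchall()

# ...then carry on with richer analysis in Python.
monthly = {month: count for month, count in rows}
print(monthly)  # {'2022-01': 2, '2022-02': 3}
```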
9:19
That's the exciting stuff of the machine learning and the modeling, but it's a small part of a larger goal, which is to turn us into a
9:27
data driven, product creating organization. And the bigger and more time consuming areas of that are
9:35
actually the foundational infrastructure. So your information governance policies, your software, your hardware, getting
9:40
that political buy-in from practices, so that they really understand the value of data and can trust that group working kind of activity.
9:48
So that's one element. And probably the largest element as part of that is the team.
9:54
So fundamentally this is a people thing. There is no point having a machine learning and AI siloed
9:59
group that does some work that's not really relevant to the rest of the organization. So we're building a digital hub, and the role of that is to get a
10:07
diverse set of people, administrators, pharmacists, clinicians, and slowly
10:13
allow them to transition from using Excel and your simple systems to querying databases at source, and then using your advanced analytics languages.
10:21
So that over a number of years, we can then develop a system where you've got grassroots buy-in
10:28
from people who are seeing the job and doing the job every day. So the advanced patient facing stuff, we're not there.
10:34
We're not ready yet, and we don't need to be ready yet. I think this is part of a staged, multi year plan.
10:40
But the opportunities are there, and it's just making sure that we're doing this at the right time and the right
10:46
stage and creating the right culture. Thanks.
10:51
Fantastic. Thanks, Amar. Over to you Christina. Thank you.
10:57
My name's Christina. My background is I was a clinical lead physiotherapist for many years in MSK at King's.
11:03
And then I did my Digital Pioneer Fellowship through DigitalHealth.London, and then I moved across to NHS England and Improvement, London region,
11:10
to be a clinical transformation fellow. And now I've moved over to Digital Health London to be a
11:16
an NHS digital health navigator. So one of the innovators I'm working with is called Limbic.
11:22
So Limbic are an AI solution that addresses end to end mental health pathways. So they're implemented in 20 IAPT services.
11:32
They have an opt-in rate of 92% across those sorts of end to end pathways.
11:38
So you have Limbic Access and then Limbic Care. So I just wanna talk about Limbic Access for the time being,
11:43
and then move on to Limbic Care. So, Limbic Access allows patients to self-refer to IAPT services, and it
11:48
uses a type of machine learning called natural language processing, via a chatbot which analyzes text input.
11:57
So it picks out keywords and uses conversational AI, and the benefit of that is that it engages patients in a more humanistic and caring way.
12:08
So this improves usability and it's easier for patients to self-refer, and this is reflected by their 91% completion rate.
12:17
And it also collects really important screening that would normally be done within an appointment, or pre-appointment via paper, and that's the PHQ-9.
12:24
So that, for those of you who don't know, is a depression score. And that allows the chatbot to then stratify and risk flag so
12:32
that the care can be adapted and tailored based on the individual's needs. And in terms of suicidal risk, the pathway can be tailored to local pathways.
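[Editor's note: the stratify-and-flag step described here can be sketched in miniature as below. This is not Limbic's actual logic: the severity bands are the standard published PHQ-9 cut-offs, but the flagging rule on item 9 is a simplified assumption standing in for a real clinical risk pathway.]

```python
def phq9_triage(answers):
    """Score a PHQ-9 questionnaire (nine items, each 0-3) and return
    the total, a severity band and a risk flag. Severity bands follow
    the standard PHQ-9 cut-offs; flagging on item 9 (thoughts of
    self-harm) is a simplified stand-in for a real risk pathway."""
    if len(answers) != 9 or any(a not in (0, 1, 2, 3) for a in answers):
        raise ValueError("PHQ-9 needs nine answers scored 0-3")
    total = sum(answers)
    if total <= 4:
        severity = "minimal"
    elif total <= 9:
        severity = "mild"
    elif total <= 14:
        severity = "moderate"
    elif total <= 19:
        severity = "moderately severe"
    else:
        severity = "severe"
    risk_flag = answers[8] > 0  # item 9 asks about thoughts of self-harm
    return total, severity, risk_flag

print(phq9_triage([1, 1, 2, 1, 0, 1, 1, 1, 0]))  # (8, 'mild', False)
```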
12:40
So asking, for example, if the patient can keep themselves safe, or following up with questions around suicidal preparation or suicidal
12:48
intent or likelihood, and, just checking on protective factors, ways that they can keep themselves safe.
12:55
So it's an AI solution that provides clinical information pre-appointment in order to support the clinicians in their clinical decision making.
13:04
So the clinicians have the most relevant information and screening, pre-appointment
13:10
This frees up clinical time to allow the clinicians to add more value based care, and therefore reduces the burden.
13:16
And then in terms of Limbic Care, this uses the same machine learning engine, but for treatment support tools.
13:23
So whilst patients are on the waiting list, but also when they have started treatment and then post discharge.
13:29
So it helps with that elective backlog that we're hearing so much about; it helps patients to wait well while they're waiting.
13:36
And it also helps with patient safety as well. So during their therapy, it can be accessed 24/7.
13:43
So for example, the patient can log in and say, I'm feeling anxious, and it will reply:
13:48
Your therapist has recommended some breathing exercises. Shall we do this together? Okay, let's go.
13:54
Now, this sounds very simple, but we know that anxiety and depression can inhibit problem solving abilities, so it can be really
14:02
beneficial from that point of view. So it can be accessed at the weekends, 24/7, and then post discharge.
14:07
It allows access for patients to help prevent that relapse.
14:13
Limbic is not going to replace clinicians at the moment, but it supports clinicians, and we know that AI's not ready to replace
14:20
humans, as it lacks sort of human values that are really important to patients and individuals, such as empathy and that gut instinct.
14:28
However, the benefits in terms of user benefits, it can be accessed 24 hours a day, seven days a week.
14:34
And we know humans need to sleep; we all need to sleep. The chatbot doesn't need to sleep. And the feedback from patients is that they feel less judged
14:43
speaking with a chatbot, and it's less burdensome than speaking to humans, which can be really beneficial and makes it much easier
14:50
for those with social anxiety. And in terms of the benefits for clinicians:
14:56
by supporting the clinicians, it reduces that workforce burden. So we know that workforce burden is a huge challenge to the NHS generally,
15:04
and clinicians like having information prior to their appointments; it allows preparation, including seeking support for those more complex patients.
15:12
Just reflecting on my time at King's, we used to do pre-appointment screening, including the PHQ-9 and STarT Back, which is an MSK stratification tool.
15:22
And it really does free up time during the appointment so that you can spend more time with the patient.
15:27
in their appointment time, actually adding more value. So, you know, we're very aware of the
15:34
benefits and the limitations of AI. But overall, this isn't trying to replace the clinician, but more
15:40
about supporting the clinician. Happy to take any questions after, so I'll pass on to Kavitha.
15:49
Thanks, Christina. Um, hi everyone. My name's Kavitha. Um, so a little bit about my background.
15:55
I'm actually a cardiology registrar in Northwest London, with a subspecialty interest in cardiac imaging.
16:01
I'm currently in the second year of my PhD at Imperial college London, where I'm developing a novel AI based clinical decision support system which
16:10
enhances the acquisition of cardiac MRI. I have a particular interest in valvular heart disease and therefore I'm focusing
16:18
a lot of my research in training neural networks to accurately identify valvular abnormalities in different cardiac views.
16:25
I've also got a bit of experience in using semi-supervised natural language processing, or as some of you may recognize it, NLP algorithms, to
16:34
accurately categorize diagnoses, specifically in cardiac MRI, from radiology reports.
16:40
So, you know, we could utilize this for very quickly screening through lots of radiology reports and pulling out scans
16:48
that we need for a specific experiment. And as part of my PhD, I work very closely with a very diverse
16:55
group of people, including engineers, medical physicists and other clinicians,
17:00
to create these algorithms. I also lead on the AI lab at Imperial where we conduct sort of fortnightly
17:08
forums to promote this sort of collaborative working really where fellows like myself are able to share our current findings, challenges
17:17
in some of the development of these algorithms or maybe the implementation work that we are carrying out.
17:22
And we try and work through some of these challenges together through this forum. And I guess what I've learned mostly through my PhD is, the
17:30
experience of data procurement, and this is both locally and nationally, and both of these types of procurement.
17:38
So whether you're trying to get data locally from your NHS trust or nationally,
17:43
say from the UK Biobank, they both have very different challenges in trying to navigate through each of them:
17:49
Applying for ethical approval and then sort of utilizing medical images for AI and specifically for that data curation part, which is
17:57
sometimes the most important and most difficult part, to get right before you start training these AI models.
18:05
So that's part of my week. Half of my week is spent on my PhD, and the other half of my week is spent
18:11
as a clinical AI fellow at GSTT. I'm one of 11 clinical AI fellows on this first cohort of its kind that
18:19
Haris obviously initiated. And one of my main roles in the clinical AI fellowship is currently supporting
18:26
the development and evaluation of a CE marked AI tool. And CE marked, for those of you who don't know, just means that it has regulatory approval.
18:34
And the company that we're gonna be working with is Qure.ai; the specific product is qER.
18:43
It's an algorithm that can detect up to 11 critical abnormalities from non contrast head CTs.
18:50
And what we hope is that through this product, we'd be able to prioritize the
18:56
most critical scans for reporting by radiologists, therefore patients coming in
19:01
with a stroke or major trauma with brain injuries will quickly get the treatment.
19:06
That they need through that prioritization. And the software, we hope, will be deployed within the emergency department and radiology
19:13
workflows. As with any other study, AI or non-AI, there's always the process of
19:19
developing the actual study protocol. And we're at that phase at the moment where we are designing the
19:24
prospective study which will eventually be conducted across five different NHS sites and these are major trauma centers, stroke centers, and a DGH.
19:33
I'm also supporting the ethics application for this, and also collaborating with our industry stakeholders, particularly in terms of managing their expectations.
19:41
And eventually I'll be leading on the academic outputs from this project.
19:46
And I think this is often the rate limiting step, this initial thing where you have to really just get going, design the study, get all the sites together,
19:54
get all the PIs together, and see what everyone's expectations are of all of this before the study can actually even
20:00
get going. One of my main objectives, I suppose, by the end of this is to create a body
20:05
of evidence to build that trust and confidence of the healthcare workforce in this specific AI system, which can then allow it to be carefully scaled
20:15
up for use across the NHS sites. And what I'd like to take away eventually from the fellowship is to help me better
20:22
understand all of these challenges and the various different phases within life
20:27
cycles of implementation of AI tools, which would then give me a transferable skill set for deploying similar projects into hospitals in the NHS.
20:38
Thank you. Fantastic. Uh, thank you, Kavitha. Thank you, Amar and Christina, as well, fascinating introduction to your work.
20:45
And I have to admit, I didn't have any input into the construction of this panel, but I'm so glad we have such a diverse group in just three people.
20:53
We basically cover everything, which should hopefully make the Q&A quite interesting. We've got secondary care, primary care, mental health,
21:00
different clinical backgrounds as well. We had people send in questions, prior to the event.
21:06
And so I'm gonna use some of them as a launching pad into what I want to ask,
21:11
but I'd also be keen for people to put their hands up if they've got questions
21:16
based on what they've heard so far. One of the things I want to start with, and I'll address this to everyone: a few of you mentioned the importance
21:26
and the challenges around data, and there was a question by Lee Whitaker that was sent in about what, if any, are the major risks or drawbacks of AI in health.
21:35
One of the major challenges is around the quality of our data infrastructure, and
21:43
particularly around the development of AI, you often require large data sets across
21:48
many different geographies, and that means crossing institutional boundaries.
21:53
And I'd be very interested to hear about the challenges you've faced and
21:58
ways that you've overcome that. Amar, you talked about how your new legal basis as a PCN enabled some activities.
22:06
Did that make it easier to get access to run SQL queries on larger cohorts of patients?
22:13
Or was there still, you know, 80% of the work still outstanding to do? So, to you
22:19
Amar first. Yeah. I mean, the truth is everyone's terrified about data really, aren't they, data sharing,
22:25
and that's by far been one of them. You know, I'd love to be focusing all my time on building models and doing that kind of analysis, but actually one
22:33
of the bigger areas which has caused the most delays is that information governance policy and ensuring that everyone is
22:40
aware of what's going on, and getting the right stakeholder buy-in. What's complicated it more is the roll out of the national data opt
22:47
out, which made it even more difficult, where you had people that were opting out, not really knowing why they were opting out, not knowing what was
22:55
involved in that and what that actually meant. And so our way of dealing with that is to be very basic.
23:01
I mean, you've got to remember, in general practice the baseline
23:06
for, I suppose, digital maturity and data infrastructure maturity is very low, so we don't even really need to be thinking too much about AI right now.
23:15
We just need to do the foundational blocks that we can then build on. So one of the things in terms of moving forward was just, you know, existing
23:22
IT providers have the ability to provide new ways of searching. Great, straightaway that's a tick box on that pathway
23:29
towards being able to build more complex kinds of analyses, and then just using what we can that involves things like implied consent.
23:37
Anyway, so clinical audit is classified under direct patient care, and is one of the things that we can do in practice anyway.
23:44
So, making our use of these technologies related to things that are already part of our day to day work before going onto those higher level areas.
23:54
So I think what's helped with our organization is just the sharing that's going on. Because, as I mentioned, we're already organized into a
24:01
federation for the last several years, and then that's turned into more of a PCN as well.
24:07
We already had data sharing things in place where we were providing direct care. So it made it a lot easier for us to do that, but I'd say the biggest challenge
24:15
is making sure that practices are organizing, and having that ability to have a plan for direct care of patients.
24:23
And then allowing that kind of direct clinical audit kind of pathway, that makes it much easier.
24:29
Fantastic. No, really interesting. And turning to you, Christina, particularly with the Limbic app, where you have, you know, end to end pathway management, where
24:38
you're potentially jumping between organizations or different services. You mentioned that you're able to refer to a number of IAPT services.
24:46
Does that present challenges? How has that been in terms of implementing limbic or clinicians being able to
24:52
use Limbic to its fullest extent? How has data flowed across organizations, or is it easier because
24:59
a lot of the data is held or controlled by Limbic themselves?
25:05
Well, I believe that the data is held by Limbic itself. That does make it easier.
25:10
And then the clinician can input into the app, so it can help
25:18
with that Limbic Care when a patient is having, you know, a bad day, more anxious or depressed, and then can sort of
25:28
pull that information from the clinical side to help support the patient. I'd say one of the sort of challenges really is with the ICB and knowing
25:36
where the funding flows are. I'd say that's probably one of the sort of challenges for innovators at the moment.
25:42
So rather than it being organizational, it's just the funding flows from an ICB level. I don't know how everyone else feels about that, but
25:49
that's kind of the main challenge from innovators that I've been working with.
25:56
Okay. Interesting. So that's is that funding for the innovation itself or for the underlying infrastructure to enable it?
26:02
The underlying infrastructure, where ICBs are relatively newly set up, and it's having those structures and
26:11
frameworks for funding flows. It's not that obvious, and there's quite a bit of uncertainty around that.
26:17
So I feel that's quite a big barrier for innovation at the moment. Yeah, no, certainly. Kavitha, just to bring you into this conversation, I think
26:25
you have an interesting perspective in that your week is split between being a developer and a deployer and evaluator. And what
26:36
are the different challenges, right? I mean I'm involved in very similar activity and I know that it's relatively
26:42
easier to curate static data sets, right? Obviously there's a lot of blood, sweat and tears that goes into it, but it's static.
26:50
And you have a big cohort of data that you throw into your machine learning algorithm. Yeah. At deploy time, when you want to evaluate it, say in a prospective
26:57
trial, now you need the data to come in the same structure and format, but you need it to come live.
27:04
So in your experience, what have the challenges been? Has that gap been too big to overcome, or has it worked well in some pathways?
27:14
So, completely, as you say, Haris, it's sort of then getting the outputs to the patient, which is ultimately why we are all doing what we do.
27:23
It's a little bit different on my end, because I work a lot with industry, in a sense that the algorithms that
27:31
I'm developing are for MRI scanners. Who owns the MRI scanners? Siemens, GE, for example.
27:40
So to get these algorithms onto the magnets, essentially you have to be able
27:46
to converse with those teams and actually convince them, and eventually they are your stakeholders, to get their buy-in, to have them be happy
27:54
to have your algorithm on their magnet. And that has to be robustly tested before they would allow you to
28:03
integrate your algorithm into such a big, already complex piece of software.
28:09
So it's not easy. We have at the moment incorporated some of our early work, where we are able
28:16
to get on the fly outputs for, say, cardiac volumes, mass calculations,
28:23
et cetera, but that hasn't come easy, and you have to obviously label everything as for research purposes only.
28:30
And the other thing I think with that is just, you know, being able to provide evidence that you have tested and validated your algorithms.
28:37
So, you know, trying to train and test just on one dataset from one hospital
28:42
on one scanner type, it may not be applicable to all other scanner types,
28:47
um, or, you know, different software. So you have to really get data where you can show that
28:54
what you've developed is actually transferable to other clinical data sets. And that's one way in which we are trying to get around some of this.
29:03
So some of my early work's been developed on noisy clinical data from Imperial.
29:08
But we're now trying to validate some of those algorithms on the UK biobank data set to get a bit more leverage essentially and validation.
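The transferability point made here is essentially external validation: a model fitted on one site's data must be re-evaluated on an independent cohort before any claim of generalisability. The sketch below uses simulated data and invented numbers; it is an illustration of the idea, not the actual Imperial or UK Biobank pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_cohort(n, scale, noise):
    # Simulate one site's data: same underlying signal,
    # different acquisition characteristics (feature scale, noise level).
    X = rng.normal(size=(n, 3)) * scale
    y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=noise, size=n)
    return X, y

X_dev, y_dev = make_cohort(200, scale=1.0, noise=0.1)  # development hospital
X_ext, y_ext = make_cohort(200, scale=1.4, noise=0.3)  # external validation cohort

# Fit only on the development site.
w, *_ = np.linalg.lstsq(X_dev, y_dev, rcond=None)

def rmse(X, y):
    return float(np.sqrt(np.mean((X @ w - y) ** 2)))

internal_error = rmse(X_dev, y_dev)
external_error = rmse(X_ext, y_ext)
# Reporting both errors, not the internal one alone,
# is what supports a transferability claim.
```

The external error is expected to be worse than the internal one; the point of the exercise is to quantify that gap rather than hide it.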
29:17
Fantastic. No, really interesting. I think what's clear across all of those domains is that we've come a long way in the past,
29:23
say, five years, in the amount of attention that frontline clinicians
29:28
like yourselves, as well as corporate and support services like IT, have put into data infrastructure.
29:35
I think everybody knows that without solving that nothing else gets solved. One of the things that the London AI Centre has been working on is
29:41
exactly those kinds of problems. I won't talk about it too much right now, but we have two sort of flagship platforms.
29:47
One is the federated learning platform, which essentially allows us to develop algorithms without moving the data at all.
29:54
This solves some of the privacy and IG risks around extracting data from data controllers.
30:02
You essentially move the algorithm to the data and then aggregate the learning. And then the other side, we have the AI deployment engine which solves some
30:10
of the interoperability challenges, Kavitha, that you talked about, where essentially you have a single platform
30:15
to which you can deploy algorithms and which can communicate with the rest of the hospital in an interoperable way.
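Federated learning, as described here, keeps the data with its controllers and moves only the algorithm and its learned updates. Below is a minimal toy of the federated-averaging idea on simulated data; it is a sketch of the concept, not the London AI Centre's actual platform.

```python
import numpy as np

def local_update(weights, site_data, lr=0.1):
    """One gradient-descent step on a least-squares model, run at the site.
    Only the updated weights leave the site; the patient data never does."""
    X, y = site_data
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def federated_round(weights, sites):
    """Aggregate the learning: average the locally updated weights."""
    updates = [local_update(weights.copy(), data) for data in sites]
    return np.mean(updates, axis=0)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

sites = []  # three hospitals whose data sets are never pooled
for _ in range(3):
    X = rng.normal(size=(50, 2))
    sites.append((X, X @ true_w))

w = np.zeros(2)
for _ in range(200):
    w = federated_round(w, sites)
# w now approximates true_w without any site sharing raw data
```

Real deployments layer secure aggregation, differential privacy and much more on top, but the core loop of local training plus weight averaging is this simple.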
30:21
I'll put a link in the chat for those that want to read up more about it. There aren't any more questions that have come in,
30:26
so I'm just gonna carry on talking about stuff that I like to talk about. For you guys, in all of your work, the industry
30:34
collaboration is pretty crucial to this, all right? And also, Kavitha, there's a pretty strong academic component, particularly
30:41
when it comes to evidence generation. Right. And figuring out what should we pay for and what works and what doesn't work.
30:48
And it reminds me a bit, less so the academic component, of when we were digitalising healthcare some 20 years ago, when we had
30:57
PACS contracts coming out and we had the National Programme for IT. And then 10 years later, when those contracts were coming to an end,
31:05
we learned a lot of painful lessons about how we didn't quite set up those relationships for success.
31:10
It was very difficult for a lot of us to move to new solutions or to new providers. In your recent experience, working with industry and with academia,
31:20
what are some of the sort of key takeaways that you've come away with that would ensure the NHS partner gets
31:28
the most out of that relationship? Or does the relationship need to fundamentally change, because now we're a bit wiser? Kavitha, do you want to kick off with that?
31:36
Yeah. So I think, you know, you've gotta manage expectations in terms of what
31:43
your NHS partner wants to gain out of it and what industry wants to get out of it.
31:49
And I think in terms of your NHS partner, what they want is essentially this:
31:54
We've got a limited amount of resources, our resources aren't growing exponentially, but demand is growing exponentially.
32:01
And so how can these AI solutions be of use to mitigate some of those issues
32:09
and it has to be a cost-benefit relationship, I think, eventually, and
32:14
I don't know if it has been proven yet that all of these AI solutions are going to be beneficial from a cost point of view.
32:22
You know, some people might argue and say, we'll just hire more healthcare staff for the price that you're paying for these AI solutions.
32:30
So I think what we are doing at the moment, in terms of collecting this body of evidence, say just from my project, is actually then
32:38
conducting a health economics evaluation to see whether solutions like these do actually benefit, you know, NHS partners and actually make a
32:47
difference to patients eventually, and that they are suitable for mitigating some of the issues that we
32:54
foresee and that are happening right now, especially in the post-COVID pandemic era, where there's a huge backlog of cases and
33:02
demand for healthcare is higher than ever. So those are the things that I think we need to work on between healthcare
33:08
partners and industry, trying to find that common ground where, you know, we're working towards that goal together, basically.
33:18
Fantastic. Christina, I think you have a really interesting perspective, particularly, and you don't need to focus on Limbic, but with your role as an NHS
33:25
navigator with DigitalHealth.London. From your perspective, the AHSNs' role is to increase adoption of innovation,
33:32
and you are usually very good at connecting innovators to trusts
33:38
and the like. What could we do better, from the NHS perspective, to help
33:45
innovators and to get the most out of it? I think the biggest sort of barrier is around implementation,
33:54
and we know that the current workforce are kind of on their knees at the moment. There's a huge workforce crisis.
34:02
Taking on a new innovation is just an extra thing for our clinicians and our workforce, our service managers to do when they're already
34:10
inundated with an elective backlog. So trying to get that initial implementation is a huge barrier
34:17
for the workforce generally. I mean, I think the issue is it's not just the fact that the workforce is burnt out.
34:25
It's the fact that, you know, positions are not being filled. And so even if we recruited more clinicians, if there was money to do so,
34:35
we would really struggle to recruit clinicians and to recruit workforce. So AI can therefore support the workforce when they're already
34:45
depleted, um, you know, I know of jobs that have been readvertised
34:51
because they're just not being filled. Whereas maybe five, 10 years ago, they were filled very, very quickly and you'd maybe have 50 people applying for the same job.
34:59
That's not really the case anymore. So the first thing is around implementation when the workforce are on their knees, and what I'd encourage innovators to do is really help with that
35:08
implementation process, to almost provide project management support to
35:13
the NHS trust or to the NHS organization, to help with that implementation
35:19
process, to offload the burden on clinicians, just for that initial phase.
35:24
And so it becomes almost business as usual, and then there can be a transition period across. So those would be sort of my thoughts on that.
35:32
Fantastic. And that leads nicely to what I wanted to ask Amar on this topic. And it also relates to Kavitha's AI lab and the forum that she has.
35:41
Amar, you talked about a sort of hub that you're trying to build and a sort of in-house team. How far can we take that, in your local context as a
35:49
PCN or just generally as an NHS, in terms of in-housing some of the deeper technical expertise
35:57
when it comes to AI? Just conscious of time, but briefly, your thoughts on how far we can go with that would be interesting.
36:04
Yeah. I mean, there's an interview with Satya Nadella, who's Microsoft's CEO, and they asked him what is the real innovation that they see with AI, and his
36:12
response was that he's most excited about the grassroots uptake: the idea that someone
36:17
who's doing a job every day can have a bit of software or code that is already built for them,
36:23
so they can then apply it in, you know, a day-to-day thing that can help. And that's essentially, you know, where the real innovation comes from.
36:28
It comes from the ground up. And so it's about enabling people to do that. Now, you know, realistically,
36:35
we can't expect everyone in an organisation to become a specialist in, whatever, data science. That's just not gonna happen, but you can
36:42
certainly target people that are interested in it and have some sort of local structure where they can then engage with the wider group.
36:48
So I think it's about, you know, providing opportunities locally to do that. And then, you know, providing support where they can take it to the next stage.
36:55
So, you know, we have local innovation agencies or innovation funding where we can take that route.
37:01
The other element, I suppose, is Kavitha's point about expectations. I mean, an AI product is essentially a startup, really, or it's being used in a
37:09
similar way to a startup company. I mean, how many startup companies are still alive in five years' time,
37:16
or 10 years' time? And I think that's the reality of it. So it's being aware, you know, when you're investing in this area, be
37:23
realistic about what you are trying to achieve, and then infrastructure and governance and ethics and all of that kind of thing can grow other
37:30
elements, rather than gambling essentially on a few topics where it's questionable
37:35
whether they'll be around in five years. Fantastic, Amar, a very pertinent point at the end.
37:42
No, completely agree. That brings us to one minute to go. And so I'll take the opportunity to wrap up then.
37:48
Thank you so much to the panelists for sharing their backgrounds and the work that they're doing, all very much at the coal face and really grounded in the
37:57
realities of delivering healthcare. So hopefully the audience appreciated that. And of course, thank you to, to the audience for listening
38:04
in and for participating with the questions beforehand. There's been a bit of chatter in the chat, and that's mostly about getting
38:11
connected with some of our panelists, so hopefully we can follow through on that. At the end, from my
38:16
perspective, I just wanna say thank you to the organisers as well. There will be a recording, like I said at the top, that's gonna be
38:24
made available on the DART-Ed web pages. And finally, you can follow DART-Ed on Twitter, and hopefully the handle
38:33
will be in the chat for you to follow, and then you can stay up to date on any future webinars. That's everything from me, and I hope everybody enjoys the heat.
38:41
Take care thanks. Bye bye.
Media last reviewed: 3 May 2023
Next review due: 3 May 2024
Webinar 4 - Digital and AI transformation in Dentistry - where are we?
Digital technology is transforming how dentistry will be delivered in the future. Adopting digital opportunities will enable staff and patients to confidently navigate this new digital environment.
This webinar provided scene-setting on digital readiness in dentistry, as well as demonstrating the potential role of AI in dentistry and the interoperability challenge in the context of the profession.
On the panel were:
- Sam Shah (chair) - Chief Medical Strategy Officer, Numan and Digital Health Research Lead, UCL Global Business School for Health
- Andrew Dickenson - Chief Dental Officer for Wales
- Dr. Hatim Abdulhussein - National Clinical Lead - AI and Digital Medical Workforce, Health Education England
- Dr. Vinay Chavda - Academic Clinical Fellow in Dental and Maxillofacial Radiology, Birmingham Community Healthcare NHS Foundation Trust
- Dr. Tashfeen Kholasi - General Dental Practitioner and Vice President, College of General Dentistry
Watch a recording of the session below.
Watch webinar 4
0:04
Presentations and discussions from our fantastic and esteemed panel. So, I'm Sam Shah, I'm the chief
0:10
medical strategy officer at Numan and digital health research lead at UCL Global Business School for Health.
0:16
And I'm joined today by an amazing panel of people that many of you might know across the sector.
0:22
We have Dr Tashfeen Kholasi, who is a general dental practitioner, CIO and the Vice President of
0:29
the College of General Dentistry. We've got Vinay Chavda, who
0:34
is an academic clinical fellow in dental and maxillofacial radiology at
0:40
Birmingham Community Healthcare NHS Foundation Trust. We've got Dr Hatim Abdulhussein, who is the national clinical lead
0:47
for AI and digital medical workforce at HEE, and also the medical director of one of the AHSNs. And of course Andrew
0:55
Dickenson, who is our Chief Dental Officer for Wales and has of course got a big interest in digital and
1:03
technology, and I'm sure will be sharing these insights from a policy perspective. So welcome, everyone.
1:08
So with that, I'm going to ask each of you to just briefly introduce yourselves in terms of what you do,
1:15
but also tell us about your views and thoughts on digital dentistry.
1:21
I'm going to start off with Tashfeen. So Tashfeen, over to you first of all.
Tashfeen
1:28
Hey hello everyone, thank you for inviting me to this. Really honored to be here with
1:34
such amazing talent. My name is Dr Tashfeen Kholasi. I'm a general dentist, but I have a very,
1:39
very strong interest in digital dentistry. My previous path to dentistry was I
1:45
did computer science and maths at university and worked in local authority
1:50
working on interoperability projects. So I have a strong interest in how we connect
1:56
systems together. I then decided to go down the dentistry pathway but realised how disconnected as a
2:03
profession we actually are, including from the rest of the health service, which led me to work in Health
2:10
Education England as a clinical fellow in leadership with HEE back in 2017.
2:17
And that led me on to this path of being able to work with the NHS England
2:23
and NHSX as the clinical lead in digital dentistry, where we delivered the first
2:29
proof of concept that dentistry can be connected to the rest of healthcare, and we did that with
2:35
urgent dental care and NHS 111. So we know that our suppliers in dentistry can connect to the wider system.
2:42
And so we can do that data-sharing piece, where we're passing data from the health service into
2:48
our dental services with you know, the right patient name, right patient details, what their primary concerns are
2:55
and my other interest in the digital realm is actually clinical safety and health IT.
3:01
So I'm also a clinical safety officer for a couple of different organisations and it's about how
3:07
we clinically manage the data that we're passing and the systems that we're using to make sure that
3:13
we're not introducing any harm or introducing any new hazards to how we manage our patients.
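The data-sharing and clinical-safety checking described here can be pictured as a validation gate on a referral payload before it is passed downstream. This is a purely illustrative sketch: the field names and rules are invented, not any real NHS interface standard; the only real fact used is that an NHS number has 10 digits.

```python
# Minimal fields assumed necessary before a referral may be handed off.
REQUIRED_FIELDS = {"patient_name", "date_of_birth", "nhs_number", "primary_concern"}

def validate_referral(payload):
    """Return a list of safety issues; an empty list means OK to pass on."""
    issues = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - payload.keys())]
    nhs_number = payload.get("nhs_number", "")
    # NHS numbers are 10 digits; a production system would also
    # verify the check digit and validate against a patient index.
    if nhs_number and (len(nhs_number) != 10 or not nhs_number.isdigit()):
        issues.append("nhs_number must be 10 digits")
    return issues

referral = {
    "patient_name": "Jane Doe",
    "date_of_birth": "1980-01-31",
    "nhs_number": "4857773456",
    "primary_concern": "dental abscess, severe pain",
}
issues = validate_referral(referral)  # no issues: safe to hand off
```

The design choice is that the gate returns a list of hazards rather than raising on the first one, so every problem can be logged and reviewed, which is closer to how clinical safety processes treat hazards.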
3:20
It's all about safety and health IT, and that's where I think I'm going to stop for a moment and
3:26
pass on to our next esteemed guest. Thank you very much, Tash, and it's great to hear about
3:31
your path from technology through to dentistry and now bringing it together, I'm next going to move over to Vinay.
3:39
Vinay, please. Can you tell us a bit about yourself, but also your thoughts and
3:44
views on the sector? Hi, my name is Vinay Chavda. I am currently an academic
Vinay Chavda
3:51
clinical fellow in dental and maxillofacial radiology at ST2 level.
3:57
I actually got into dental radiology through my work as a Topol fellow,
4:04
which was an HEE-funded programme on the back of
4:09
the Topol Review, looking at the future of the healthcare workforce across all healthcare professions,
4:15
but particularly I was focusing on dentistry, recognising that in dentistry we use a lot of
4:21
dental radiography to treat our patients almost every day. From that I started a project
4:28
looking at AI in dental radiography and dental radiology and that's kind of how I ended up doing my
4:35
job today and training to be a consultant in dental and maxillofacial radiology. I've got a few more slides,
4:41
but I'll show you later on once we've done our introductions. Feel free to share your slides at this
4:48
stage, yeah, please. OK, great. It was easy for me to jot down my points on the slide
4:55
so that I don't forget anything. So yeah, this was the scope of the fellowship and this is kind of how
5:02
I ended up getting my interest in in digital technology and artificial
5:07
intelligence in imaging in particular. So this table is
5:14
slightly out of date, from 2019, but it shows that the ratio of acceptances
5:20
to applications for artificial intelligence courses at university is just behind dentistry, and these were
5:28
UCAS figures that we used here, and actually above medicine as well.
5:36
The other thing that I just want to bring up is that with all of these AI technologies there
5:43
is a hype cycle, and this was first developed by Gartner, the management consultancy company.
5:49
But where we are within dentistry, I think there is still quite an
5:55
inflated expectation of what AI can do, and everyone on
6:02
this call will be aware of some of the limitations of some of the digital technology, and I guess that's
6:07
what we're here to talk about today. But the
6:14
current area that I would like to reinforce is when we look at the
6:19
number of published papers on PubMed, for instance. These are two graphs that I pulled off
6:26
the PubMed database yesterday. When we look at dentistry and implants, that actually peaked in 2016;
6:33
AI and dentistry has not yet peaked, and we're not nearly at the level of
6:39
published research that, for instance, dental implants have reached yet. When I had a chat with the
6:46
editor of the Journal of Dental and Maxillofacial Radiology, almost half the journal is now
6:52
looking at AI and dental imaging. And they were now looking for a
6:57
bioinformatician to go onto the editorial team to ensure that the research that is now being published is good quality.
7:04
And that's, I think, something that we need to be aware of as dentists, to see what is coming out there in future.
7:11
So that's where I'll stop. Thank you. Really insightful, Vinay, and thanks for sharing those thoughts,
7:16
especially having been one of the first Topol fellows as a dental surgeon, but also now embarking on a pathway that
7:24
very much touches on all parts of AI. So it's super, super helpful to hear those thoughts and we'll come back to some of
7:30
that in our discussion, especially around the Gartner Hype cycle. And it sort of goes back to a point that
7:36
Tash made about the system as we see it. Next, I will come on to Andrew Dickenson,
7:42
Chief Dental Officer for Wales. Over to you, Andrew. Sam, thanks very much, and it's
Digital Readiness
7:47
a real pleasure to be sharing this webinar with everybody today. As you kind of introduced me, I'm
7:53
currently Chief Dental Officer for Wales, but before that I worked for Health Education England as a postgraduate dental dean, and
7:59
one of the real pleasures in my portfolio was to support the whole
8:05
TEL team as it was developing and starting to move into the AI, robotics,
8:10
digital and simulation field. And as you alluded to at the beginning, I was not an expert in this,
8:17
but it's amazing how quickly you upskill, and that led me to consider the whole concept of
8:23
digital readiness, and from there that led on to a paper which Hatim and I published as
8:29
a commentary in the British Dental Journal earlier this year, just to start that conversation.
8:35
And I think some of us on this call probably have that role in terms of asking the questions,
8:41
not being able to give any answers, but just to start getting the conversation flowing. And
8:47
for me, digital readiness always sounds a very simple concept. In dentistry we have been early
8:54
adopters of technology, but while technology is advancing at a really quick pace, at the same
9:01
time we have to keep thinking: is everybody coming on that journey at the same time? It's
9:08
probably overused terminology, but, you know, we have a widening digital divide, and as we learned through
9:14
the events of the last couple of years, we've had to adopt technology a lot faster than we ever imagined.
9:22
But does that really mean that we are ready? And when I say we, that's not just the profession but also the
9:29
patients that we're working with. And my worry is that the digital divide tends to focus on access
9:35
to the digital technologies, whereas for me readiness actually refers to the preparedness such as
9:40
do we have the skills and do we have the trust in that technology as this influences the adoption of digital tools,
9:47
which for me, and it would be nice to talk about this, I think is a very separate
9:52
issue to access issues. I was struck recently by something that dropped through my front door from Lloyds Bank, and they've been doing
9:58
a lot of work in the digital sphere. And they were saying in the UK we have 99% Internet coverage,
10:04
but there's over a quarter of the population which actually falls into the lowest
10:10
level of digital capability, and 4% of our population have no basic digital skills at all.
10:16
They also showed, and this is why I read their document, that 34% of the population actually
10:22
struggle to interact with healthcare services and this is where digital
10:27
readiness is going to be really prominent over the coming years. The other conflict I have is
10:34
people talk about digital literacy and healthcare literacy. And for me that's almost a little bit
10:41
of victim blaming because the system hasn't actually adapted to allow people who are struggling with digital
10:48
readiness to actually be supported. There's an assumption that it's up to them to go and learn this.
10:53
So there were just three messages, Sam, I wanted to say about digital readiness because I think there are three interconnected elements that
10:59
might be helpful for our discussions. One is, do we have the digital skills? Are people actually able to initiate
11:05
an online session, surf the Internet, search for content in a meaningful way?
11:11
Secondly, trust. Do they actually trust the information that they read online, and do they trust the system?
11:18
And then the third is usage: what do they actually use, in digital form? And interestingly,
11:23
all three elements can actually be measured. That can range from asking
11:29
very simple questions. So, in terms of digital skills: how confident are you getting on the Internet?
11:34
If you've got a new piece of electronic equipment, can you set it up, or do you need people to help you with it?
11:40
And certainly from a dentistry point of view, how confident are you with your learning? I think these are very important
11:46
questions with which we can just start to measure people's preparedness. In terms of trust, that's really complicated, and as
11:52
you alluded to, there is a policy element in that: is the infrastructure right? Are there systemic elements
11:58
involved in all of this? It's OK saying we need to go online, but actually, are we providing people with the software,
12:04
the hardware, the data security, the financial element behind all of that?
12:09
And of course we've got to overcome the usage element, which is that there are some early adopters, there are some people who are
12:16
hesitant users, and there are some who are actually non-adopters yet. And we've got to think about how
12:21
we bring everybody in there. So just to conclude, Sam, my suggestion is that
12:27
digital readiness does require us to think a little bit differently and move away from the technology adoption
12:33
element. Let's have a look at the culture and the behavioural changes that are required, especially in healthcare, as we're
12:39
moving patients into that sphere, there's the human adoption element and that means that we've got to
12:47
become more proficient and that proficiency starts today, it starts through education.
12:52
I signpost everybody to the HEE resource on the digital readiness education programme,
12:58
but in conclusion, we can actually measure people's digital
13:03
readiness using a digital readiness score, and that takes into account both our
13:09
digital capability and our proficiency. And I think if we can think like that then we've made a big step forward.
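The three measurable elements named here (skills, trust, usage) could feed a simple composite. The 0 to 5 scale and equal weighting below are illustrative assumptions only, not the actual digital readiness score Andrew refers to.

```python
def readiness_score(skills, trust, usage):
    """Combine three 0-5 self-ratings into a 0-100 readiness score.
    Equal weighting across the three interconnected elements is an
    assumption for illustration; a real instrument would be validated."""
    for name, value in (("skills", skills), ("trust", trust), ("usage", usage)):
        if not 0 <= value <= 5:
            raise ValueError(f"{name} rating must be between 0 and 5")
    return round((skills + trust + usage) / 15 * 100)

# Example: confident online, moderate trust, a hesitant user.
score = readiness_score(skills=5, trust=3, usage=2)  # 67 out of 100
```

Even a crude composite like this makes the point in the talk concrete: each element is separately measurable, so preparedness can be tracked over time rather than assumed.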
13:15
So thank you very much. Thank you, Andrew. I really do very much like
13:20
the reference to the sort of the take on digital literacy. And actually we need to change the way we frame it.
13:26
And I think you're absolutely right that yourself and other very senior leaders are there to ask the
13:31
right questions and get people to think about what we might mean and what we can do to change things for patients.
13:37
And thank you also for mentioning the Lloyds Consumer Digital Index for anyone that wants to read it. I've put the link into the chat
13:44
there that you can follow up on. In a moment we'll come over to Hatim. Hatim, you know, we've
13:50
heard from Tashfeen about the disjointedness in the system, but also her journey through
13:55
working in national organizations, trying to make a change in digital dentistry. But also how she came with a set
14:01
of skills. We've heard from Vinay about the Topol Fellowship and the opportunities it brings,
14:08
both in terms of his own work but also what it means for the profession as a whole. And then we've heard from Andrew
14:14
about the work around digital readiness and what that might mean, how we score people, the difference it makes.
14:20
And we didn't mention the digital poverty common in parts of Wales, where telehealth is almost nonexistent.
14:27
But Hatim, you've been leading on this programme, you've been running large parts of the system and you
14:33
have a role in innovation. It would be great to hear from you about what people can access, but also your thoughts
14:40
around innovation in healthcare. Thanks, and I'm gonna bring up a couple of slides, but whilst I do that,
Oral Health
14:46
I mean, it's really important to say, as a practising GP, that oral health is essential, and it's a real pleasure to be around the
14:52
table with oral health experts. But ultimately, as a generalist, you know that it is important for
14:57
the wider workforce as well. Outside of dentistry, we should all be thinking about how we promote oral health.
15:03
We know it's linked to cardiovascular disease and other general health outcomes, and we know actually it's
15:09
a really important factor in being able to just generally understand how well someone is, you know,
15:15
coping with anxiety and whether they need any further support. I've had the pleasure of
15:21
working with our local primary care network, and we need to do more within primary care networks to bring general practice, dentistry,
15:27
pharmacists and other healthcare professionals together at that level of care, at a neighbourhood level,
15:33
at that place-based level, to be able to deliver outcomes that are going to make a difference. And technology is one way in
15:40
which we can do that; it's one enabler to allow us to do that.
15:45
I'm going to provide a brief update on our digital, AI and robotics education programme and share, as
15:50
Sam mentioned, a couple of opportunities that are on the horizon. So I know some of the questions
15:57
that have come in prior to the conversation today have talked about some of the factors that we know are gonna
16:03
impact on confidence in AI. So these are things around regulation, standards and liability, and we've got a previous
16:12
webinar that goes into more detail around what drives healthcare worker confidence in AI.
16:17
So please do check that out. That was Webinar 3, and all of these are recorded on the Digital
16:24
Academy website, which has been shared in the chat. And so, just to highlight,
16:29
we recently published a developing healthcare worker confidence in AI report, which talks about all of
16:34
those levels: from a national level in terms of regulation and standards, to a local level in terms of the local validation that's needed when you're
16:40
delivering an AI or digital solution, and ultimately then to the coal face of that relationship with the patient:
16:46
you know, how does a technology like AI impact on the way we make decisions and the way we
16:51
interact with the people that we serve? So if you want a little bit more reading and background information on that, please do check
16:57
out those pieces of work. I'm really pleased to announce, and I think it's
17:03
fantastic, that we are opening up the fellowships in clinical artificial intelligence to both medical and dental
17:09
postgraduate learners. This is the second cohort of the fellowship programme, and the first cohort was London, but
17:15
the second cohort is wider than London. There are now posts in most regions of England, and
17:21
so I encourage you to have a look at the website and find out more. If you are a dental learner
17:27
who is currently in the ST3 year of specialty training, or in a programme
17:34
which ends at ST3 and you are in your ST2 year, then you will be eligible to apply for these fellowship programmes.
17:40
And what's really unique about these clinical artificial intelligence programs is that it's an opportunity to
17:46
do work that's AI-deployment focused. So ultimately it's moving away from some of the research and development
17:51
that sits in the innovation space; we're actually thinking about embedding some of these technologies
17:57
and improving the care of the people that we serve through that. So it's a really interesting time and
18:02
opportunity to join that programme. Sadly, the Topol Fellowship programme is about to close for applications today.
18:08
If someone is out there and is really keen to get an application in today, there is still time.
18:13
But if not, I am pleased to say that in all cohorts we've always had representation from
18:18
dentistry and oral health, and I hope that continues in the next cohorts. And then next year we will be
18:25
relaunching the flagship NHS Digital Academy programme, the Digital Health Leadership programme,
18:31
which Sam was on, in cohort one, as a real trailblazer in this space, and so we hope that people will
18:36
keep an eye out for that if you are keen, like Sam and like Tashfeen, to move into really senior dental
18:43
leadership roles around digital; that would be great to see. So just to highlight, these opportunities
18:49
exist, they're open to our dental workforce out there, and I hope that you found hearing from the
18:55
esteemed panel today really useful and we look forward to answering some questions.
Digital Academy
19:08
Thank you, Hatim, for that, that was brilliant. I'm just going to say that I actually did the NHS Digital Academy as well.
19:15
I was part of cohort two. And so if anybody is considering applying for either Topol or one
19:20
of these clinical fellowships in AI, I highly recommend it, because the one year that you spend thinking
19:26
to yourself, I'm going to step out of clinical dentistry for a year,
19:31
it will change the way you think about healthcare. So highly recommend it, take the opportunity.
19:36
If it doesn't work for you, that's OK you know you can always go back into clinical work. Actually, better butter,
19:41
but try something different. There's no harm. That's a that's a great plug. That's fine because I think both
Questions
19:48
yourself and Vinay have taken the courageous steps of doing things that are slightly different, a different
19:53
pathway, which has resulted in both of you being able to do some excellent things that have influence at a system level.
19:59
So, fantastic plug and thank you. So now we go to questions, and we've got lots of questions already lined up,
20:06
but I'd also like people who have joined to ask questions. So what I would say is, if you'd like
20:12
to ask a question, put yourself onto camera if you can. But either way,
20:17
click on the reaction button so I can see your name pop up in the participants and I can see
20:23
that you want to ask a question. But I'm going to take the first question, and it goes
20:28
first to Andrew, then to Tashfeen. We've got a question from Sammy Stagnaro,
20:33
and that question is about whether the fragmentation of dental services is
20:38
an issue with collecting large datasets and implementing solutions in primary care.
20:45
Andrew, first I'll come to you and then to Tashfeen. Yeah, so brilliant,
20:51
brilliant starter question. I mean, data is key to everything that we do: it's key to research, key to development,
20:58
key to policy development. And I think we are moving into a new era where
21:03
we need to understand how the care that we're delivering is having an effect at a population
21:09
level, and at the current time a lot of the data that we collect in primary care tends to be more transactional.
21:16
It's around billing, it's appointment scheduling,
21:21
document storage, and it could do a lot more than that.
21:27
The fragmentation of dental services is an interesting element because
21:32
we are still siloed to a certain degree between primary, community and secondary care, and we
21:38
have an insight into what each of those areas is doing, but there isn't a fluid flow
21:45
of information between the three sectors. The other element around data
21:50
collection is the large number of pieces of software that practices use as part of their practice management system.
21:58
In Wales you've got 14; in England, I think it's 25 or 26. And so straight away some
22:04
of those are inaccessible; very few are cloud based presently.
22:10
Most of it requires a local server. So it's very difficult to extract the data from these systems.
22:16
So I think that is a real challenge for us, and of course the question then is, do we
22:23
start thinking in dentistry similarly to general medical practice? Do we go for one or two pieces of
22:29
software where you can have it all cloud based, where you can start extracting that information? And
22:35
maybe this is where, when I was talking about digital readiness, we've got to be more trustworthy about the data.
22:41
I think people are very suspicious sometimes when we're asking for data, and we've got to be mindful that when we ask
22:47
for data it can be very time consuming for people to have to go and download it or put it
22:53
into a format that can be used. So maybe that's where, as we go through the
22:58
literacy and upskilling of practitioners, for example in how to use data,
23:05
we should also be asking why we are using the data. What do we want it to do for us?
23:11
But I think it's going to be difficult until we are probably more cloud based.
23:19
Wow, sorry, lots in there. I mean, Tashfeen, there's lots in there. There's a piece in there about
23:25
interoperability, integration, data privacy and trust, and of course, you know, the
23:31
perennial issue that we face about those systems and where the data is held.
23:37
Tashfeen, you've been a CIO, you're a GDP and you've been at the centre trying to piece this stuff together.
23:42
What are your thoughts? I think the first question in my mind is, what's
Interoperability
23:48
the problem we're trying to solve? You know, everyone's talking about AI and how it can deliver
23:54
all these amazing solutions to how we're going to provide care, but actually, what are we trying to solve?
24:01
And we have lots of data points in dentistry, I completely agree. And we have lots of different
24:06
systems, and they're in different parts of the dental health sector, and how do we connect that up? So that's the interoperability issue.
24:14
We don't really have any standards with regard to interoperability for dentistry per se. And there's
24:20
been an attempt to use SNOMED, but actually, is that the way
24:25
forward? Should we be using other forms of interoperability standards like HL7 and FHIR?
24:31
And I think one of the things that we were trying to establish a few years ago was actually, let's
24:37
get some common standards going so that we can have a digitally connected dentistry system
24:43
across the network, from hospital services into primary care.
24:49
And we have a whole host of information just sitting in our systems in practice.
24:55
And that can pick up anything from oral pathology to caries management, if we're going to be looking at X-rays and imaging,
25:02
through to being able to plan things like orthodontic treatment, and you can use AI tools to be able to do that.
25:09
But actually, ultimately, and this is going back to what you were saying about digital readiness
25:17
and how we are preparing our profession to go forward in using technology,
25:23
the other issue is that when we're talking about interoperability, we're often talking about how servers are connected
25:30
and how we can process large chunks of data. But that's only half of the solution,
25:36
because actually, from a front end, we've got a user interface, we've got users who will need to be
25:43
using several different systems. And one of the hardest and most frustrating things, as a clinician working in hospital and in
25:49
practice, is when I have to log into multiple different systems to be able to access that data. So if I've got some AI running in
25:56
the background that I'm going to use as an adjunct to my care, I want that to be integrated into
26:03
whatever system I'm using to make a seamless workflow. So that's all part and parcel of
26:09
what we're trying to achieve. And then my other issue is around clinical safety, and I love clinical safety,
26:14
I think it's really important. So what is clinical safety?
26:19
A lot of people will think of it in terms of what we're doing in practice, but here I'm thinking of clinical safety
26:25
in health technology. So it's health IT, and it's about
26:30
ensuring that the data and the technology that we are using are safe to be deployed
26:37
into the services that we're using for safe patient care. So, the data:
26:42
for example, you may have a patient who comes in, they've been expecting to have some
26:49
investigative tests, and they're expecting that data to be presented
26:54
to them in a way that's usable and has got clinical significance. But actually, somewhere in the pipeline
27:01
that it's gone down to be able to get that investigation and that outcome, something has gone wrong.
27:07
Something has fallen down, and, just think of your Swiss cheese model, that
27:14
error has basically penetrated through the technology to the degree that actually you've got a patient
27:21
now who is, for example, being told that they've got an oral cancer when actually they haven't.
27:27
It's just that something's gone wrong in that pathway. So it's actually ensuring that the technology that we're using
27:33
isn't introducing any new hazards and isn't falling down and causing problems; when it falls down,
27:39
it should stop at that point. And it's making sure that the tools that we're using
27:45
are providing the diagnosis that we're expecting and not introducing new harms.
Clinical Safety
27:52
Tashfeen, I think that's really helpful to highlight those particular things, and I was trying to get to the chat to
27:58
put in the links to DCB0129 and DCB0160, but I can't get into the chat. But if you can, do share the links
28:03
so people can look those things up and understand clinical safety in health tech. Sorry, go on.
28:10
So I was going to say that that's an NHS Digital standard which is relevant only in England. So in Wales, Scotland and
28:17
Northern Ireland it's not mandated. However, the theory behind it stands,
28:24
so even though we don't necessarily have to use it in the other UK nations,
28:30
it's something that's definitely worth following. There are clinical safety officers. A clinical safety officer has to be a clinician,
28:36
has to understand how health IT works, has to understand the standards, and they
28:41
then can become somebody who will promote safe health IT in healthcare,
28:47
and that's open to dentists as well. So as far as I know, I'm probably
28:55
one of the few dentists who is a clinical safety officer in the UK, but as we grow we're going to need more.
29:01
So absolutely, and I think a good call for people to join that section of the profession, and you
X-ray based AI
29:07
know, as we move into the world of AI and technology that will be AI driven as well,
29:12
you know, this will become increasingly important, in the same way that it would be for a medical device. Which sort of takes us on
29:19
to our next question, which is for Vinay. This is from, you know,
29:24
Peter Bailey, who many of you might know, who has been working and leading on tech in a big provider
29:31
organisation: what's your view towards X-ray based AI technology in the marketplace at the moment,
29:37
something I'm sure you're very familiar with? Yeah, absolutely. So I mean, last time I looked there were
29:43
14 companies worldwide which have got some sort of dental AI imaging
29:50
algorithms which are open to market. Now, my thing with this, going back
29:55
to the safety element, is that we are all GDC registered and we have got
30:00
to put patients' best interests first, and there was again the word 'tool' that was just brought up in
30:06
the conversation, and essentially AI imaging interpretation software is just another tool that we can use
30:13
in our day-to-day practice to make our jobs easier, make them safer, make sure we don't miss
30:19
anything for our patients. Now, with all tools, you can use them the right way or the wrong way. You can put a rotary endo
30:24
file down a canal and fracture the file if you're not careful or safe enough. And that's similar with the artificial
30:32
intelligence imaging algorithms. So if you're not sure what you're looking at, or you're not sure
30:38
how the data is presented to you, you will make errors, and then you won't be putting your patients' best interests forward.
30:43
So there is lots of availability out there. There's lots of companies out there
30:48
providing promotional information and promotional advertising for themselves, because it is a free market, which
30:55
is healthy and that will build the AI software market out there. But as clinicians we need to make sure
31:01
that if we are going to procure this software, we do it correctly and through the right standards that are
31:07
set out by UK standards bodies, and we've got many regulators
31:13
of digital technology, the MHRA, the CQC, all these different regulators,
31:19
to make sure that the software that we procure in our practices, because essentially, again, primary care practices are all separate small
31:27
individual companies and they will procure different pieces of software, whether that is AI imaging or
31:33
whether that's dental software, electronic patient records, we want to make sure that we're doing
31:38
the right thing for our patients. So how do we assess what's best? And I guess the answer is up to
31:45
the regulators within this country, but there are lots out there, and they are getting better,
31:51
slowly, and there are still some errors, and again that's when you need to assess the literature,
31:57
assess what is being published, and be able to determine what
32:02
is good and what is not good. That's really interesting, particularly around the issue
32:09
of evaluation and assessment. I might touch back in on that in a moment, but just on
32:15
that: a number of people who are on this call today, myself included, were in a discussion with Eric Topol earlier
32:21
this week, and one of the things he referred to was the bias in datasets, especially when you're building
32:27
AI. Are you worried about the risk that when we're building technology using AI in dentistry, inherently
32:34
we're using a very biased dataset? If we were to take an NHS dataset, let's say in the UK,
32:40
the population it comes from could itself be biased. Is this a worry for you? Well, actually, I mean, there's the age-old
32:46
garbage in, garbage out kind of thing. Yes, we all know that term, but what does it actually mean here?
32:53
One of the most cited papers in the dental and maxillofacial
32:59
radiology journal showed an AI tooth numbering system on an OPG.
33:05
But it got thrown off very, very quickly when an implant was thrown in or a root canal treated tooth was thrown in.
33:11
And it's very, very easy to throw off these algorithms if you've got something which has not been seen before.
33:17
So if the algorithm was trained on a patient population where that type of implant has never been used in
33:22
this country, or is not regulated to be used in this country, that will throw off that algorithm.
33:28
So yes, absolutely, you've got variable demographics of patients and ethnic backgrounds
33:34
in this country which could absolutely throw off detection of different pathology. Tashfeen, you're going to come in there?
33:41
I was just going to mention the regulatory aspect. So you're looking at software as a
33:47
medical device, so there are ISO standards. And so any technology that you're planning on procuring,
33:53
you need to make sure that they've got the right regulations and that they're meeting the right standards, because if they're not,
33:58
that's when you're going to be bringing in an inherent risk. And is there anything in particular that you're finding is evolving at
Regulation
34:05
the moment in regulation in the UK, especially now with the change from Europe? Is there something that
34:11
innovators should be aware of? I think, again, software as a medical device,
34:16
I think that's something that wasn't massively heard of in the past,
34:23
but it's definitely something that's becoming more familiar. And I think as we're getting
34:28
more and more entrants into the market, you need to be aware of that regulatory standard. You've also got things like the software
34:36
development lifecycle that you should be adhering to for best practice,
34:42
because at the end of the day, when something goes wrong and you are audited by the regulators,
34:48
they are going to be looking to see what your processes were. And you need to be able to justify why
34:54
you've done what you've done, to make sure that you've made things clinically safe. Hatim, you probably have
Advice for innovators
34:59
lots of innovators coming to you in the AHSN and of course in HEE. What's your general advice to
35:04
them when they're starting out and thinking about the standards they need to meet and the type of evaluation they need for a product,
35:10
especially what could apply in dentistry? Yeah, it's a very difficult landscape to navigate and it takes some
35:16
time to truly understand it. And that's where the role of organisations like Academic Health Science Networks, which have experience
35:22
in running innovation exchange programmes and experience in working with small and medium enterprises to
35:27
be able to overcome some of these challenges, is really important. And, you know, getting set up on
35:33
some of those kinds of accelerators is really important to do, and our colleagues at NICE and the MHRA
35:38
have been doing some work in the background to really make this space as clear as possible. They've got a platform that they're
35:45
currently piloting, it's in beta phase, and that will be a really useful platform to guide people
35:51
that are working in this space to understand what kind of regulatory boundaries they need to overcome. There is an evidence standards
35:57
framework for AI from NICE as well, which is really important because it starts to think about the type of evidence and the level
36:04
of evidence you need to be able to bring a solution into the market or enter
36:10
the NHS. And so that's really important, and we are seeing transformation in this area;
36:15
I think regulators are taking notice. But as with anything, these things take time, and so it's important to say that time
36:21
is needed, because these are very sensitive matters. These are people's lives that we're dealing with. We know how important
36:28
oral health is, we know how important avoiding bias in the way we make decisions is,
36:33
and therefore we need to do these things safely, we need to do these things ethically, and we need to make sure that we
36:39
have the right infrastructure. Only the other day, when I was thinking about infrastructure, I was listening to the discussion
36:45
on whole genome sequencing that's taking place, and things like that, you know, the use of CRISPR, but I
36:50
won't bore you with that now. I'm going to change tack and think about that framework. We know that
36:56
technology is being used across the NHS already, advanced technology that's being used for genome
37:03
sequencing, but in dentistry we often don't even see some of what might now seem
37:08
basic or routine technology emerge. I guess starting off with Andrew, you know, we've seen extended
37:15
reality being used in maxillofacial surgery, where colleagues at King's and elsewhere have used the technology
37:21
to plan surgery without having to necessarily start operating, or to
37:26
have more precision around the type of device or implant they may place.
37:32
What are your thoughts on the NHS framework? Can we really implement and utilise technology advances
37:37
in the same way as other sectors, and, you know, are we really able to keep abreast of AI or other
37:44
things in the NHS environment? Well, the simple answer is we can and we should.
37:50
But the more complex answer is that it's really difficult, and it's very difficult because as a professional
37:56
group we're actually quite small compared to medicine. We tend to be almost primary care
38:04
orientated, which is a really strong place for us to be if you were going to make a whole system change.
38:12
But at the same time, the majority of us are frontline clinicians, and therefore
38:18
the opportunity to start looking at technology falls into our spare time.
38:24
So you're absolutely right. A framework is that guidance, a yardstick for us all to aspire towards.
38:32
But the adoption is going to be very different. And I think there are various policy areas around this.
38:38
Number one is, if it's something that we really want, and I think we've all said this in our introductions, if it's something that we really want
38:44
and we can answer honestly the question of what it is trying to do for us, then that's where we have to
38:50
drive it through policy. We need to be funding this. The personalisation in medicine is
38:55
going to come into dentistry, and we're starting to think like that. As our
39:02
oral health is starting to improve, we are looking more at what we can do with our patients rather than what
39:09
we can do for our patients, and technology is going to be perfect
39:14
for that in terms of monitoring and supporting the patients. More of that mHealth will
39:23
come into the primary care situation. But I guess that's one of
39:28
my other concerns, which I said at the beginning: obviously, when new technology starts to come along,
39:33
we see it at conferences, we hear conference presentations, we get very excited, and we go and spend money and we buy something
39:40
nice and shiny. I'm being cynical now, but we buy something nice and shiny and then we don't utilise it to its
39:46
full capability, because we're either not using it for the right reason or we don't have that level
39:51
of practice where it's going to be adopted on a more regular basis. So you're quite right that in
39:58
secondary care, that's where advances have probably been driven a lot faster than in primary care.
40:03
But I strongly believe that if we can get what we're asking for correct, and then we
40:09
negotiate that appropriately through policy, then we've got a real opportunity to make substantial changes in the
40:16
digital world within dentistry. But I think the caveat on that is we've got to have those open conversations
40:22
about what it is that we want to do. That really nicely sums up our session today, which
40:29
comes to a close in a few minutes. But I suppose, really, to summarise: I guess we need to avoid
40:35
searching for problems for solutions that might exist, but really be problem focused around what it is we want to solve,
40:41
what's the business benefit we want for the system and what's the outcome we want for the patient. And certainly,
40:46
as many people know, I'm a cynic about lots of things, but one of them is sometimes where we end up adopting technology for
40:52
the sake of adopting technology, and I've certainly seen many of those over the years in the NHS and elsewhere.
40:59
But that takes us to the end of our session today, and the questions and discussion probably could have
41:04
carried on for a few more hours. But I'd like to take a moment to first of all thank Hatim and the
41:10
team at HEE for putting this session together and the work that they're doing around digitisation of the
41:17
profession and professionalisation, which is absolutely amazing in itself. I urge you all to look at the website and
41:23
apply where you can. Thank you to our fantastic panel, to Tashfeen, to Vinay and to Andrew, for championing this
41:30
particular area of digital transformation, AI and data in dentistry. Please do keep it going.
41:36
Let the conversation continue. But thank you very much everyone for joining us today and your questions, and see you next time.
41:45
Thank you. And thanks Sam for an excellent sharing of.
Media last reviewed: 3 May 2023
Next review due: 3 May 2024
These webinars also include the work the DART-Ed programme are undertaking around education, and preparing the workforce with the knowledge and skills they will need, now and for the future.
Webinar 5 - Artificial Intelligence (AI) and Digital Healthcare Technologies Capability framework
This webinar looked at how the AI and Digital Healthcare Technologies Capability Framework, published on 21 February 2023, addresses the need for our healthcare workforce to continually adapt to meet the needs of the society it serves. Health Education England (HEE) commissioned the University of Manchester to perform a learning needs analysis and develop a framework outlining the skills and capabilities to ensure our health and care professionals can work in a digitally and data enhanced environment.
On the panel were:
- Professor Adrian M Brooke (Chair) - Medical Director (Workforce Alignment) Health Education England, and Honorary Professor, University of Leicester
- Dr Hatim Abdulhussein - National Clinical Lead for AI and Digital Medical Workforce, Health Education England
- Dr Alan Davies - Senior Lecturer for Health Data Science, University of Manchester
- Professor Sonia Kumar - Professor of Medical Education and Associate Dean, University of Leeds
A recording of the webinar is available below.
Watch webinar 5
0:05
It's 12:45. And thank you all for coming, those who have logged on,
0:13
and we hope you're able to stay with us for the whole time.
0:20
So can we ask that
0:25
all people who are joining the webinar have their cameras turned
0:32
off and their microphones on mute.
0:37
The session will be recorded.
0:44
And if you'd like to ask questions, this is supposed to be a kind of a chat,
0:51
and therefore it's really helpful, if there are questions, if you can put them in the chat window,
0:58
which you can see if you click on the series of icons at the top of the screen.
1:05
The very first icon in the middle of the Teams screen says chat.
1:11
If you click that, a window will open. I'll even do it now.
1:17
A column appears in black on the right hand side of the screen,
1:22
and then you can write your question there and then post it using the little
1:29
paper airplane icon in the bottom right hand corner of the screen.
1:34
And then we'll know what question you've asked and we can post that
1:41
to the panel that we've got here. So thank you all for coming.
1:48
Umm, can I start by asking
1:54
all our panel members to just briefly introduce yourselves,
2:03
where you work, and what your job title is? And I will demonstrate by saying,
2:12
my name is Adrian Brooke, I’m Medical Director at Health Education England,
2:19
and my background clinically was in paediatrics, and I'm an interested
2:26
dinosaur from the pre-digital age in this area. So can I go
2:33
over to you Sonia next please? Hello everyone, lovely to be here. My name is Sonia Kumar,
2:38
I'm a GP by background and Professor of Medical Education and I'm an associate Dean at the University of Leeds.
2:47
Thanks, Sonia. And can we then move on to Alan next, please?
2:53
Hi everyone, I'm Doctor Alan Davies. I'm a senior lecturer at the University of Manchester and my backgrounds in
3:00
nursing and computer science. Excellent and last but by no means least,
3:06
Hatim. Thanks, Adrian. So my name is Hatim Abdulhussein. I'm a GP in northwest London.
3:11
I'm also Health Education England’s National Clinical Lead for AI and digital workforce.
3:16
Thank you. So what we're going to do is just, very briefly,
3:22
introduce this session, really to set the scene.
3:27
And what I really just want to do here is in fact just remind everyone
3:35
that the AI digital healthcare technologies capability Framework
3:43
was published this morning at 10:00 AM and is available on the
3:49
NHS Digital Academy website. So that report and the
3:56
framework build on the findings and recommendations made in the Topol
4:01
Review, which came out in 2019 and was entitled Preparing the Healthcare
4:06
Workforce to Deliver the Digital Future. And that outlines a set of
4:12
recommendations preparing the NHS workforce to become world leaders in utilising digital technologies to the
4:20
benefit of our patients of course. Now we know clinical teams in the near future will be required to use AI,
4:26
artificial intelligence and other digital health technologies effectively and equitably.
4:34
really for the benefit of all. And that's starting now, actually.
4:40
So this is not something for the very distant future; this is occurring as we speak.
4:45
In response to this need, Health Education England, HEE for short,
4:50
an arm's length body, commissioned the University of Manchester to undertake
4:56
a learning needs assessment and create a capability framework and that's
5:01
to aid learning and development of our healthcare workforce. Now the framework aims to help
5:08
healthcare workers identify gaps in their current knowledge and areas for preparatory activities
5:14
to support digital transformation of the workforce as well as their
5:19
own individual learning. And these capabilities that we've just published
5:25
build on the foundational digital literacy capabilities first
5:30
introduced in the Health and Care Digital Capabilities Framework.
5:35
So the AI digital healthcare framework extends this with capabilities around the use of health data and the
5:43
technologies that make use of this data. For example, applications on your mobile phone or
5:50
computer and wearable technologies, software and programs, etc.
5:57
And this is further extended with more advanced capabilities like
6:02
artificial intelligence and of course the advent of robotics.
6:08
And capabilities of course range across the whole spectrum from initial
6:13
awareness through implementing these technologies in a healthcare environment and supporting digital
6:21
transformation projects. So I'm going to shut up now; I think that will probably be welcome for everyone.
6:28
And perhaps I'll turn to Hatim and Alan to present the framework.
6:33
Thank you. Thanks, Adrian. And so I'm just
6:39
bringing up the slides and hopefully. We can all see them. Give me a second.
6:45
Yeah, OK, everyone can see the slides, perfect. Alright. So between Alan and myself,
6:52
we're just going to go through the methodology behind the framework and a brief overview of what the framework includes.
6:58
But when I was reflecting on what I'm going to say today, I was just looking back to when I
7:03
started my GP training as a registrar and started my first placement
7:08
as a GP in a practice in Hayes. I was doing majority face to face
7:14
consultations and practising in a way that seemed very familiar to me, and, thinking back, probably only
7:19
about 2 or 3% of my consultations were even telephone based, and the majority were with the patient in front of me.
7:25
I then went into accident and emergency as a trainee, and we hit the first wave of the pandemic, and being in
7:31
accident and emergency I noticed certain things. So I noticed how suddenly our nursing staff were
7:37
collecting observations on a device and inputting them onto a system. I can see that people are
7:43
struggling to see the slides; we've got messages in the chat that say they can't see the slides.
7:50
Let me see if I can share them again with you. Give me a second. So maybe if you go into presenter mode
7:56
and then we all disappear, that might be easier.
8:02
Can someone message to say Whether the slides are there? Yeah, we've got some, yes. OK. Thank you for your help.
8:10
So being in A&E I noticed that things were changing. So nursing staff were recording observations on
8:15
a system, and we were having to access that system to be able to look at the observations. We had new healthcare records that we
8:22
were using in that emergency department. And I remember going in one day in paediatrics and being told we've
8:29
got this new system in place, and not really being shown how to use it, and not really
8:35
having had the time to really familiarise myself with the system. I then went into general practice and
8:41
noticed the whole world had changed. You know, when I logged into my system, I suddenly had widgets on the screen
8:47
that allowed me to text and receive messages from patients. And when I was looking at appointments, I had something called eConsultation,
8:54
where people had given me information beforehand that I then had to act upon, thinking about what I was going to do going forward.
9:00
And all of a sudden I was doing about 50 to 60% of my consultations, either via the telephone or in some
9:06
cases even by video consultation. And I reflected on how we got to this stage,
9:13
and on whether I thought that I had the best skills in place to be able to work in this new way.
9:19
And I was also at a point where I was preparing for my general practice exams, and a key part of my exams was to
9:25
record myself consulting with patients. A lot of these consultations were over telephone and video. Now that became an opportunity for me.
9:32
It allowed me to really analyse the way I consult with patients and to reflect with my educational
9:37
supervisor around how best to do that, and what kind of mitigation I needed to take when I'm consulting with a patient over a video call
9:43
rather than over the phone, or versus a face-to-face consultation. And so when I came into my role at Health Education England,
9:49
it was very important for me to think about how we do our best to help people understand what they need to know to be able to work with
9:55
the types of technologies that we interact with patients with. And that's really the key context behind why this is important.
10:02
It will enable people working in health and care to understand the types of skills they need to have when
10:09
interacting with patients and using technology. Passing over to Alan, and I'll move the slides along.
10:17
Thanks, Hatim. So I'm just going to talk very briefly about the methods we used to create the framework.
10:23
So we used an iterative mixed-methods approach that involved co-design as well.
10:28
This involved carrying out a systematic literature review, to look at the academic side of things and
10:34
where the different gaps were, and also a series of workshops, which we did online, followed
10:40
up by a digital survey as well. Next slide, please.
10:46
So the systematic literature review was really used to generate some initial concepts, and this was carried
10:51
out by Health Education England's knowledge management team. As well as the academic literature,
10:57
we looked at the grey literature, so we looked at existing frameworks, international frameworks and other
11:02
relevant policies and documents. And we used this to generate a set of groupings of
11:09
topics, themes and concepts. So looking at the different things that were coming up constantly in the literature
11:15
that seemed to be important, we grouped these together roughly into what we call a concept map,
11:20
and that acted as the basis for the workshops, to give people a starting point so they could look at the kinds of
11:27
technologies we were talking about under those different main areas and spark the debate, really.
11:34
So if you move to the next slide. We carried out the workshops online, as it was during the pandemic, and we used
11:40
a tool called Miro, which is an interactive board that allows multiple users to work at the same
11:46
time on the same page, basically. We also put people into breakout rooms, and the series of workshops
11:51
targeted different stakeholder groups. So the first one was really around people like the Topol fellows;
11:58
we had NHS clinical entrepreneurs. In the second group we had
12:03
industry representatives, so this was Babylon Health, Google Health, Barclays and Bupa, and the final workshop
12:09
was focused around subject matter experts. So we used each of these three workshops to use the topics to spark
12:16
discussion and consider what the different capabilities might involve, and then we were able to rank these in
12:22
order of importance and complexity. Next slide, please. We used something called the nominal
12:27
group technique for the workshops, and this is quite a useful technique when you've got people that aren't familiar with each other, or you
12:33
might have power-dynamic imbalances. So essentially you've got this nominal phase, where you privately consider
12:39
the information, and we did this offline prior to the workshops; and then we have an item generation phase.
12:44
This is all around ideation, so people come up with ideas without being interrupted by others, and
12:49
we captured that on Post-it notes on the Miro board. Then you go back around to the clarification
12:54
and discussion, where you can probe into the different ideas and ask people to explain them.
12:59
And then finally there's a voting stage, where you're able to order the priority of the different items.
13:05
So we used this to generate a draft version of the framework. Next slide, please. And then we sent that draft
13:11
framework out via a survey for wider participation, so we could get more people to give us feedback.
13:17
We took that feedback on board and then constructed the final version of the framework that you can see in
13:22
the report. Next slide, please. So the framework, as mentioned before, is built on top of the original
13:29
digital literacy framework, and that forms the foundation. Then on top of that, we've got a lot of skills around data:
13:35
obviously a lot of these advanced technologies, wearables, AI, machine learning, are all built on an
13:40
understanding and use of data. Then on top of that, we've got those technologies themselves. And at the higher end,
13:45
we've got things like artificial intelligence. So it's built up in that
13:51
way, basically. And it straddles the space between the original digital literacy framework,
13:57
which is very much around basic digital competencies, you know, can you switch your machine on, send emails and do all these
14:04
fundamental digital things, and, at the other end, specialist frameworks for specialist groups like informaticians.
14:09
And this framework very much straddles that space in between the expert frameworks and the very
14:15
fundamental digital literacies. Next slide, please. So the other problem we had here is: how do you make a framework
14:21
when you've got so many different types of roles in the NHS, so many different types of workers in the NHS workforce?
14:27
It would be quite a challenge to map these capabilities onto all those different working groups.
14:32
And the other problem is that some of these workers will have different roles. So you might be a clinical nurse, but you might also be involved in
14:39
informatics projects, for example; you might wear multiple hats. So to get around this, we used archetypes instead.
14:45
Essentially we mapped all the capabilities onto archetypes, and then people can self-identify
14:51
which archetype or archetypes they belong to, or their managers can do this as well.
14:56
And the archetypes include things like shapers, so this can be people in leadership positions or arm's length bodies.
15:03
We've got drivers, so this can be your CIOs and CCIOs; creators, so these are people that are actually
15:08
creating some of this stuff, engineers, data scientists; and then we've got embedders, so these are people who are actually embedding
15:14
some of these things into the various systems, so IT teams and so forth; and then we've got the users as well,
15:21
so people actually using the technologies. And it is possible that you can come under one or more of these different
15:28
archetypes at different points. Next slide, please. We also used something called
15:33
Bloom's digital taxonomy. For any educators out there, you're probably quite familiar with Bloom's. It's quite a popular framework that's
15:39
often used in education, and this is a digital version of that framework. And we mapped all of the
15:46
different capability statements onto Bloom's taxonomy as well. It really involves moving from
15:51
lower-order thinking skills through to higher-order thinking skills. So at the lower end you've got
15:57
things like remembering and basic understanding, moving through to application, analysing, evaluating
16:02
and then creating. So we used Bloom's taxonomy across the framework and
16:08
through the different sections as well. Next slide, please. So the framework itself is split
16:14
into a number of key domains, and these domains include things like digital implementation,
16:19
digital health for patients and the public, ethical, legal and regulatory considerations, human factors, health data management,
16:26
and artificial intelligence. Next slide, please. And a number of these domains
16:32
also have subdomains. So you can see there, for example, that they break down further. So AI includes things like robotics.
16:38
We've got things like management and leadership under human factors, and ethics and regulation
16:43
under the legal issues, and so forth. And inside each of these, we've got a number of individual
16:48
capability statements. Next slide, please. So under each of these
16:54
domains and subdomains, we've got a number of statements split into 4 levels. They're split into 4 levels to
16:59
make this compatible with the original digital literacy framework, so it's a familiar structure, and the levels really just indicate
17:06
increasing complexity or difficulty. So level one is going to be easier than level four. And then within each of these levels,
17:12
you've got the actual capability statements themselves, and then these are mapped onto those different archetypes that
17:18
you can see at the bottom there. So that's it, that's a quick whistle-stop tour through how we designed
17:23
the framework and what the framework consists of, and I'll pass back to Hatim. Thanks, Alan.
17:28
And a key message here is: it's great, we've got a framework, we've got an idea at a very early stage
17:34
of what these capabilities might be. But how do we make sure that, one, it's sustainable and, two,
17:40
we get the impact we need, in terms of me being in my clinic room as a GP able to practise the skills
17:45
that I would need to work with the technologies that I'm interacting with? So the first thing is
17:51
to say that technology is fast adapting, and in our framework we've done our best to make sure that we're technology agnostic.
17:56
But we need to make sure that we continue to keep live to advancements and developments in this area. And so we're going to be doing
18:02
some work to make sure we have a mechanism in place to continue to review and refresh the capabilities within the framework, as well
18:09
as building new areas as things emerge in policy and in healthcare.
18:14
The second part is we want to empower individual learners to be able to use this framework. So it's about embedding it into
18:21
existing Health Education England learning platforms and tools, such as the Learning Hub,
18:26
so that individuals can really measure their own learning and their own aspirations for where they want to get to, and that will then drive them
18:32
forward in terms of what kind of skills they develop based on the material out there. And then the final part is to be able
18:39
to make sure we're working with educational bodies, people like Sonia, who's working in a higher
18:44
education institution, or our royal colleges or professional regulators, to be able to support the educational
18:51
reform that we need as a result of the learning that we have developed over the past year and a half in developing this
18:56
framework, so that we know that when I'm entering my GP training,
19:01
I have it quite clearly within my remit to develop these skills naturally within the capabilities that I need
19:07
to build as part of becoming a GP. I hope that's been a helpful overview of the framework, but I'll pass back now
19:15
to Adrian for the discussion. Thanks, Hatim, and thanks, Alan, for a kind of
19:22
lightning tour through the rationale, background, development and deployment
19:28
of the framework. So thank you very much. What we'd like to move on to now is a
19:36
discussion on how we can implement that framework in undergraduate
19:42
and postgraduate training. So I'm going to turn
19:49
to Hatim and Alan and Sonia. I mean, we have got this, you know, this funny triad between,
19:55
if you like, the individual, the framework, the places individuals are in,
20:01
for example, you know, postgraduate or undergraduate courses, and
20:07
we've got the changing landscape as well. So we've got lots of moving targets,
20:12
and of course we've got a regulatory framework to navigate as well, because healthcare
20:18
is a highly regulated field, for obvious and very good reasons, which may not always be
20:24
quite as adaptive, I would imagine. And I don't know if anyone would like to comment on some of
20:30
the difficulties that that throws up, you know, maybe around assessments or so on.
20:39
I can go first, then. So when we kicked off this piece of work,
20:44
I think we made it very clear at the start that we needed to be engaging with educational bodies right from the start, to help them understand,
20:50
one, why we're doing this and, two, how they might use the product at the end. And an early example of
20:55
where that's been in effect is the British Institute of Radiology. So we did a piece of work in January
21:01
of last year that looked at AI and data-driven technologies in the NHS and what workforce groups they affect.
21:08
At the top of that tree we saw radiologists, and near the top healthcare
21:13
scientists as well. And in further conversations off the back of that with the British Institute of Radiology,
21:19
we were able to say, well look, this is going to be really important for your membership; it's going to be really important for those that are working in the professional
21:25
groups that you're responsible for. What can we do to enable the learning that these groups need to have to be able to
21:32
work with these technologies? And we've got a webinar series and some learning materials that are being developed by the British Institute of
21:38
Radiology, and they're launching at their AI congress as well in a few months' time.
21:45
The key is to find the bodies that, you know, really value the
21:50
importance of this and are looking to work with us to build some of that proof of value for
21:56
the learning in this space. Thank you. So it sounds like some colleges
22:04
are kind of acknowledging this. And, you know, sometimes we say in education that assessment drives learning,
22:10
and therefore if you're going to be asked about it in the exam, that's quite a powerful driver.
22:18
Clearly a lot of the workforce is not in training or education, but is post-training, as it were,
22:25
in service roles, but still needs to know, so the, if you like, examination
22:32
pressure to make you learn is slightly less urgent.
22:38
But I'm just wondering about, for example, stuff like
22:43
finals for undergraduates. You know, what's the inclusion of the digital
22:50
agenda in that, and how might this framework relate to that?
22:57
Sonia, can you? I know you're a GP, but I'm thinking of things like
23:03
licensing exams and stuff like that. Yeah. No, I've been involved in medical education for quite a number of years.
23:09
I mean, I have to say, first off, I'm really excited by this, because it's a very clear outline of
23:15
the domains that you need to consider with digital health technologies. But I think, equally,
23:21
I'm also quite worried about how the health service is moving
23:27
at breakneck speed in how we're adopting digital technologies, and indeed how society is as well.
23:34
We all know that there's Google, there's wearables, there's apps. You know, digital health is part of our everyday lives.
23:39
But yet when you look at the training needs and how it's being integrated into undergraduate curricula,
23:46
and that's across the health professions, and postgraduate curricula, you do start to think that actually
23:53
digital health at best is sometimes mentioned; it isn't a strong theme. And I think
23:59
one of the really beautiful ways of highlighting this is the medical licensing exam, which comes in for medical
24:05
students in 2024 and doesn't really mention digital health, even though it does have an
24:11
area around capabilities. I did a bit of a look yesterday, and I was putting in words like technology,
24:16
digital, remote consulting, anything that could encapsulate what we're talking about today, and it
24:22
just isn't reflected. And that's new; that hasn't even been launched yet,
24:27
that's coming out in 2024. So that disconnect between what society is moving ahead with, what
24:35
the NHS and HEE are moving ahead with, but yet how educational bodies, and
24:40
that's undergraduate and postgraduate, are somehow lagging behind, I think will be a problem
24:46
not only for dissemination of this framework; actually the bigger thing is how we're supporting our
24:52
patients. Rather like you, Hatim, I remember a patient coming in with their genome profile, and I
24:57
had a student in with me. You know, I was totally out of my depth in how to counsel the patient about their
25:02
risk for various conditions. So not only is there a training need for our pipeline, our students;
25:08
there's a huge training need for our trainers. You know, who is going to be
25:13
teaching our students all of these six domains around digital health? So I don't want to use the word emergency,
25:20
but I do think there is a digital health emergency that we need to address.
25:25
Thank you. So that's a really powerful, almost a call to action, isn't it, that we need
25:32
to catch up across the system. And maybe this reflects a wider
25:37
societal issue, where we've got the kind of inexorable and ever-quickening march of technology, and
25:44
across society we struggle and are playing catch-up with it, from a kind of legislative point of view.
25:53
And, if you like, this is an aspect of that in a part of medical education,
25:58
or healthcare education and practice. So I think that's a really
26:04
powerful observation. And we have got this strange situation, have we not,
26:09
and I'd be interested to hear people's comments on this, that everything is moving really quite
26:16
rapidly now. Normally a lot of healthcare knowledge and understanding
26:22
is sort of held behind a kind of a bit of a mystic shroud of learning,
26:27
isn't it? So we've had this aspect of the doubling time of medical
26:33
knowledge: it used to be 50 years, and then 25 years, and then ten years,
26:39
and I think it's currently at about 70 days and shrinking. But technology
26:44
is often released in a commercial setting first and then adapted for healthcare, rather
26:50
than the other way around. Actually, if you like, our public are way ahead of us in terms of their use,
26:57
and often their sophistication, certainly for some parts of the population. So I think that's another challenge.
27:04
How do you think the framework will help our healthcare workforce, if
27:11
you like, map their progress in their learning journey, to help equip them to meet that challenge?
27:17
Perhaps I'll ask Alan, because you described the construction of the framework.
27:24
Yes, thanks. Actually, I think it has the potential to do that, definitely. We've
27:30
certainly put parts of the framework and mapped them against our new clinical data science programme, for example. So we're trying to
27:37
embed these things in some of the postgraduate work that we're doing. We've also got a lot going on in
27:42
nursing at the University of Manchester, so particular modules and courses. You know, a lot of them tend to
27:48
be postgraduate focused, though, because there's a very crowded curriculum in a lot of these medical professions.
27:54
Between medicine and nursing they're always putting more and more things in, and obviously digital is very important,
27:59
but often we're seeing, you know, it's maybe not getting the attention that it deserves. But some people
28:05
are also trying to embed that into normal kind of practice and put it into other units and other
28:10
things, which I think is a good idea. It's another way to embed some of the digital stuff in there as well. So we're seeing more and
28:17
more adoption of these things, and it can be incorporated into other modules and interdisciplinary learning as well, when we're working
28:22
with other professional groups. Because that's what happens in reality, isn't it? You're going to use this technology
28:27
a lot to communicate with other groups and other departments. And really we need to start embedding this early on
28:33
in the undergraduate and postgraduate curriculum. So I think definitely having that framework, and the ability for people
28:38
to look at it and see what those requirements might be, certainly gives educators something they can start to work with and
28:44
start to make sure that they're including some of those main elements. So: early adoption,
28:51
dissemination and uptake, key themes, I think, coming out
28:57
of your answer. And how might you see that, for example, Hatim
29:02
or Sonia, in GP training, given that that's both your clinical backgrounds? Is that something you've seen?
29:10
I don't want to make presumptions, but I suspect Hatim is slightly closer to
29:15
training than Sonia, but maybe not by very much. But from your point of view, Hatim,
29:22
have you seen that, or Sonia, have you seen that in practice? So my reflection from my training is I think that we're on a journey
29:28
here, similar to the one we were on with leadership and with quality improvement, which are areas that have fallen quite naturally into the
29:35
GP portfolio, the GP kind of workplace-based assessments. And when I was training,
29:41
whilst there wasn't a specific section around digital, I made sure that I did a lot of
29:47
reflection on the way that I used digital tools in the way I interacted with my patients, and
29:52
I spent a lot of time thinking about whether that actually made things better. Did that help the case,
29:57
or did that actually make things worse? Was that the right modality to choose to communicate with that patient, and was I excluding
30:03
them from care by using that modality? And I spent a lot of time reflecting on these things naturally, purely because it was
30:10
important to me, but I think we do need to create the conditions within the portfolio to support people
30:15
to be able to do that reflection and to understand that better. Because ultimately this
30:22
is all about safe, ethical patient care, and to be able to deliver safe, ethical patient care you need to be competent in
30:28
working with the tools that you're working with and understand their strengths and their limitations.
30:37
I suppose I'd just add, from an undergraduate perspective: um, you know, I wrote a paper around this, and the evidence around
30:44
how you teach digital health, how you actually embed this in a
30:50
curriculum, is quite sparse. So I think there's a real gap there: how do we
30:56
actually get this information, and the skills and values around digital inclusion, out to our students?
31:01
Clearly PowerPoint's not going to do it. You know, teaching our students all of this in lectures isn't going to do it.
31:08
I mean, one thought that I have, and this is bringing in my experience
31:13
around community-based learning and being a GP, and also just building on what you said,
31:19
Hatim, around quality improvement: previous to my role at the University of Leeds,
31:25
I was at Imperial College London for 10 years, and one thing that we did there is that year three
31:30
medical students did what we call community action projects, and we sort of focused these around hot topics
31:36
such as COVID vaccine hesitancy. And just one thought is that students could do quality improvement projects
31:42
with communities, in which they learn the knowledge base and the skills base around digital health,
31:47
but they do that through working with communities, upskilling communities in digital literacy.
31:54
So you sort of have a double win there: not only are the students learning, but actually they're learning through service.
32:00
Because I do think that we need to think about training needs not just of our healthcare professionals, but also the
32:05
gaps in our patients. We need to empower them, so that when they are coming with information, they've been able to do a little
32:12
appraisal of the information themselves, and they're not spending huge amounts of money and time on
32:18
digital health technologies that may not be best for their health. Thank you,
32:23
Sonia, that's a really helpful insight, actually. So we do have a question, and thank
32:28
you for the questions and comments which are coming through in the chat. We have a question from Jane Daly,
32:36
that was at 12:29, for members of the panel that want to look at the
32:41
question before I fire it out to you. So: digital first will only be
32:46
integrated or embedded if workforce contracts and the reward and recognition system are revolutionised.
32:54
How does this align with critical and strategic workforce planning? I've got a horrible feeling, as I read that, that it might come back my way,
33:00
but does anyone want to start off with a response to that? Yeah,
33:06
I can kick us off, in that case. It would be great to have your views on that as well. But I guess the way I see this is:
33:12
it's important to note that this is a particular group that we haven't necessarily focused on just now. We're talking about undergraduate
33:18
and postgraduate training, but actually there's a whole group, around continuing professional development, who are out there working
33:25
in the NHS, who will equally need to have these skills looked at and be supported to
33:30
keep their skills up to date, or develop their skills where gaps lie. And I think the key here
33:36
is the culture of an organisation: top-down leadership, in terms of saying it's important to
33:41
develop the skills in this area; making sure these things are built into annual appraisals, so that
33:46
you're able at some point to look at your digital literacy, using something like the digital self-assessment
33:52
tool that we're developing at Health Education England and piloting in the north of the country,
33:58
to be able to say where am I right now and where do I need to go, and have a really open and frank conversation with your line manager in terms of
34:05
how you then develop those skills, and why it's important that you develop those skills. And if you have all of that kind of naturally happening
34:11
within an organisation, you're going to be more digitally mature as an organisation. And so it's important that we work with
34:18
providers to be able to enable that. Thanks. Thanks,
34:24
Hatim. Yeah, I think workforce planning is really quite a complex thing, isn't it?
34:29
Because a lot of planning is, there's kind of short, medium and long-term planning,
34:35
and some of that planning, the long-term planning, assumes, or,
34:40
if you like, has foresight, that there will be a great deal of
34:47
digital technological change. And yet it can't be exact in articulating exactly what that
34:54
would look like, or how it affects, if you like, you know, what's often known,
34:59
in your workforce planning cycles, as productivity, or your workforce requirements, or
35:04
your learning requirements, even. So it becomes really quite an inexact
35:09
science at that stage, and, as we know, current progress is not
35:14
always a good predictor of future growth in the area. So you'd think it should be quite
35:21
easy; I think the answer is it's actually very, very difficult to do
35:27
accurately beyond very broad assumptions. I think that's one of the issues. So it's a really good question,
35:34
highlighting some of the difficulties in trying to do that. I think reward
35:40
is a really useful kind of local example of how you can reward
35:46
your workforce for training and pursuing that knowledge journey
35:52
and competency and capability journey for digital. And we
35:58
know, for example, there are areas of practice, so to take the example
36:06
of diagnosis of stroke, where AI technologies for imaging are used to diagnose
36:13
strokes which are amenable to intervention; I think that's my understanding of the technology that's being used.
36:19
You know, that's grown from about 10% uptake a couple of years ago to about 70% of units using it,
36:26
so there's rapid growth there that would be quite hard to predict,
36:32
although it's incredibly welcome. And some of the other technological
36:38
advancements, which require greater interplay with, if you like,
36:44
individual skill, might take a bit longer, and of course need
36:50
guarantees and regulation, because, you know, you don't want to be doing robotic surgery on people if you're
36:57
not properly qualified to undertake that procedure, for example. A very simplistic view.
37:02
So, Alan, I think you have your hand up, so please do come in.
37:08
I think Hatim was first. No, that's fine. I took it off, I took it off, go for it. I was just going to,
37:13
yeah, I was just going to say, another thing that I think is quite important with this as well is, we talk a lot about digital
37:18
literacy, but as the technologies get more advanced, they're often closely associated with data. So there's this concept of data
37:24
literacy as well, where, you know, if you're not putting the right data into it, or not doing that
37:29
in the right way, obviously what you get out of it can be affected. So I think that's another key thing: also having
37:35
access to this data for people to learn from as well, and to learn how to use data. And therefore, you know,
37:40
it's not just the tools that we need to teach people; it's the data that goes into the tools, and how that's collected and
37:46
maintained, as it were. And we often have trouble with that in academia, getting access to real data sets and
37:52
things like that to train people on. So we're looking at things like synthetic data and, you know, using things like electronic health
37:58
record systems with sort of fake data and things like that. But again, the sooner we can get people using
38:03
some of these tools, and the data that's associated with them, and getting them comfortable with using data, that's going to help us
38:10
in this area, I think. Right. So there's a really good question
38:15
from Catherine Woolley, at 12:34 for the timestamp-aware amongst
38:20
the panel, which says: do we need to upskill the trainers first?
38:26
You can't teach effectively something you don't understand yourself. Which is a fantastic question,
38:33
Catherine, and incredibly true. So, who would like to have a
38:38
go at answering that? Hatim, I saw the ghost of a nod there,
38:45
so that means it's you. No? I'd love to hear Sonia's opinion on this as a senior educator,
38:51
and what your thoughts are, in terms of: one, I'm hoping we're all gonna say yes,
38:56
and two, how do we then do it? Well, so I suppose just turning this on its head. So this is pre-COVID, I think 2017,
39:04
2018, or maybe no, it was around 2019. It was around the time of the Topol review.
39:09
I set up a module for medical students called Digital Health Futures. And looking at it, it wasn't particularly forward-thinking,
39:15
but you know, it was on the basis of the Topol review, which was where we really started to embed
39:22
some of this learning for medical students. And what became apparent, exactly as you say, Catherine, is that none
39:28
of us really knew as much as the students. And so that really is where the light bulb hit: I do wonder
39:34
whether it is the new generation, it is our students that will be upskilling us.
39:39
Obviously we don't want to do complete reverse teaching, but I do think there is something about co-creating any curricular
39:46
changes with our students. They are so completely savvy, not only with digital tech in
39:51
their lives on their smartphones, but also more broadly. There are a lot of students that are really,
39:57
really excited about digital health and know a lot about it. And so when we ran this module,
40:03
the students were absolutely teaching us. And when we developed the module and presented
40:09
some of our work at conferences, we were very much working alongside students. So I think the how has to be with the
40:16
new generation who have been brought up with digital education and digital health.
40:22
Great. And of course there's nothing to stop our educators from using the same capability and competency
40:28
framework themselves, plotting their own journey and making sure they're teaching to the
40:34
right level. As you say, you've got to understand a subject properly to be able to teach it well.
40:41
I think that's one of the observations. So we're just coming to the end,
40:46
we've got the last couple of minutes. So there's one very quick question, John, hoping there will be a really quick answer
40:52
to it, which asks about the digital literacy
40:59
self-assessment, from Kerry O'Reilly: is the digital literacy assessment available for wider use?
41:08
So yeah, to my understanding the digital literacy self-assessment tool is currently being piloted and it's
41:14
not open for wider use as of yet, but hopefully it will be soon, and I'll share a
41:20
link in the chat to the website so that people can stay updated on its progress. That's brilliant, fantastic.
41:28
There are lots of really interesting and insightful comments on the chat.
41:36
And I can reassure you, as we approach the last minute of the webinar,
41:43
because I think we log off at 12:45, that
41:50
a recording of the webinar will be made available and it will
41:57
be on the DART-Ed web pages, and we'll add a link in the chat, I'm sure,
42:05
which I hope will come soon. And there's a Twitter channel,
42:10
@NHSDigAcademy, so I think this conversation and
42:16
the developments can be followed on Twitter.
42:21
And I hope the link can get posted on there as well.
42:26
So I would really, really like to thank our panel today.
42:32
I'd like to thank Sonia for your input and development of this,
42:38
to thank Alan similarly, thank you so much, and I'd like to thank Hatim for
42:43
really helping coordinate and drive a lot of this in HEE. So thank you so much.
42:49
I'd also like to thank Beth and Emily, who you won't see on this webinar,
42:54
but basically without their abilities to organise and corral the
43:01
four of us into a room, albeit virtually at this time, none of this would happen.
43:06
Thank you so much for listening and tuning in and I hope we'll have further conversations
43:12
and look forward to you all joining us in the future. Thank you.
43:17
Good afternoon.
Media last reviewed: 3 May 2023
Next review due: 3 May 2024
Page last reviewed: 12 September 2022