Blog: Between Innovation and Integrity: Reflections from the National Commission on AI in Healthcare

Dr Vish Ratnasuriya MBE, Chair of Our Health Partnership.
Last week I sat down with the newly formed National Commission on the Regulation of AI in Healthcare, and for the first time in a very long time, I felt impostor syndrome.
Even in our first meeting, it was clear that there were extraordinary minds around the table: from ethics, engineering, law, policy, and patient advocacy. The kind of people who remind you just how complex and exciting this moment is, and a testament to the MHRA’s leadership in convening such a group.
That is exactly how it should be, because the question in front of us is not simply how to regulate a new technology. It is how we govern intelligence itself, both artificial and human, in ways that serve care.
Grounded in the Everyday
Most days I still see patients. That keeps me grounded in the realities of the NHS: the small acts of empathy, advocacy and compassion that make the system work, and the frustrations that come when it does not.
Alongside that, I lead Our Health Partnership (OHP), one of the UK’s largest GP provider organisations. Over the last decade, we have created a learning system to strengthen, sustain and improve primary care through constant change, resulting in nationally recognised improvements in quality, safety and outcomes for our patients and communities.
Through our Primary Care Accelerator, I have also been working with innovators to bring technology safely and effectively into practice, helping them to translate promise into impact with safety, equity and spread in mind. Perhaps most of all, I have gained a deep understanding of the challenges innovators face.
That vantage point, sitting between provision, policy and innovation, is sometimes uncomfortable, but it is where the most valuable insights often surface. It is where ideas in strategy documents meet the texture of human lives, and where policy and regulation can either protect that relationship or accidentally stifle or erode it.
From Policy to Practice
I have recently contributed to the NHS 10 Year Plan and the Life Sciences Sector Plan, and previously served on the NHS Assembly, where a broad coalition of experts helped the centre work through some of the system’s most complex challenges.
Those experiences taught me how difficult, but how necessary, it is to align national ambition with on-the-ground reality. They also shaped my conviction that innovation only delivers value when it is anchored in care.
That balance, between pace and purpose, efficiency and empathy, is what the new Commission will have to get right.
Why Regulation Matters Now
AI is already woven into the NHS, from diagnostics to documentation, logistics to listening. The question is no longer whether to adopt AI, but how to guide it responsibly.
My early priorities on the Commission are probably predictable, but they matter.
Real-world applicability. If regulation does not help patients and professionals to benefit from safe innovation, it is not fit for purpose.
Safety and trust. We need regulation that gives confidence, ensuring that AI serves care, not the other way around.
Clarity of liability. As a practising clinician, I am acutely aware of how accountability can blur in an AI-enabled world. Who carries responsibility when a system learns or adapts: the developer, the deployer, or the doctor?
Wide-reaching debate. We need a national conversation that is pragmatic, informed and inclusive, so that public confidence and professional trust grow together.
Regulation as Stewardship
I left our first meeting encouraged. The Commission is clearly thinking not just about control, but about stewardship: the idea that regulation should be a learning system, not a static rulebook.
In a world of adaptive algorithms, regulation itself must adapt. That means frameworks that evolve as evidence emerges, oversight that values context as much as compliance, and an architecture that keeps people, whether patients, professionals or the public, at the centre.
But learning systems do not just change the machines; they change us.
Every time an AI model shapes a clinical decision or lightens an administrative load, it subtly reshapes the behaviours and judgements around it. Over time, clinicians may begin to internalise the logic of automation; patients may adjust what they disclose or expect.
Regulation therefore has to look not only at how AI performs, but also at how it performs on us. It must track these shifts in judgement, attention, and trust as part of the safety case itself.
This is what I think of as the human feedback loop: the reciprocal learning between technology, clinician and patient. It is where the real long-term risks of cognitive, ethical and cultural drift reside. It is also where the opportunity lies: to design systems that strengthen human care rather than erode it.
This is the work that will decide whether the UK can be fast because it is trusted, not reckless because it is fast.
Progress with Moral Purpose
In many ways, this work feels like a continuation of what we have been building through OHP and the Primary Care Accelerator: creating a space where practice, policy and innovation learn from each other in real time, including how behaviours evolve as technology embeds itself in care.
The NHS has always been at its best when it holds scientific progress and moral purpose in the same hand. If we can design regulation that is adaptive, relational and grounded in care, we can strengthen not only the NHS but also the UK’s contribution to global health innovation.
Closing Thought
I still feel daunted by the scale of what lies ahead. However, I am also optimistic that by learning how to regulate with our values, not around them, we can secure the future of both the NHS and the wider economy.
Because regulation, at its best, is not a restraint on progress. It is how a society decides what kind of progress it wants.
Alongside being appointed to the National Commission on the Regulation of AI in Healthcare, Vish was a member of the NHS Assembly, sits on the Birmingham Place Board in the BSoL ICB and is an Honorary Associate Professor at the University of Birmingham.

