Who is responsible? - How to guide and regulate health and social care professionals who use AI

29 Apr 2026

Insights from a University of Bristol and PSA workshop

“To what extent can or should professionals rely on, question, or entirely ignore an AI-generated recommendation? If something were to go wrong and the AI gave a bad recommendation, who would be held ethically and legally responsible if that recommendation reached the service user and led to harm?”

The deployment of artificial intelligence (AI) in health and social care settings raises specific professional, ethical, and legal questions for users. Representatives from regulators and Accredited Registers, as well as patients and service users, grappled with these questions and more at a recent workshop facilitated by academics from the University of Bristol. The workshop was commissioned by the Professional Standards Authority for Health and Social Care (PSA) in collaboration with Dr Helen Smith RN and Professor Jonathan Ives.

The session focused on identifying areas where regulatory clarity is needed and sharing best practices for ensuring patient safety and ethical deployment of AI. Through deliberative group discussions, and with the aid of a series of real-world scenarios, participants explored various themes including AI safety, bias, transparency and accountability.

The report recommends further exploring the need to:

  • Commit to career-long learning: Training should last an entire career, starting with a shared foundation and moving to specific technology and service skills, including Continuing Professional Development.
  • Improve reporting: Make it easier to report and learn from AI use. Create ways for staff and patients to raise concerns and work together to monitor technology after it is released.
  • Share responsibility: Avoid blaming only frontline staff for high-level system risks. Use shared accountability to prevent “moral crumple zones”, where individuals take blame for system failures.
  • Value diverse input: Include many different perspectives when developing and using AI. Use Patient and Public Involvement and professionals to spot and avoid problems and to keep improving services.
  • Stay flexible: Use a step-by-step approach with regular reviews so oversight can keep up with fast-moving technology.

We are using the findings to inform the PSA’s contributions to the UK National Commission on the Regulation of AI in Healthcare. The Commission is led by the Medicines and Healthcare products Regulatory Agency (MHRA).

Read the full research report.

ENDS

Professional Standards Authority for Health and Social Care

Contact: media@professionalstandards.org.uk

Notes to the Editor

  1. The workshop was held on 27 February 2026. It was commissioned by the PSA and led by academics from the University of Bristol. 
    The workshop brought together regulators, Accredited Registers (ARs), and patients and service users to discuss the opportunities and challenges of using AI. The report, published on 29 April 2026, provides an overview of the main points and recommendations from the workshop attendees.
    The Professional Standards Authority for Health and Social Care (PSA) is the UK’s oversight body for the regulation of people working in health and social care. Our statutory remit, independence and expertise underpin our commitment to the safety of patients and service users, and to the protection of the public. There are 10 organisations that regulate health professionals in the UK, and social workers in England, by law. We audit their performance and review their decisions on practitioners’ fitness to practise. We also accredit and set standards for organisations holding registers of health and care practitioners not regulated by law. We collaborate with all of these organisations to improve standards, sharing good practice, knowledge and our right-touch regulation expertise.
  2. We also conduct and promote research on regulation. We monitor policy developments in the UK and internationally, providing guidance to governments and stakeholders. Through our UK and international consultancy, we share our expertise and broaden our regulatory insights.
  3. Our values are integrity, transparency, respect, fairness and teamwork, and we strive to ensure that they are at the core of our work.
Find out more about our work and the approach we take.