Successive governments have sought to improve access to general practice, or the working lives of GPs, via a mix of technological, patient self-care, administrative and staffing interventions. Much newer, however, is the idea that Artificial Intelligence (AI) has a meaningful role to play. AI refers to machines that perform tasks normally requiring human intelligence, particularly by learning from data to find patterns and make predictions.
The study for this report aimed to determine the proportion of GPs in the UK currently using AI in their clinical practice, and to explore how they are using it. We sought to understand the range of tasks for which GPs use AI, the perceived benefits and concerns, and the barriers and enablers to greater adoption of AI in general practice.
Examples of such tasks range from transcribing speech from patient consultations into summaries and patient notes, to producing responses to patient queries, through to tools that identify potentially serious skin conditions from images. We used a mixed-methods approach, combining a nationwide survey (as part of the Royal College of General Practitioners’ annual GP Voice Survey) with a series of online focus groups. This research is the largest and most up-to-date survey of general practitioners on this topic, giving insight into AI use just as the NHS 10 Year Health Plan commits to rapid expansion of AI use in the NHS.
Key findings from the survey include:
- Of the 2,108 GP survey respondents, 598 (28%) said they currently use AI tools in their clinical practice. Breaking this figure down, 13% of all GPs use tools provided by their practice, 11% use tools they have obtained independently, and 4% use a combination of both.
- There is significant variation in AI adoption among GPs across different demographic groups:
- Male GPs were significantly more likely to use AI than female GPs. Of the 848 male GPs who responded to the survey, a third (33%) said they used AI. This compared to a quarter (25%) of the 1,184 female GPs.
- GPs working in socioeconomically deprived areas were less likely to use AI (and in particular practice-provided AI). Of the 1,046 GPs who said they worked in more deprived areas, just over a quarter (27%) said they used AI tools, compared to over a third (35%) of the 467 GPs who said they worked in more affluent areas.
- Additionally, GPs in England were more likely to use AI (31%) than those in Scotland and Northern Ireland (20% and 9%, respectively). GPs in Wales were more likely to use AI (28%) than those in Northern Ireland (differences with Scotland were not statistically significant).
- Younger GPs were more likely to use self-obtained AI tools than relatively older GPs. Of the 461 GPs aged under 35, 15% reported using AI tools they had brought to work compared to 11% of GPs aged 35–54 and 8% of GPs aged 55 or more. GPs aged 45–54 were more likely (15%) to use practice-provided AI tools than those aged under 35 (11%) and those aged 65 or over (6%).
- Among the 597 respondents who reported which tasks they use AI tools for, over half (57%) were using AI tools for clinical documentation and note taking. Around four out of 10 GPs use AI tools for professional development (45%) and administrative tasks (44%), but fewer use AI tools to support clinical decision-making (28%) – though focus groups suggested that GPs are actively testing AI tools for this purpose.
- GPs want AI tools to handle routine, time-consuming tasks reliably, allowing them to focus on complex clinical reasoning and meaningful patient relationships. This preference was reflected in the areas the 2,108 GPs selected when asked to prioritise up to three areas for AI development to focus on over the next two to three years.
Regardless of their current use of AI, participating GPs at all career stages expressed concerns about AI adoption in general practice, namely:
- Professional liability and medico-legal issues (89% among non-users, 80% among those who use practice-selected AI tools, and 80% among those who use self-obtained AI tools)
- Lack of regulatory oversight on AI (88% among non-users, 78% among those who use practice-selected AI tools, and 74% among those who use self-obtained AI tools)
- Risks of clinical errors (83% among non-users, 69% among those who use practice-selected AI tools, and 70% among those who use self-obtained AI tools)
- Patient privacy and data security (82% among non-users, and 69% among both those who use practice-selected AI tools and those who use self-obtained AI tools).
Focus groups revealed that the greatest benefit GPs experience from AI is saving time and reducing administrative burden. While policymakers hope that this saved time will be used to offer more appointments, GPs reported using it primarily for self-care and rest, including reducing overtime working hours to prevent burnout.
Focus groups echoed survey findings. GPs emphasised a lack of regulatory oversight of AI as a major concern, as well as misleading or incorrect outputs (‘hallucinations’). Beyond ambient voice technologies, for which guidance has been developed for the NHS at the national level, the implementation of AI in general practice appears to depend heavily on local policies developed by individual practices and Integrated Care Boards (ICBs), as well as local staff willing to test tools and share their learning. But practice is inconsistent across the country, with focus group participants suggesting some ICBs forbid all AI use and others actively encourage safe use and piloting. GPs highlighted the need for clear national standards, supported by local policies and aligned training.
To address variation in AI adoption and encourage responsible use, policymakers in England will need to:
- Work towards rapidly establishing evidence-based national guidance to address variation and inconsistency across ICBs and avoid a postcode lottery. This guidance should cover administrative AI tools and clinical decision-making AI tools, as well as generative AI tools.
- Immediately clarify professional liability, safe AI use, and regulatory and governance frameworks. This should be done by a consortium of national policymakers and professional and sector regulators, and (like the guidance discussed above) should cover AI tools that are considered medical devices as well as those that are not.
- Develop comprehensive and structured training and education programmes within medical education, as well as for postgraduate NHS staff. This should be funded nationally by the Department of Health and Social Care, with regulators and Medical Royal Colleges involved in standardising and specifying the content of this training.
- Use research into the impact of AI to set realistic ambitions about the potential benefits of AI. The 10 Year Health Plan suggests AI will radically improve patient access, but this study highlights a need to recognise that some of the time saved will reduce clinician overtime (and/or workforce burnout) rather than immediately equate to more appointments.
- Take action to mitigate the risk that AI use widens health inequalities. This should include addressing the findings that GPs in more deprived areas were less likely to have access to practice-provided tools, and that some AI tools do not support minority languages.
- Consider the environmental impact. AI adoption may increase carbon emissions and electronic waste, conflicting with NHS and RCGP net zero goals. National guidance is needed to align AI use with environmental priorities.
These recommendations may also have relevance to policymakers in Scotland, Wales and Northern Ireland.
To ensure that future tools address the needs and concerns of GPs, AI developers and tech suppliers will need to:
- Focus on developing AI tools that save GPs time in their routine work, assisting with automating administrative tasks and clinical documentation, not replacing clinical judgement.
- Integrate tools seamlessly with GPs’ electronic patient records, rather than offering standalone or bolt-on products.
- Address and reduce hallucinations and ensure their risks are emphasised in training.
- Co-design tools for GPs with diverse groups of GPs and other practice staff.
Given that many GPs are already using AI, GP leaders and GPs in practice will need to:
- Develop interim local AI practice guidance until clearer national guidance and regulatory and governance frameworks are available. Encourage open discussion of all AI use within practices, and consider developing clear local protocols which might cover patient consent, approved tools, and reporting of adverse events.
- Test, learn and share collectively. Where possible, allocate time to evaluate tools, share learning across networks, report problems with tools (especially those that led to errors in care), and share resources such as policies and case studies.
- Help to educate patients about the use of AI in practice. As AI becomes integrated into general practice, it will be important to explain the role that AI is playing in practice while also maintaining the human elements of care that patients value.
The government sees improving access to and experience of general practice as a key priority, with AI expected to play a major role. Our study finds that AI has the potential to enhance patient care and reduce GP workloads, but benefits are not guaranteed, nor is rapid adoption imminent. Currently, 28% of GPs across the UK (and 31% in England) use AI tools, yet guidance varies widely: some ICBs urge caution, while others encourage experimentation with approved tools. Concerns about regulation and liability, as well as a lack of national guidance, remain significant barriers.
Policymakers hope AI will free GPs to offer more appointments, but most GPs use the time saved to reduce overtime and burnout, so expectations may need to be re-examined. Successful implementation will require addressing system-level issues, including clear guidance, training, equity, and safeguarding professional values. Policymakers and those involved in AI innovation need to act now.
We are grateful to Optum UK for providing additional funding to support Nuffield Trust in running the focus groups and preparation of this report.
Suggested citation
Kumpunen S, Lobont C, Garrard L, Fisher D, Lau R, Fallica G and Fisher R (2025) How are GPs using AI? Insights from the front line. Research report, Nuffield Trust