
While Spain is going all in on artificial intelligence to free up doctors for real conversations, the UK is hitting the brakes over safety and privacy concerns.
Starting this summer, some Spanish clinics are testing a new AI-assisted system to improve patient check-ups. During the trial, the AI takes notes on everything the patient says, freeing the doctor to take part in the conversation and listen actively.
If the patient agrees, AI software will record and summarize the conversation and provide a draft of the visit. In the meantime, the doctor can focus on the check-up itself – checking respiratory and heart rates, swelling, reflexes, and maintaining human contact.
Spain's health minister, Mónica García, says the goal of this project is not to turn clinics into robots but to “give doctors back the time to really listen to patients.”
So far, this is only a pilot programme involving volunteer regions. However, the Spanish government has bigger plans: it aims to extend the practice to all primary care centres by the end of 2027.
Spain is currently piloting AI-powered transcription tools in primary care centres across several regions. Confirmed participants include the Madrid region, where the system is scheduled to launch in 2026 and could benefit over 6.7 million residents.
Private healthcare provider Quirónsalud has already begun trials in hospitals across Madrid, Galicia, and Catalonia, using an in-house tool called Mobility Scribe. While early feedback from clinicians has been largely positive, real-world use has revealed some issues with transcription accuracy, particularly with handling medical jargon and a continued need for human oversight.

The downsides of AI check-ups: the British experience
In recent months, the UK's health authorities have promoted the use of AI transcription tools for the same purpose – to reduce administrative burdens and free up doctors' time for patient care. For some practices, however, the rollout has gone wrong.
NHS England had to issue a warning to doctors and GPs to stop the practice because clinical staff were using unauthorized AI tools to record and transcribe patient conversations. The NHS highlighted concerns that some tech solutions are being widely used despite failing to meet essential data protection and clinical safety standards.
The warning points out that these non-compliant tools could lead to serious risks, including breaches of patient data, clinical errors, and disruption of NHS digital strategies.