GPs are using AI to write ‘false apology’ letters to patients who complain, investigation finds

British doctors are using AI to respond to patient complaints in a bid to make their jobs easier, according to a medical defence organisation.

A report by the Medical Defence Union (MDU), which offers doctors legal advice, warns that ‘some doctors are turning to artificial intelligence programs like ChatGPT to draft complaint responses for them’.

The body says doctors have been ‘allured’ by the opportunity to ‘make everyday tasks easier’.

But this not only risks handing sensitive patient information over to an AI; it could also introduce inaccuracies and upset patients further, the MDU warns.

The group told MailOnline it had seen a ‘small number of cases’ where medics were using AI in this way, and said it was issuing a general ‘proactive’ warning to its members.
It said medics should be particularly wary of AI producing a ‘false apology’, in which a complaint is addressed in generic terms such as ‘I am sorry you feel your care was poor’, rather than responding to the specific points a patient raises.

Dr Ellie Mein, an MDU medico-legal adviser, said it was understandable why medics, such as GPs, were turning to AI as a potential time-saving tool.

‘In the face of increased complaints and immense pressure on the health service, it’s only natural for healthcare professionals to want to find ways to work smarter,’ she said.

‘There are many ways in which AI technology is being used to improve the quality of patient care, such as in health screening. 

‘But when responding to patient concerns, there is no substitute for the human touch.’

She said the MDU was aware of cases where patients had found out their doctor had used AI to respond to their complaint.

Before using such tools, medics should consider how they would feel if a patient confronted them about it, she added.

‘There have been cases where recipients who were suspicious of the wording in a complaint response were able to reproduce the same text by asking AI to draft a similar letter,’ she said. 

‘Would you feel comfortable in this scenario, and would the patient feel you had taken their complaint seriously?’

Dr Mein added that while using AI as a prompt to get started on a complaint response was acceptable, there were a number of pitfalls to avoid, and the generated response should not be relied on as written.

She said medics needed to watch out for inaccuracies, and for American spellings stemming from many AI tools’ US origins, which could clearly give away that a medic was not responding to a complaint authentically.

Additionally, medics should under no circumstances provide an AI with confidential medical information about a patient, as doing so could fall foul of UK data protection laws.

Dr Mein also warned that AI-generated responses can omit information a medic is duty-bound to provide, such as who a patient can contact, for example a regulator, if they feel their complaint has not been addressed.

It comes as the number of official patient complaints to NHS medics has reached record levels.

Almost 229,500 written complaints to the NHS were received in 2022-23, the latest year for which data is available.

This is a 41 per cent increase on the 162,000 recorded a decade earlier.

Of the 229,500 patient complaints recorded in 2022-23, the majority (some 126,000) were made to GPs or NHS dentists.
