Experts call for action on racial bias in healthcare technology

Regulations for the approval of medical and dental devices will change following new research which found bias against ethnic minorities, women and low-income groups in healthcare technology.

These findings were the result of a review of equity in medical devices commissioned by the Department of Health and Social Care and published on 11 March. The independent review found biases against several groups including black and minority ethnic populations, women and those from deprived communities.

The report warns that these biases could lead to adverse health outcomes for the affected groups. It also recommends solutions that the government and the NHS could employ to combat bias.

Pulse oximeters

The research was prompted by concerns that pulse oximeters gave less accurate results when used on patients who were not white. The potential problem was highlighted during the COVID-19 pandemic, when oximeters were essential for determining the severity of infections.

The review confirmed the suspected bias. It found ‘extensive’ evidence of pulse oximeters overestimating oxygen levels in people with darker skin tones. While the racial bias has not been found to cause harm in specific cases within the NHS, the report says that ‘potential for harm is clearly present’. It also notes that there is evidence of the bias causing ‘adverse clinical impact’ for black patients in the US.

Artificial intelligence

Artificial intelligence (AI) programmes were another area of technology found to be biased. Negative consequences noted include the under-diagnosis of skin cancer in ethnic minorities, heart disease in women and glaucoma in Asian people.

The biases are thought to result from development and testing processes which favour a ‘standard patient’, assumed to be white, male and affluent. For example, AI programmes may be trained on ‘unrepresentative data’ that lacks diversity.

Within dentistry, previous research has highlighted the potential for bias in AI programmes used for diagnostic assistance. Affluent populations are often overrepresented in data sets because they have greater access to dental care.

Professor Dame Margaret Whitehead is chair of the review. She said: ‘The advance of AI in medical devices could bring great benefits, but it could also bring harm through inherent bias against certain groups in the population, notably women, people from ethnic minorities and disadvantaged socio-economic groups.   

‘Our review reveals how existing biases and injustices in society can unwittingly be incorporated at every stage of the lifecycle of AI-enabled medical devices, and then magnified in algorithm development and machine learning.  

‘Our recommendations therefore call for system-wide action, requiring full government support. The UK would take the lead internationally if it incorporated equity in AI-enabled medical devices into its Global AI Safety initiatives.’

‘Ensuring equitable access’

The Medicines and Healthcare products Regulatory Agency (MHRA) is responsible for ensuring that all medications and medical and dental devices are safe for the public. The results of the review have prompted the agency to re-evaluate its stance on equity in devices. It now requests that approval applications for new devices describe how they will address biases.

MHRA chief executive Dr June Raine said: ‘The MHRA acknowledges that inequities can exist within medical devices and we therefore welcome the publication of Dame Whitehead’s independent review.

‘We are highly committed to ensuring equitable access to safe, effective and high-quality medical devices for all individuals, and the recommendations set out in this report will support and strengthen the impact of our ongoing work in this area.

‘We are committed to working collaboratively with government, regulatory bodies, healthcare professionals and stakeholders to address these issues effectively.’

Combatting bias

NHS guidance has also been updated to highlight the potential limitations of devices such as pulse oximeters due to bias. The government has pledged to work with the MHRA to ensure regulations keep patients safe regardless of their background. This will include work to combat bias in training data sets, to train NHS professionals in health equity and to improve transparency around bias from developers of medical devices.

Minister of State Andrew Stevenson said: ‘Making sure the healthcare system works for everyone, regardless of ethnicity, is paramount to our values as a nation. It supports our wider work to create a fairer and simpler NHS.’

