Health data must not reinforce social injustice, says report

Data-driven technologies can exacerbate existing health inequalities, according to a report published in The Lancet.

The article examines how structural inequalities, biases and racism can be easily encoded in data sets and in the application of data science.

It argues that factors such as the design, input, analysis, and application of data science can be affected by racism. For example, the way in which analytical problems are framed and selected is shaped by the availability of funding and by the interests and backgrounds of those planning and conducting analyses. 

The underrepresentation of black scientists, and evidence that they are less likely to receive funding, can lead to health data science having a “white and western lens”, the article says. Ethnic minorities also participate in research at lower rates.

To combat structural racism and its effects, the authors call for the health data science community to educate itself and future data scientists.

WHY IT MATTERS 

Data and data-driven technologies have played an increasingly influential role in health care during the COVID-19 pandemic.  

The Lancet article says that structural racism means that data science might not equitably benefit people from backgrounds that are underrepresented in the workforce and in the datasets. When racial bias is fed into algorithms which determine who needs care, this can place black, Asian and minority ethnic (BAME) groups at a disadvantage.  

THE LARGER CONTEXT

In the digital health space, there have been concerns about the issue of “race blind” data. 

The NHS recently announced it will publish ethnicity data on who is receiving the COVID-19 vaccine, following accusations of potential bias. 

In June last year, it was highlighted that the data protection impact assessment (DPIA) run by Palantir on the NHS COVID-19 data store would not break data down by ethnicity, despite BAME people being disproportionately affected by the virus. 

A previous study published in The Lancet claimed that there is no such thing as race in healthcare algorithms, raising further questions about systematic discrimination in healthcare products and apps.

ON THE RECORD

Maxine Mackintosh, co-author of The Lancet article and CEO of One Health Tech, a grassroots organisation for equality and diversity, told Healthcare IT News: “This article highlights the macro, meso and micro ways that bias and discrimination, based on ethnicity, permeate the world of health data. It really highlights the multi-pronged approaches individuals, organisations and whole communities need to make to make digital health work for everyone. This applies to ethnicity, just as it applies to gender, sexuality or any other protected characteristic that is discriminated against, as they all intersect.” 
