Removing racial bias from algorithms embedded in medical systems proves difficult
In October 2019 it came to light that "racism" had occurred in a medical system that holds no data on race. Senators have taken up the problem and are calling for improvements, but solving it appears to be quite difficult.
Booker Wyden FTC Letter
https://www.scribd.com/document/437955271/Booker-Wyden-FTC-Letter
Racial bias in health care algorithms: there's no quick fix - The Verge
https://www.theverge.com/2019/12/4/20995178/racial-bias-health-care-algorithms-cory-booker-senator-wyden
The paper published in Science reported that medical algorithms used in the American healthcare system are biased toward underestimating the treatment needs of black patients compared with non-black patients. The data used to build the system contained no information on race, but because medical costs were used as data, a bias emerged against black patients, who have been less able to receive expensive treatment.
What is the reason for racial discrimination in a medical system for which no data on race exists? - GIGAZINE
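As a rough illustration of this mechanism, here is a minimal Python sketch. It is purely hypothetical: the distributions, the 0.6 access factor, and the 90th-percentile flagging rule are invented for illustration and are not taken from the actual system. Race is never an input, yet a flagging rule based on cost still under-selects the group that receives less care at the same level of illness.

```python
# Toy sketch of proxy-label bias (all numbers invented): race is never
# an input, but using medical cost as the target still disadvantages
# one group.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# True illness burden: drawn from the same distribution in both groups.
illness = rng.normal(5.0, 1.0, n)
group_b = rng.random(n) < 0.5          # membership in the disadvantaged group

# Observed cost: group B receives (and is billed for) less care at the
# same level of illness, so its cost labels are systematically lower.
access = np.where(group_b, 0.6, 1.0)
cost = illness * access * 1000 + rng.normal(0, 200, n)

# A program that flags the top 10% of patients by cost for extra care
# (standing in for the real risk model) under-selects group B.
flagged = cost >= np.quantile(cost, 0.9)
print("group B share among flagged patients:   ", group_b[flagged].mean())
print("group B share among the actually sickest:",
      group_b[illness >= np.quantile(illness, 0.9)].mean())
```

Running this prints roughly 0.5 for the sickest decile but far less for the flagged set, because the access gap is baked into the cost label itself.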
In response, Senator Cory Booker and Senator Ron Wyden published an open letter asking the Federal Trade Commission whether it is investigating the harm that biased algorithms cause to consumers and whether it is examining the algorithms themselves.
Brian Powers of Brigham and Women's Hospital, a co-author of the paper published in Science, commented that while tools exist to detect algorithmic bias, they cannot remove every potential source of bias, and that there is no "single solution that can be applied to any algorithm." Because the medical algorithm in question used medical costs as an index of illness, algorithms that rely on surrogate indices need to be considered more carefully, and the bias of each surrogate index has to be evaluated individually.
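One way to evaluate a surrogate index individually, continuing the hypothetical sketch above (it reuses the `illness`, `cost`, and `group_b` arrays), is to check whether patients who receive the same proxy score are equally sick across groups; a persistent gap means equal scores do not reflect equal need. This is an assumed audit procedure sketched for illustration, not a method quoted from the paper.

```python
# Audit sketch: within each quartile of the proxy score (cost), compare
# true illness across groups. A systematic gap shows the surrogate index
# is biased even though it never saw race.
import numpy as np

edges = np.quantile(cost, [0.25, 0.5, 0.75])
band = np.digitize(cost, edges)        # quartile index 0..3 per patient

for k in range(4):
    sel = band == k
    gap = illness[sel & group_b].mean() - illness[sel & ~group_b].mean()
    print(f"cost quartile {k}: group B is {gap:+.2f} illness units sicker "
          f"at the same proxy score")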
Nicol Turner-Lee of the Brookings Institution likewise said that such tools are not useful for identifying problems in how an algorithm was constructed, and that removing bias requires cooperation among the parties involved, such as the medical system, doctors, and the group developing the algorithm.
It has long been pointed out that AI learns prejudice and discrimination when trained on large volumes of human-written documents, and it seems that solving the problem will not be easy.
in Note, Posted by logc_nt