UnitedHealth, the world’s largest managed health company, is under investigation by the New York Department of Financial Services and Department of Health for allegedly using a racially biased algorithm to assess patient risk.
The algorithm, embedded in Impact Pro, an analytics platform created by UnitedHealth subsidiary Optum, is one of the most widely used risk analysis programs in the health care industry. Although the current investigation remains limited to New York, the widespread use of Impact Pro has prompted investigation requests and legislative action at the federal level.
In April 2019, Sens. Cory Booker, D-N.J., and Ron Wyden, D-Ore., along with Rep. Yvette D. Clarke, D-N.Y., introduced the Algorithmic Accountability Act, which would require companies to study and correct algorithms that “result in inaccurate, unfair, biased or discriminatory” health care decisions primarily affecting African Americans.
On Dec. 3, Booker and Wyden co-authored a letter to UnitedHealth, Blue Cross Blue Shield, Cigna, Humana and Aetna demanding information on the potentially racially biased algorithms. Both senators also contacted the Centers for Medicare and Medicaid Services regarding the use of risk analysis algorithms and requested a formal investigation from the Federal Trade Commission.
The NYDFS launched its probe in October 2019 based on a study published in the journal Science, which found that the algorithm “falsely concludes that black patients are healthier than equally sick white patients.” The study contends that this bias “reduces the number of black patients identified for extra care by more than half,” ultimately resulting in unequal access to health care.
In a joint letter to UnitedHealth, the NYDFS and NYDOH claim the algorithm “significantly underestimates” African Americans’ health care needs and demand the company stop using the data analytics program unless it can prove the program is not racially biased.
Tyler Mason, UnitedHealth’s vice president of communications, offered this response to the Science study’s conclusions:
“A recent study mischaracterizes a cost prediction algorithm used in an Optum clinical analytics tool as racially biased. The algorithm is not racially biased. It is designed to predict future costs that individual patients may incur based on past health care experiences, and does not result in racial bias when used for that purpose, a fact with which the study authors agreed.
"The authors studied the experience of a single health system that incorrectly used the cost prediction algorithm as the sole factor for determining who should receive care management services. This is inconsistent with both the algorithm’s design and training we provide our customers, and we are not aware of any other health system using it in this incorrect way.
"When used as designed and intended, the tool enables effective and equitable care management programs, helping clinicians provide appropriate care to all patients. The cost prediction algorithm is one of more than 1,500 health care metrics available in the tool, including more than 600 quality-based ‘gap in care’ indicators, condition identifiers, medication adherence indicators, and patient safety indicators, that are specifically designed to ensure that underserved patients are identified and receiving the health services they need.”
At the time of this report, no state or federal representative has responded to inquiries from New York Business Daily.