As 2025 approaches, Californians can look forward to strengthened patient protections under the new Physicians Make Decisions Act (SB 1120), authored by Senator Josh Becker (D-Menlo Park). This groundbreaking law, passed Dec. 8, ensures that decisions about medical treatments are made by licensed health care providers and not determined solely by artificial intelligence (AI) algorithms used by health insurers.
Speaking at the Ethnic Media Services briefing, the Senator emphasized the need for health care provider oversight when insurers use AI algorithms to decide on providers' requests to deliver medical services.
Health insurers now often rely on AI-driven algorithms to process claims and prior authorization requests, with the goal of improving efficiency and cutting costs. This approach carries significant risks, said the Senator.
In November 2023, the nation's largest insurance company was hit with a class-action lawsuit accusing it and its subsidiary, NaviHealth, of relying on a computer algorithm to "systematically deny claims" of Medicare beneficiaries in nursing homes who were struggling to recover from debilitating illnesses.
Another case against CIGNA revealed that its doctors spent an average of 1.2 seconds on each claim; one doctor denied approximately 60,000 claims in a single month. According to corporate documents and interviews with former CIGNA officials, the company was accused of building a system that allowed doctors to instantly reject claims on medical grounds without opening the patient file, leaving people with unexpected bills.
Most people don’t appeal a denial, losing the chance to get the medical treatments they need.
Overall, 36 percent of the people surveyed by Dr. Miranda Yaver, Assistant Professor of Health Policy and Management at the University of Pittsburgh, had experienced at least one coverage denial in their lifetime, and about 60 percent of those had experienced multiple denials. Yaver conducted the survey while researching her new book, “Coverage Denied: How Health Insurers Drive Inequality in the United States”.
“One thing that I find is people who are being denied are not disproportionately people from marginalized groups. No matter who you are, you are vulnerable to this happening,” she said at the Ethnic Media Services briefing. However, the effects can still produce significant inequities.
The average American adult reads at about an eighth-grade level, whereas healthcare materials are typically written at an 11th- to 12th-grade reading level. “Many when they get a health insurance document in the mail are overwhelmed in trying to figure out what to do next,” she said.
The insurance system is complicated and fragmented, so claims end up being submitted to different places, and these departments, for instance behavioral healthcare and non-behavioral healthcare, don’t always speak to each other. The administrative burden, the learning cost of figuring out whether you are eligible to appeal a denial and the compliance cost of figuring out how an appeal works and how to do it successfully, falls on patients and physicians at a time when both are stressed and stretched for time. “The psychological cost and the emotional burden get amplified for people who are scoring lower on the health literacy scale, which is a lot of people,” she said. More affluent and healthier patients are better equipped to navigate these administrative burdens.
“Less affluent patients are a lot less likely to appeal denied claims and more likely to postpone care and other purchases, for instance home repair, because they have an unexpected medical expense,” she said. The already profound inequity in the American healthcare system is deepening between those with meaningful access to health coverage and those for whom benefits are kept out of reach.
Medical literacy and interaction with the system require a certain level of sophistication and time that most people do not have, said the Senator.
The Physicians Make Decisions Act requires a licensed physician to oversee decisions made by AI algorithms in healthcare processes such as claims processing and prior authorization requests.
“We hope that this simple and powerful safeguard addresses a critical gap in the current system by ensuring that human expertise is integrated into decisions about patient care. We are hoping that this is another example where people will follow California’s lead,” the Senator said.