Health Insurance Business Intelligence “Scoring” Algorithms Interfering with Human Morals
Posted Oct 24 2009 10:02pm
We keep hearing these outrageous stories in the news all the time. The first is a woman who was told to get sterilized in order to get insurance coverage. Her husband and children were accepted for coverage, but she was denied because she had had a C-section. Are C-sections now flagged as a reason to deny coverage? If she had not had a C-section, would she have been covered? Many women over the years have had C-sections, so why is this now treated as a “scoring” factor when determining whether someone can be covered? In her case it was United Healthcare giving her the news. If you think back to Sicko, we saw many examples of this; it has not stopped, and as the algorithms get more refined and complicated, this will only continue to grow.
United Healthcare is the king of the algorithms and makes more money from this end of its business today than it does from insuring people. It lives by those algorithms: also recently in the news was the story of a 2-year-old denied coverage because the algorithms showed she was too skinny, and a kidney donor who was likewise “scored” as too great a risk for health insurance.
Scoring algorithms on claims are powerful too. Several dermatology offices were cut off from payment by ALL their insurance carriers within 5 days, and they traced it back to the Ingenix scoring procedures used for detecting “potential” fraud. This destroyed patient/doctor relationships, and a couple of offices had to close. When you stop and think about not just one carrier but all of them stopping payment within 5 days with no notice, it is scary. Court and legal cases against Ingenix are in process, since the other carriers subscribe to the business intelligence algorithms it uses.
It’s all in the risk management HRA algorithms that determine this, written by a programmer under someone’s direction to do so. Just this week I posted about the GINA law and the health risk assessments in question, as we wait for the ultimate decision on whether or not family history can be used. The algorithms run to assess whether someone can be covered are getting so complicated that you have to wonder what the next criterion used to make these determinations will be. As she states in the video, she’s not alone.
In a somewhat related story, a couple had to get divorced in order to provide healthcare for their daughter. They legally divorced so their daughter could get the assistance she needed, by showing that her mother did not make enough money on her own to qualify. How much is all of this interfering with our lives and with “doing the right thing”? The couple plans to re-marry when their daughter turns 18. They obviously can’t afford the huge bills, and perhaps they were denied coverage too.
Our current healthcare system and the algorithmic formulas that are supposed to improve healthcare appear to have profits at the top of the priority list; if some folks happen to get healthier along the way while others suffer, well, it’s all in those algorithms they tout at Congressional meetings on improvement. And if it is you or someone in your family having problems, that means nothing when you need care and are denied it. These are some prime examples of why our current system does not work. It is in fact working to destroy the “good” human instincts to help others that we are all born with by nature. Our leaders keep telling us to reach out and help others, but having this algorithmic penalty hanging over all our heads is forcing people to make decisions and take on attitudes we would not normally see.
When we start seeing stories of 70-year-old seniors robbing banks, you have to ask what is wrong. This particular story happened in San Diego, and they are still looking for the man, who by the way was also carrying an oxygen tank. BD