One of HECAT’s core guiding principles is our commitment to developing an algorithm that is ethical and comes with no strings attached. We intend use of our algorithm to be entirely voluntary, and there are many cautionary tales in the world of data science that tell us why this matters.
One of the starkest is MiDAS, Michigan’s response to the global financial crisis of 2007–2008. With tax revenue falling due to unemployment, and government spending rising as more people claimed unemployment benefits, Michigan, like many parts of the world, was in a tough spot.
In response, the state paid a private company to develop the Michigan Integrated Data Automated System (MiDAS), a new algorithmic framework built with state-of-the-art data science to identify unemployment fraud more accurately. The issue? It never worked.
Between 2013 and 2015, more than 40,000 people were wrongly accused of unemployment fraud. While the picture is not completely clear due to ongoing lawsuits, it appears that MiDAS was programmed to automatically flag applications as fraudulent unless it could find evidence to the contrary. This assumption-of-guilt approach meant that even marginal errors in processing or analysis could mark an innocent person as guilty.
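To make that design flaw concrete, here is a minimal, purely hypothetical sketch in Python. It is not MiDAS’s actual logic, which has never been published; the function names and the evidence flags are illustrative assumptions. The only point it makes is how the choice of default behaves when data is missing.

```python
from typing import Optional

# Hypothetical illustration only: the real MiDAS rules are not public.

def default_guilty(evidence_of_legitimacy: Optional[bool]) -> bool:
    """Assumption-of-guilt design: flag a claim as fraud unless the system
    finds positive evidence that it is legitimate. Missing data (None)
    therefore counts against the claimant."""
    return evidence_of_legitimacy is not True

def default_innocent(evidence_of_fraud: Optional[bool]) -> bool:
    """Presumption-of-innocence design: flag a claim only when there is
    affirmative evidence of fraud; missing data leads to review, not a flag."""
    return evidence_of_fraud is True

# A common data gap: an employer never answers a verification letter.
missing_record = None
print(default_guilty(missing_record))    # True  -> claimant automatically flagged
print(default_innocent(missing_record))  # False -> no automatic accusation
```

Under the first default, every gap in the records becomes an accusation; under the second, it becomes a question for a human reviewer.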
This might seem like a fairly small problem, with people able to appeal their wrongful categorization and receive restitution. Unfortunately, alongside MiDAS, Michigan’s Unemployment Insurance Agency (UIA) imposed a series of brutal penalties on those who were accused. Additionally, the appeals system repeatedly denied claimants who professed their innocence.
In at least one case, a family had to declare bankruptcy as a result of their wrongful accusation: they were fined $10,000 for supposedly committing welfare fraud and had their tax refunds seized three years in a row. Michigan created MiDAS during a decades-long struggle to balance the state’s budget, but it now seems likely that $100m will be paid in damages to the wronged parties. By comparison, MiDAS generated roughly $60m in revenue, through benefits withheld from the accused and fines collected from them.
From a technical and scientific standpoint, we at HECAT do not doubt that the MiDAS system was competently built. From an ethical standpoint, however, its flaws were numerous, and many of them could have been avoided. The most significant, of course, was the human decision to reject applications automatically and to treat every rejected application as a case of fraud. No matter how solid the data science, at some point a person decided to make the system work this way, and up to 40,000 people have had their credit ruined, with some in their 40s or 50s needing a co-signer just to lease a car.
This is one of many cautionary tales reminding us that algorithms have real impacts on the lives of the people they act upon, and that new perspectives will be necessary as we learn to live with big data. I will finish with HECAT’s core message: that we should work with, not on, the unemployed.