This week, WIRED debuted its joint investigation with Lighthouse Reports into the questions of bias and equity inherent in governments' use of algorithms to oversee financial assistance programs and identify alleged welfare fraud. The investigation included an unprecedented look inside the system used by the city of Rotterdam, in the Netherlands, and the data used to train its algorithm. We looked closely at how flaws in the algorithm's conclusions, and the wrongful accusations that followed, have impacted people's lives in Rotterdam. And we examined the global role of the private fraud-detection industry in these systems, as well as urgent concerns about the pervasive surveillance now inherent in Denmark's national welfare scheme.