There's no question that algorithms can be biased, producing results that reflect their creators' preconceptions. But how do you reliably detect that bias? Carnegie Mellon researchers can help: they've developed a system that tests an algorithm to measure how much influence a given input variable has over the outcome, giving you a sense of where bias creeps in. It could reveal when a credit-scoring system is giving weight to race, or catch simple mistakes that put too much emphasis on a particular factor.
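To make the idea concrete, here's a minimal sketch of measuring one input's influence by intervention: hold everything else fixed, swap that one input for a value drawn at random from the population, and count how often the decision flips. This is an illustrative toy, not the CMU system; the scoring model, the `zip_code` proxy variable, and the acceptance threshold are all hypothetical.

```python
import random

def score(applicant):
    # Hypothetical toy credit-scoring model: income and debt are
    # legitimate inputs, while zip_code stands in for a proxy
    # variable that could smuggle in bias.
    return (0.5 * applicant["income"]
            - 0.3 * applicant["debt"]
            + 0.2 * applicant["zip_code"])

def influence(model, population, feature, threshold=50, trials=1000, seed=0):
    """Estimate a feature's influence as the fraction of trials in
    which replacing that one feature with a randomly drawn value
    flips the accept/reject decision."""
    rng = random.Random(seed)
    changed = 0
    for _ in range(trials):
        person = rng.choice(population)
        baseline = model(person) >= threshold
        intervened = dict(person)  # copy, then overwrite one feature
        intervened[feature] = rng.choice(population)[feature]
        if (model(intervened) >= threshold) != baseline:
            changed += 1
    return changed / trials

# Hypothetical applicant pool; zip_code correlates with nothing
# legitimate, yet still moves the score.
population = [
    {"income": 40 + 10 * i, "debt": 5 * i, "zip_code": 100 * (i % 2)}
    for i in range(10)
]

for feature in ("income", "debt", "zip_code"):
    print(feature, influence(score, population, feature))
```

A nonzero value for `zip_code` is the kind of red flag such a test surfaces: a variable that shouldn't matter is swinging real decisions.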