It goes without saying that we need to avoid feeding it biased data, but even accurate data can lead to conclusions we don't want. For example, if there are few women or minority candidates in a certain tech field, the AI might conclude that belonging to one of those groups is an undesirable trait.
To counter that, we might actually have to add bias into the system: for example, by pre-adding weight for underrepresented groups in tech. It's a bit like how bookmakers set odds so that all the bets don't simply pile onto the favored team.
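One common way to "pre-add weight" like this is inverse-frequency reweighting: each training sample gets a weight inversely proportional to how common its group is, so underrepresented groups carry as much total influence as overrepresented ones. The sketch below is illustrative only (the `group_weights` function is a hypothetical helper, not from any particular library):

```python
from collections import Counter

def group_weights(groups):
    """Return per-sample weights inversely proportional to group frequency,
    so each group's samples sum to the same total weight."""
    counts = Counter(groups)
    n_groups = len(counts)
    total = len(groups)
    # Samples in rare groups get larger weights; each group's
    # weights sum to total / n_groups.
    return [total / (n_groups * counts[g]) for g in groups]

# Toy example: 8 candidates from group "A", 2 from group "B".
groups = ["A"] * 8 + ["B"] * 2
weights = group_weights(groups)
```

In this example, each "B" sample weighs four times as much as each "A" sample, so both groups contribute equally to whatever model is trained on the data.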
© 2022 Praveen Puri