Bias in Hiring
Natalie Huante, Hayden Fargo, Chris Isidro, Emma Harper, Diego Lopez Ramos
Algorithms intended to function without bias commonly inherit the biases of their programmers. This results from several issues, such as the inclusion of sensitive traits like race, gender, and nationality. In our project, we look specifically at how the use of these algorithms affects the hiring process. In an attempt to reduce bias, we created