Bias in the Machine: Hilke Schellmann on AI’s Impact on Hiring

Episode 37

Guest

Hilke Schellmann

Author | The Algorithm

Assistant Professor of Journalism & AI Expert | New York University

In this compelling episode, host Denise Chaffin sits down with investigative journalist Hilke Schellmann, author of The Algorithm, to explore the complex relationship between artificial intelligence (AI) and the hiring process. Hilke provides a behind-the-scenes look at how AI is reshaping recruitment, often with hidden biases that can shape candidates’ careers and life paths. She shares stories from her investigations, including cases where AI systems unexpectedly favored certain candidates based on irrelevant factors such as hobbies and names. The conversation delves into the ethical concerns of AI in hiring, the potential benefits and pitfalls of people analytics, and the importance of transparency in AI-driven decision-making. This episode is essential listening for anyone interested in the future of work, technology, and fairness in the hiring process.

Key Episode Segments:

  • AI Bias in Hiring: Hilke Schellmann shares surprising insights into how AI in recruitment can introduce unintentional biases, sometimes favoring applicants based on irrelevant details like first names or hobbies.
  • Hidden Workforce: Hilke discusses the concept of “hidden workers”—qualified individuals who are overlooked by AI screening tools due to biased algorithms.
  • People Analytics: The conversation includes a critical look at people analytics, a growing field where AI is used to predict employee potential and retention, often with unpredictable and biased results.
  • Ethical Concerns and Transparency: Hilke emphasizes the importance of transparency in AI systems, advocating for companies to open the “black box” and ensure fair treatment in hiring.
  • Humanizing AI Decisions: Hilke highlights how human biases in the data used to train AI can lead to discriminatory outcomes, urging a balanced approach to integrating AI in hiring.

Thank You for Listening