A recent NPR episode suggests that they can.

Read Full Transcript

Hi, I'm Jeff Altman, The Big Game Hunter. I coach people to play their professional and personal games big.

I was listening to an episode of "All Things Considered" on NPR. The episode is called "Can Computers Be Racist?" It talks about the human-like biases of different systems. As a matter of fact, "The Human-Like Biases of Algorithms" is the subtitle for the show.

It raises the question by presenting different search results and showing how different algorithms reinforce certain interpretations. So, for example, there was a racist bias in Google Images that described a person of color as an animal based upon some interpretation of facial appearance. It also looked at certain behaviors. It doesn't specifically say Google; it says search engines, so it could be any of them. They interpreted certain behaviors as being more indicative that a person of color should find out more about bail than a white person might.

So, it begs the follow-up question: in the way that your applicant tracking system is being programmed, are there inherent biases in it that get reinforced time and time again? The probability is "yes." For example, when you look at education, Silicon Valley loves people from certain universities but, sometimes, individuals can't afford those schools. Yes, you can say they could get a scholarship, but they might not be able to get one. So, it reinforces a certain stereotype of people of particular races and particular social classes. Thus, you're missing the opportunity to attract people with diverse backgrounds in your interviewing and selection process.

If someone could not afford to take an internship because, frankly, they needed a job that would pay for college, your system probably looks down on that behavior and, thus, it becomes an issue of class.

Ultimately, look at your system's programming and how it interprets things. Is it purely based upon terms, or are there behavioral analyses it's doing along the way that cause it to interpret and screen out people who don't fit a particular profile?
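To make that concrete, here is a minimal sketch of how a screen that looks neutral can quietly encode class bias. The school list, the internship rule, and the field names are hypothetical illustrations, not any real vendor's logic.

```python
# A minimal, hypothetical sketch of an ATS screening rule. The school
# list, the internship requirement, and the field names are invented
# for illustration; they are not any real vendor's logic.

TARGET_SCHOOLS = {"Stanford", "MIT", "Carnegie Mellon"}

def passes_screen(candidate: dict) -> bool:
    """Return True if the candidate survives the automated screen."""
    # Pedigree filter: attending these schools correlates with family
    # income, so this quietly screens on class, not just on skill.
    if candidate.get("school") not in TARGET_SCHOOLS:
        return False
    # "Behavioral" filter: penalizes anyone who worked a paying job
    # instead of taking an unpaid internship.
    if not candidate.get("had_internship", False):
        return False
    return True

applicants = [
    {"name": "A", "school": "MIT", "had_internship": True},
    {"name": "B", "school": "State College", "had_internship": False},
]
print([a["name"] for a in applicants if passes_screen(a)])
# Prints ['A'] -- B is rejected on proxies for class, not on skill.
```

Notice that neither rule mentions race or class, yet both act as proxies for them. That is the pattern to look for in your own system's configuration.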

As the NPR story says, ultimately, algorithms are created by humans. So, invariably, garbage in, garbage out. You know the expression. The system reinforces the biases that humans have, wrapped up in a nice analytical model so firms can go, "Gee, it's the program. It's the algorithm!" But who programmed the algorithm? As one of the guests on the show said, the biases that caused one search to interpret a person of color as a type of animal were programmed by humans who made those mistakes. Your system can experience much the same sort of thing.

So, start going through the ATS and see how it evaluates. Do it with a discerning eye toward bias around class and bias around certain types of education. I've seen systems that have biases about people from the South, of all ridiculous things. Because I now live in the South, I'll simply say I'm an ex-New Yorker who moved here and, you know, there are issues of class where people who live in urban environments in the North look down on people from rural environments in the South. That behavior gets acted out. PLUS, you have the international scope of many of your departments, where the bias gets reinforced that people of particular nationalities are superior to individuals of different nationalities. Thus, those people get screened out because people of that nationality have been screened out previously.
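That last point, screening people out because people like them were screened out before, is a feedback loop. Here is a minimal, hypothetical sketch of how it plays out; all of the data and field names are invented for illustration.

```python
# A minimal, hypothetical sketch of a screening feedback loop: a rule
# "learned" from past decisions simply repeats them. All data and
# field names are invented for illustration.

history = [
    {"group": "X", "advanced": False},  # screened out by a biased decision
    {"group": "X", "advanced": False},
    {"group": "Y", "advanced": True},
    {"group": "Y", "advanced": True},
]

def advance_rate(group: str) -> float:
    """Fraction of past candidates from this group who were advanced."""
    rows = [h for h in history if h["group"] == group]
    return sum(h["advanced"] for h in rows) / len(rows)

def screen(candidate: dict) -> bool:
    # Naive learned rule: advance a group only if it was usually
    # advanced before. Yesterday's bias becomes today's rule.
    return advance_rate(candidate["group"]) > 0.5

print(screen({"group": "X"}))  # False -- rejected because of past rejections
print(screen({"group": "Y"}))  # True
```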

It's insane. But do you really want to have your firm's behavior picked apart by someone who sues?

ABOUT JEFF ALTMAN, THE BIG GAME HUNTER


Jeff Altman, The Big Game Hunter is a career and leadership coach who worked as a recruiter for more than 40 years. He is the host of "No BS Job Search Advice Radio," the #1 podcast in iTunes for job search with more than 1500 episodes, and his newest show, "No BS Coaching Advice." He is a member of The Forbes Coaches Council. "No BS Job Search Advice Radio" was recently named a Top 10 podcast for job search. JobSearchTV.com was also recently named a Top 10 YouTube channel for job search.

Are you interested in 1:1 coaching, interview coaching, advice about networking more effectively, how to negotiate your offer or leadership coaching? Schedule a free Discovery call.

If you have questions for me, call me through the Magnifi app for iOS (video) or PrestoExperts.com (phone).

Jeff’s Kindle book, “You Can Fix Stupid: No BS Hiring Advice,” is available on Amazon.

JobSearchCoachingHQ.com offers great advice for job hunters: videos, my books and guides to job hunting, podcasts, and articles, PLUS a community where you can ask questions, PLUS the ability to ask me questions directly, where I function as your ally with no conflict of interest in answering them.

Connect with me on LinkedIn. Like me on Facebook.

Join and attend my classes on Skillshare. Become a premium member and get 2 months free.

Watch my videos on YouTube at JobSearchTV.com or BingeNetworks.tv for FireTV, AppleTV, Roku and 90 other devices.

Join Career Angles on Facebook and receive support, ideas and advice in your current career and job.

 

 
