Assigning Jobs Using An Algorithm

Technology is being leveraged to bolster diversity.

On September 30, California Gov. Gavin Newsom signed legislation requiring publicly traded companies based in the state to have at least one racially, ethnically or otherwise demographically nontraditional director on their board by the end of 2021. While 11 states, as well as some countries, have enacted or are considering board diversity legislation, those laws focus on disclosure rather than instituting quotas. Even so, “the expectation is by taking the lead, California-based companies will set the standard that becomes expected by a lot of other publicly traded companies,” says Cecyl Hobbs, consultant at Russell Reynolds Associates.

Technology can be leveraged to accelerate diversity. Yet how companies hire can work against these goals, especially as technology platforms send companies ever-larger numbers of candidates. For some companies, the volume is so great that sorting through applicants practically requires technology in itself.

“At the end of the day, it’s escalating warfare—you have all these people applying, and recruiters are inundated,” says Frida Polli, founder and CEO of Pymetrics, a game-based recruiting platform. “It’s not that the technology is all of a sudden coming to bear, but people have come to realize how biased technology can be. AI is holding a mirror to society, and we don’t like what we see.”

Being thoughtful and creative with technology can eliminate bias, proponents claim. Algorithms can ensure job descriptions encourage a broader swath of applicants. They can optimize for multiple criteria that predict performance without showing gender or ethnic differences. And training algorithms on data that is more evenly distributed across groups, such as math scores rather than engineering degrees, and evaluating soft skills like cognitive, social and emotional aptitudes rather than proxy variables for gender or race, produces a more diverse candidate pool.
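One way practitioners test whether a screening criterion "shows gender or ethnic differences," as described above, is to compare selection rates across groups. A minimal sketch, using the four-fifths rule common in U.S. hiring audits (a convention not named in this article; all group names and numbers below are illustrative):

```python
# Hypothetical sketch: flag a screening criterion whose selection rate
# for any group falls below 80% of the highest group's rate.

def selection_rates(outcomes):
    """outcomes maps group name -> list of 0/1 pass decisions."""
    return {g: sum(v) / len(v) for g, v in outcomes.items()}

def passes_four_fifths(outcomes, threshold=0.8):
    """Return (passes, rates): passes is False if any group's rate
    is below `threshold` times the highest group's rate."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return all(r >= threshold * top for r in rates.values()), rates

# Illustrative data: pass/fail results of a hypothetical screening test.
outcomes = {
    "group_a": [1, 1, 0, 1, 1, 0, 1, 1],  # 6/8 = 0.75
    "group_b": [1, 0, 0, 1, 0, 1, 0, 0],  # 3/8 = 0.375
}
ok, rates = passes_four_fifths(outcomes)
# 0.375 / 0.75 = 0.5, below the 0.8 threshold, so this criterion fails.
```

A criterion that fails a check like this would be a candidate for replacement with a more evenly distributed signal, in the spirit of the math-scores-versus-engineering-degrees example.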

“AI is only as good as the data that gets fed into it—if the underlying data only feeds in experience-based criteria, you may miss candidates who have unconventional experience sets,” says Hobbs.

There’s no question that when humans interview candidates, bias is an issue. There are ways to overcome this, such as anonymizing resumes or evaluating candidates with a rubric focused on skills rather than education and experience. Comprehensive referencing helps fill in the gaps for candidates with unfamiliar backgrounds.
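The anonymize-and-score approach above can be sketched in a few lines. This is an illustrative toy, not any vendor's method; the field names, rubric weights and sample candidate are all assumptions:

```python
# Hypothetical sketch: strip fields that can proxy for demographics,
# then score the candidate against a skills-based rubric.

IDENTIFYING_FIELDS = {"name", "email", "photo_url", "school", "grad_year"}

def anonymize(resume: dict) -> dict:
    """Return a copy of the resume with identifying fields removed."""
    return {k: v for k, v in resume.items() if k not in IDENTIFYING_FIELDS}

def rubric_score(resume: dict, rubric: dict) -> int:
    """Score a candidate only on listed skills, weighted by the rubric."""
    skills = set(resume.get("skills", []))
    return sum(weight for skill, weight in rubric.items() if skill in skills)

# Illustrative candidate and rubric.
resume = {"name": "A. Candidate", "school": "State U",
          "skills": ["python", "sql", "communication"]}
rubric = {"python": 3, "sql": 2, "statistics": 2}

blind = anonymize(resume)          # name and school are gone
score = rubric_score(blind, rubric)  # 3 (python) + 2 (sql) = 5
```

The point of the design is that the reviewer, or the algorithm, never sees the fields most likely to trigger bias, only the rubric-relevant skills.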

“There’s hard work to be done around diversity, but there are incredibly rich rewards in terms of the performance of the companies willing to embrace this,” Hobbs believes.