AI is better than you at hiring diversely


Countless studies show that diversity — whether it’s based on race, age, gender, or socioeconomic status — is good for business. It adds more perspectives, opinions, knowledge, and skills to the table. But we know that companies, big and small, are still facing issues with hiring diverse workforces.

The reality is, we are inherently biased. We can’t stop ourselves from automatically liking people who resemble us. That’s why it might be time to admit that tech could do a better job than us at hiring.

In the past few years, we’ve seen the arrival of a number of startups aiming to fight unconscious bias in hiring — using software. Paris-based company Goshaba, for example, lets job candidates play cognitive games to make the recruiting process more efficient and inclusive.

The company was co-founded by Camille Morvan, who taught cognitive science and organizational psychology at Harvard. In 2014, she switched to entrepreneurship following a simple observation: Recruiters tend to be solely fixated on CVs and cover letters while ignoring the candidate’s soft skills. But things are changing, says Morvan:

We’re seeing a real shift with large corporates becoming increasingly convinced of the benefits of objective, data-based, fair recruiting. In particular, they have observed the danger of unconscious biases in recruiting. Diversity is not only a key ethical question for companies but it helps them attract and retain the best talents.

For example, we will shortly be working with EDF Energy to target candidates just starting their career from school or university. Diversity and inclusion is a huge strategic programme for the energy firm, in a sector that has been quite white and male-dominated. They want to change that.

Headstart, which is based in London, has a similar mission but uses machine learning to determine which candidates are the best technical and cultural fits.

Headstart’s recruitment process is designed to help companies move away from purely qualification-based hiring and take things like personality, interests, and motivations into account as well. By letting algorithms match applicants with the roles that fit them best, unconscious bias can be reduced significantly.
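To make the idea of algorithmic "fit"-matching concrete, here is a minimal, hypothetical sketch: represent a candidate and each open role as vectors of trait scores and rank roles by cosine similarity. The trait names, scores, and the similarity metric are all assumptions for illustration; Headstart's actual model is not public.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two trait vectors, in [0, 1] for non-negative scores."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical trait scores (e.g. teamwork, risk tolerance, attention to detail)
candidate = [0.8, 0.3, 0.6]
roles = {
    "data_analyst": [0.7, 0.2, 0.9],
    "sales_rep":    [0.4, 0.9, 0.3],
}

# Match the candidate to the role whose trait profile is closest
best_role = max(roles, key=lambda r: cosine_similarity(candidate, roles[r]))
print(best_role)  # → data_analyst
```

Note that nothing in this matching step looks at name, gender, or background: the only inputs are the trait vectors, which is where the bias reduction is supposed to come from.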

Designing around bias

Siri Uotila is a research fellow at the Harvard Kennedy School’s Women and Public Policy Program. She has done extensive research on how bias affects our decision making and how we can design environments that leave less room for biased decisions.

According to Uotila, there are many ways to improve decision making in hiring — and to make the hiring process more effective in general from an HR perspective.

What we always recommend is to never do unstructured interviews. If you want to do interviews as the first step of hiring, make sure they are structured and equal for all applicants. What we would encourage even more than interviewing is blind recruiting and requiring a work-sample test.

Blind recruiting is the practice of removing personal information from an application — your name and anything else that signals your demographics. This allows the essential parts of your application, your skills and qualifications, to stand out.
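The redaction step itself is simple to sketch in code: strip demographic fields from an application record before it reaches a reviewer. The field names below are illustrative assumptions, not any particular vendor's schema.

```python
# Fields that signal demographics and should not reach reviewers (illustrative)
DEMOGRAPHIC_FIELDS = {"name", "email", "photo_url", "date_of_birth",
                      "gender", "nationality", "address"}

def redact_application(application: dict) -> dict:
    """Return a copy of the application with demographic fields removed."""
    return {field: value for field, value in application.items()
            if field not in DEMOGRAPHIC_FIELDS}

application = {
    "name": "Jane Doe",
    "gender": "female",
    "skills": ["Python", "SQL"],
    "years_experience": 5,
}
print(redact_application(application))
# → {'skills': ['Python', 'SQL'], 'years_experience': 5}
```

In practice the hard part is not dropping explicit fields but scrubbing proxies — hobbies, addresses, or club memberships buried in free text can still leak demographic information.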

This is a tried and tested method. One of the most notable examples comes from classical orchestras. For decades they were heavily dominated by male players, and the share of female players was sometimes as low as five percent. The issue drew attention in the 70s and 80s, and musicians were then required to audition behind a curtain. This increased the number of female musicians significantly.

The other way to make hiring processes more effective and fair is to require work samples. Instead of letting potential employees explain what they are good at, put them to a test: let them demonstrate their skills rather than describe them.

Let AI do the hiring

This is where Priyanka Jain, head of growth at the hiring platform company Pymetrics, comes in. Pymetrics offers large companies a hiring platform that excludes personal information about the applicant, making the process as unbiased as possible.

The goal for Pymetrics is to give all applicants an equal chance at being considered for a job, no matter their gender, ethnicity, or socioeconomic background.

Pymetrics has you play neuroscience games that present different tasks — for example, measuring how prone you are to taking risks, or how impulsive you are. These tasks assess your cognitive and emotional traits, and the data they gather, Jain explains, is much denser than a resume could ever be.

So how does it work? You log in to a platform. You don’t send in your resume. You solve these tasks, and depending on how you perform on them, you either will or won’t be invited for a second-round interview.
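The screening step described above can be sketched as a simple weighted-score cutoff. The task names, weights, and threshold here are invented for illustration — Pymetrics' real scoring model is proprietary and certainly more sophisticated than a linear sum.

```python
# Hypothetical per-task weights and invite cutoff (assumptions, not real values)
TASK_WEIGHTS = {"risk_game": 0.4, "impulsivity_game": 0.3, "memory_game": 0.3}
INVITE_THRESHOLD = 0.65

def should_invite(task_scores: dict) -> bool:
    """Invite to a second-round interview if the weighted score clears the cutoff."""
    overall = sum(TASK_WEIGHTS[task] * score
                  for task, score in task_scores.items())
    return overall >= INVITE_THRESHOLD

strong = {"risk_game": 0.9, "impulsivity_game": 0.5, "memory_game": 0.7}
weak   = {"risk_game": 0.3, "impulsivity_game": 0.4, "memory_game": 0.5}
print(should_invite(strong))  # → True  (0.72 >= 0.65)
print(should_invite(weak))    # → False (0.39 <  0.65)
```

The key property of this flow is that the decision function only ever sees task scores — no name, photo, or school appears anywhere in its inputs.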