A Pew Research Center survey found that many Americans see racial bias in hiring. The April 2023 poll found that 64% of Black adults said unfair treatment based on a job applicant’s race or ethnicity was a significant problem; only 30% of White adults in the poll felt that way.
The survey also revealed that many Americans who see racial and ethnic bias as a problem in hiring believe that artificial intelligence (AI) could produce more equitable practices: 53% of that group think hiring will improve if employers use more AI in the process.
The survey notes that Black Americans stand out as the most skeptical: 20% of Black adults who see racial bias and unfair treatment in hiring as a problem say AI would make things worse, compared with about one in ten Hispanic, Asian and White adults.
Jennifer Opal is a software engineer at the Opal Company and a staunch neurodiversity and inclusion advocate. A neurodivergent woman, she is autistic and has ADHD, dyslexia and dyspraxia. Opal believes AI has the potential to improve diversity, but that the ethics behind how the technology is used need significant improvement.
“There are many amazing ways that AI is being used that have been incredibly useful for so many, such as accommodation tools for those with dyslexia, like me, or AI pair-programming tools for engineers to remove some of the repetitiveness of their jobs to spend more time implementing other solutions,” said Opal. “For myself, ChatGPT has been a surprisingly useful accommodation tool for my dyslexia and ADHD.”
“However, I worry about the ethics behind AI setting us back in our striving for diversity and inclusion in tech teams,” said Opal. “There are companies that use AI-hiring tools to find the right candidate, but it’s become common knowledge that some of these AI systems select an ideal candidate who tends to be white, male, heterosexual, able-bodied, and without hidden disabilities.”
“As a neurodivergent woman who is also a neurodiversity advocate, I know many neurodivergent people do not disclose they have hidden disabilities until they receive a job offer, their first day or, for some, they don’t disclose it at all for fear of biases, judgment and discrimination,” said Opal in an email interview.
“As we are having conversations about how to bring in neurodivergent talent, create neuro-inclusive environments, and encourage neurodivergent people to be transparent about their hidden disabilities, I’m concerned that biases in AI systems will exclude them,” added Opal.
Can AI help?
Opal says that while AI is an exciting technology with the potential to change diversity within tech teams in positive ways, doing so would require a lot of work from the humans collecting the data for the AI system designs.
Hackajob, a full-stack technology hiring platform in the UK, has launched three features they believe will increase diversity, equity and inclusivity in technology teams.
The platform is AI-driven, and the company says it helps remove bias from job descriptions during recruitment. One feature listed in a press release prioritizes candidates based on diversity and inclusion objectives while protecting candidates’ confidentiality.
The company said it reduces bias in AI through a combination of data collection, model design, regular evaluation, and the incorporation of ethical guidelines.
Mark Chaffey, co-founder and CEO of hackajob, says that the first feature uses generative AI (GAI) to remove bias from job descriptions.
UK-based hackajob has raised $25 million in a Series B funding round led by Boston-based Volition Capital, with participation from existing investors AXA Venture Partners and Foresight.
“Employers can upload their job descriptions to the hackajob platform, where AI technology scans and assesses the content, creating suggestions for more inclusive language,” said Chaffey. “Employers can effortlessly implement suggested changes, ensuring that their recruitment materials are free from bias and appeal to a broad spectrum of potential candidates.”
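Hackajob has not published the internals of this feature, but the general shape of an inclusive-language scanner is easy to illustrate. The sketch below uses a simple rule-based word list rather than a generative model, and the flagged phrases and suggested alternatives are hypothetical examples, not hackajob’s actual rules:

```python
# Illustrative only: a rule-based sketch of inclusive-language scanning.
# hackajob's actual feature uses generative AI; the phrases and
# suggestions below are hypothetical examples, not its real rules.
import re

# Hypothetical mapping of exclusionary or coded phrases to alternatives.
SUGGESTIONS = {
    "rockstar": "skilled engineer",
    "ninja": "expert",
    "manpower": "staffing",
    "young and energetic": "motivated",
}

def scan_job_description(text: str) -> list[dict]:
    """Return flagged phrases with their positions and suggested replacements."""
    findings = []
    for phrase, suggestion in SUGGESTIONS.items():
        for match in re.finditer(re.escape(phrase), text, re.IGNORECASE):
            findings.append({
                "phrase": match.group(0),
                "position": match.start(),
                "suggestion": suggestion,
            })
    return findings

if __name__ == "__main__":
    ad = "We need a rockstar developer who is young and energetic."
    for f in scan_job_description(ad):
        print(f"'{f['phrase']}' at index {f['position']}: consider '{f['suggestion']}'")
```

A production scanner would need a much richer model of context, which is presumably where the generative AI comes in; a fixed word list only catches the most obvious cases.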
But the elephant in the room remains – bias in AI datasets is a problem in the hiring process for minorities.
A 2019 study by the National Institute of Standards and Technology showed that facial recognition AI often misidentifies people of color. That was the case with Robert Williams, a Black man who was arrested in 2020 after facial recognition software falsely matched him to a suspect. Williams is believed to be the first person wrongfully arrested because of a facial recognition error. And a 2021 study on mortgage loans showed that predictive AI models used to accept or reject applications did not provide accurate recommendations for loans to minorities.
In 2021, the US Equal Employment Opportunity Commission (EEOC) launched its Artificial Intelligence and Algorithmic Fairness Initiative to ensure that the use of software, including AI, machine learning, and other emerging technologies, in hiring and other employment decisions complies with the federal civil rights laws the EEOC enforces.
Chaffey adds another viewpoint to the debate over using AI in the hiring process to reduce bias, saying it depends on what you use AI for.
“Amazon, for example, was using AI to make hiring decisions, and the AI model was trained on inherently biased data sets, which in turn reinforced the bias,” said Chaffey in an email interview.
Chaffey says the way hackajob is using AI to improve diversity is not for making decisions but to make content more inclusive.
All of the data and insights hackajob gathers are opt-in data – ethnicity, gender, neurodiversity, etc. – from candidates. Chaffey says they use GAI to ensure all job ads, company career pages, etc., include inclusive language and recommend ways to change that language to reduce bias against age, gender, ethnicity, etc. They enable companies to source underrepresented groups proactively.
“We don’t automate decision-making, and we don’t filter out non-diverse candidates either,” said Chaffey.
“The platform uses generative AI to understand if any language is not inclusive to certain characteristics or people,” said Chaffey. “It’s important to remember that hackajob uses a reverse model where employers are reaching out and applying to candidates, not the typical mode where candidates apply to employers.”
Chaffey adds that this is where the data comes into play: “Because they have the data, companies using hackajob’s generative AI feature can see the rate of change for how many female (or other group) candidates accepted the outreach from the employer before the new recommended language and after.”
“Preprocessing techniques can be applied to minimize the introduction of direct bias such as filtering, resampling, weighting, or other methods to balance the dataset, and we make sure we only use data relevant to the task,” said Chaffey.
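Chaffey does not detail the implementation, but reweighting, one of the techniques he names, is straightforward to sketch. This is a minimal illustration under stated assumptions: the records, the "group" field, and the data are hypothetical examples, not hackajob’s pipeline:

```python
# A minimal sketch of one preprocessing technique Chaffey names: reweighting
# a dataset so each group contributes equally during model training.
# The field names ("group", "label") and data are illustrative assumptions.
from collections import Counter

def group_weights(records: list[dict]) -> dict[str, float]:
    """Assign each group an inverse-frequency weight so that
    underrepresented groups carry the same total weight as others."""
    counts = Counter(r["group"] for r in records)
    total = len(records)
    n_groups = len(counts)
    # Weight = total / (n_groups * group_count): group totals come out equal.
    return {g: total / (n_groups * c) for g, c in counts.items()}

if __name__ == "__main__":
    data = [
        {"group": "A", "label": 1}, {"group": "A", "label": 0},
        {"group": "A", "label": 1}, {"group": "B", "label": 1},
    ]
    weights = group_weights(data)
    # {'A': 0.666..., 'B': 2.0}: B's single record now carries as much
    # total weight as A's three records combined.
    print(weights)
```

Resampling achieves the same balance by duplicating or dropping records instead of weighting them; both approaches act on the data before any model sees it.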
But Rosemarie Wilson, a mindset and leadership coach and keynote speaker, says that if we lack quality, non-biased data, we must look at what’s behind that lack and address it.
“We need to step back from the novelty and excitement of what AI can achieve and rethink in some areas, so we don’t continue to build on the bias that already exists in this emerging technology, and to ensure we have enough relevant data so that AI is developed and contributed to by all, for all,” said Wilson.
“We’ve been talking about diversity in tech for a long time; some say far too long without enough action,” said Wilson. “But I’d rather see slow-paced change ensuring strong foundations that can be built on, if it means hiring and recruitment demonstrate a true reflection of diversity.”
“I’ve seen tech being used to remove basic sources of bias by anonymizing resumés for details such as name, age, headshots and nationality alongside modifying or creating gender-neutral and more inclusive job descriptions and postings.”
Wilson says she knows companies that use tech to enable candidates to notify them if they prefer a non-standard interview and have also devised tests in a technical format for candidates who struggle with the interview process.
“Overall, they’ve adjusted the conventional hiring process to support neurodiverse candidates,” added Wilson.
Wilson says hackajob is making great strides to ensure more diversity in tech. “It’s been a long time in planning, and they’re also aware of how important data is, so they’ve been collating and measuring data to continue improving, which is essential for quality data sets,” adds Wilson.
“They work with different organizations at scale for tech recruitment, and from what I’ve seen, they have a long-term approach also to encourage new perspectives and thinking within these organizations when it comes to diversity hiring,” said Wilson.
Addressing bias in data sets
Opal says that to address bias in AI data sets, we need to start with the individuals behind the data collection.
“Some AI systems use data that reflect humanity: the good, the bad and the judgmental,” said Opal. “We need to ensure that we are collecting and using data from multiple diverse sources.”
Opal adds that companies need to be honest with their users about the data they use and where it is sourced, especially if the data has biases that can cause harm to minorities trying to get into the industry.
“One of the best options to address data bias is to involve minorities in building AI systems that could impact them the most,” said Opal. “According to the Women In Tech UK Survey 2023, women represent 26% of people working in tech, which is quite a small increase since 2019, when the figure was 19%.”
And Opal says that on top of those statistics, Black women make up only 0.7% of the tech industry in the UK.
For Opal, hiring, training and collaborating with organizations predominantly working with underrepresented groups, such as the LGBTQIA+ and neurodivergent communities, are key. “This is so the tech industry can ensure that AI biases are mitigated to make the growing lists of AI systems more inclusive for all those using them in the future,” adds Opal.
“We need to invite people into the room to provide expertise as AI systems are being created that can impact them greatly,” said Opal. “This is one of many ways to encourage education between the AI engineers or researchers and the underrepresented individuals.”
Chaffey says that of the candidates who sign up for hackajob, around 80% proactively provide data about their ethnicity, gender, etc. “Also, hackajob doesn’t track through to final hires, so the company does not have data on conversions,” said Chaffey.
“Hackajob’s matching engine understands the company’s hiring priorities and can prioritize those candidates, but there is no way for the company to know what candidates on the list are male or female – that data is not served to the company,” said Chaffey. “Hackajob is also NOT excluding candidates from the lists of candidates they serve up to companies.”
Chaffey adds that it’s crucial to explain that hackajob sorts candidates based on objectives, grouping candidates who match the role to a similar degree into bands.
“So if a company wants to increase non-males in their tech workforce, hackajob doesn’t put all non-males at the top of the list, but where they match 40 candidates to the role, and the top 10 have a similar suitability for the role, they prioritize non-males within that band, then if the next eight are one rank down, they prioritize non-males within that band, and so on,” added Chaffey.
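In code terms, the approach Chaffey describes amounts to reordering within suitability bands rather than across the whole list. The sketch below is a rough illustration under stated assumptions: the banding rule (rounding the match score) and the field names are hypothetical, not hackajob’s implementation:

```python
# A rough sketch of the banded prioritization Chaffey describes:
# candidates are grouped into bands of similar suitability, and only
# within each band are underrepresented candidates moved up.
# Field names and the banding rule are illustrative assumptions.
from itertools import groupby

def prioritize(candidates: list[dict], underrepresented: set[str]) -> list[dict]:
    """Sort by match-score bands; within a band, underrepresented first.
    Candidates never move between bands, so overall suitability order holds."""
    ranked = sorted(candidates, key=lambda c: -c["score"])
    result = []
    # Band candidates by rounded score (a hypothetical similarity rule).
    for _, band in groupby(ranked, key=lambda c: round(c["score"], 1)):
        band = list(band)
        # Stable partition: underrepresented candidates first within the band.
        result += [c for c in band if c["group"] in underrepresented]
        result += [c for c in band if c["group"] not in underrepresented]
    return result

if __name__ == "__main__":
    pool = [
        {"name": "A", "score": 0.92, "group": "male"},
        {"name": "B", "score": 0.91, "group": "non-male"},
        {"name": "C", "score": 0.78, "group": "male"},
        {"name": "D", "score": 0.77, "group": "non-male"},
    ]
    for c in prioritize(pool, {"non-male"}):
        print(c["name"], c["score"])  # B, A (top band), then D, C
```

The key property is that a weaker candidate is never lifted above a clearly stronger one; the reordering only happens among candidates the matching engine already considers comparable.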
But Wilson reiterates that if we don’t have quality, non-biased data, we need to look at what’s behind this lack and address it, stepping back from the novelty and excitement of what AI can achieve and rethinking in some areas so we don’t continue to build on the bias that already exists in this emerging technology.
“I’m no AI strategist, but we need to look at things such as what’s missing or being omitted,” said Wilson. “And are we doing enough to educate and demonstrate why we need to ask questions relating to data for diversity, and to quell the hysteria around AI and how harmful it is, especially for collecting this type of data and what it’s to be used for?”
What can the future look like?
Opal has shared her story on many national and international stages, from hackajob’s DevLab Conferences and Tech Show London to the Eye to Eye Conference in Denver.
“All this to advocate for others like myself to be seen, supported, and encouraged in the tech industry as well as for others to be educated and reflect within themselves on how they can be better humans for those around them, their colleagues in particular,” said Opal.
“I’d love to see a future where the desire to be seen and respected in the tech industry is a recognized expectation from everyone and not seen as a challenging request,” said Opal. “I would love to see a future where technology does not discriminate and exclude but rather invites diversity, creates inclusion, and encourages the underrepresented in many ways, especially in the fast-evolving tech industry we are a part of.”
Chaffey says there is no doubt that humans will always need to be involved in recruiting, and that humans are inherently biased, which creates challenges.
“hackajob helps in that by tracking conversions at every stage of the process,” said Chaffey. The company says the new AI feature prioritizes candidates based on diversity and inclusion objectives, while still ensuring the protection of candidates’ confidentiality.
Opal says that, unfortunately, she doesn’t often see women promoted in their roles, especially Black women.
“Since the beginning of my career as a Software Engineer, I have never worked with a Black woman engineer, and I hope to experience that in the future,” said Opal. “There is something so inspiring and encouraging about working with someone who looks like you. I’ve worked and learned from many talented and experienced engineers, but working with someone who is also part of the 0.7% of Black women working in tech in the UK would be such a special moment for me.”
Wilson says she would like to see more diverse talent in a broader sense working in tech teams, not only in terms of gender, ethnicity, or neurodiversity. She also wants hiring done better, so that when people are hired they are made to feel they belong and deserve to be there, and are given opportunities to develop rather than being treated like outsiders.
“I’ve seen more women going into tech alongside some shift upward in neurodiversity numbers,” said Wilson. “But I’m also seeing candidates leaving the space because the infrastructure or company culture hasn’t caught up to help new hires feel comfortable or supported enough to experience a sense of inclusion and belonging.”
“If we are saying diversity is important in the hiring process, we also need to ensure the culture or infrastructure they’re entering is also set up to nurture and support them,” adds Wilson.
“I hope that more tech recruitment companies develop smart and fairer ways to ensure diverse candidates are provided with greater opportunities and that they work alongside organizations to implement better strategies and policies in the hiring process,” said Wilson.
Wilson believes that organizations and tech teams need a vision and long-term approach to their strategies, along with maintaining equity in hiring, further support, and job promotions.
“Hiring for diversity is not a straight road, but we’ve been talking about this topic for a long time, so it is good to see some traction happening,” said Wilson. “We still have a way to go as I would also like to see a change in human perspectives and behavior for what inclusion means, especially when discussing diversity.”