Friday, November 22, 2024

Artificial Intelligence: Diversity Solution or Dubious Excuse?

By Nazir Awad, Writing Analyst at Unicast Entertainment

Artificial intelligence (AI) has taken most industries by storm. It has become the card up employers’ sleeves, taking tasks like data collection, advertising, scientific data extrapolation, and pattern analysis to a whole different league. I could go on forever; at this point it is redundant to introduce the reader to the untapped potential of AI. One specific use of AI has been employee recruitment and resume screening. Recruitment has been a thorn in the side of employers for decades. From the tedium of going through hundreds of resumes to the challenge of remaining impartial and unbiased, hiring was never a walk in the park. AI can help with much of that. It can go through resumes within seconds, scoring each candidate against the qualities that are most suitable for the position in question. It can also seek out candidates on platforms like LinkedIn and Indeed, making hiring faster and paving the way for a more thorough process. All of these benefits carry real merit, yet the AI perk celebrated more than any other is how it enables corporations to eliminate elements of potential bias and create a more inclusive and fairer hiring process; that is what I would like to speak to in the next few paragraphs.
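
To make the idea concrete, a resume screener of this kind ultimately boils down to scoring candidates against the qualities a position calls for and keeping the top of the list. The sketch below is a hypothetical, deliberately simplified illustration of that scoring step; the skills, weights, and function names are my own assumptions for the example, not any particular vendor’s product.

```python
# Hypothetical illustration only: a bare-bones resume scorer.
# Real screening tools use far richer models; the skills, weights,
# and resume format here are assumptions made for this example.

REQUIRED_SKILLS = {"python": 3.0, "sql": 2.0, "machine learning": 2.5}
MIN_YEARS_EXPERIENCE = 2


def score_resume(resume_text: str, years_experience: int) -> float:
    """Score a resume by weighted keyword matches plus an experience bonus."""
    text = resume_text.lower()
    score = sum(weight for skill, weight in REQUIRED_SKILLS.items() if skill in text)
    if years_experience >= MIN_YEARS_EXPERIENCE:
        score += 1.0
    return score


def shortlist(candidates: list[dict], top_n: int = 10) -> list[dict]:
    """Rank a pile of resumes in seconds and keep only the strongest few."""
    ranked = sorted(
        candidates,
        key=lambda c: score_resume(c["resume"], c["years_experience"]),
        reverse=True,
    )
    return ranked[:top_n]
```

The speed advantage is obvious: a function like this can rank thousands of resumes in the time it takes a recruiter to read one. The fairness question, which the rest of this piece is about, is a different matter entirely.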

Racial, gender, and sexual inequality have long been topics of discussion and sit at the forefront of the liberal and humanitarian agenda. Naturally, novel technologies and solutions gravitate toward eliminating these kinks in the system. In theory, introducing artificial intelligence into the recruitment process should strip out any residual or subconscious bias and result in a more diverse and fairer workplace. That hypothesis, however, rests on several assumptions. The first is that the algorithms making up the core logic of the AI system are themselves bias-free. The second is that, absent systematic discrimination in hiring, the demographic ratios of racial, sexual, and gender groups in the workplace would mirror those of the broader population. The last is that the lack of diversity and inclusivity stems mainly from the preliminary screening stage, where resumes are picked out of a larger pool for interviews, rather than from the interview process itself. If these assumptions are not realistic and factual, the hypothesis fails, and AI is not the diversity solution it is thought to be.

The problem with such recruitment systems is that these assumptions do not hold in practice. For starters, the algorithms are written and tuned by humans who can carry biases of their own; introducing an AI system to eliminate subconscious bias is self-contradictory if the programmers of that system suffer from the same phenomenon. The notion that choosing employees on merit alone, rather than on identity, would produce a workplace with evened-out ratios is also not very accurate. Take a software engineering firm, for example. Statista estimated that 91.5% of the pool of software engineers are men, while 8% identify as female. Regardless of whether that gap is the result of a difference in aptitude between the two genders or of societal stigma, the fact remains that female software engineers are scarce, and above-average female software engineers are even harder to find and hire. A company would then need to go out of its way, and sometimes disregard merit, to achieve a balanced work environment. Some industries lean heavily toward one group over another for whatever reason (good or bad), and AI may well make that difference clearer and even widen the gap, since companies would now have a very solid alibi for dismissing diversity-related complaints. The short simulation below illustrates the point.
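
Here is a hypothetical back-of-the-envelope simulation of that argument. It assumes a candidate pool that is 91.5% male (the Statista figure cited above), gives every candidate a merit score drawn from the same distribution, and lets a purely merit-based screener pick the top 100. The numbers and distributions are assumptions for illustration only.

```python
# Back-of-the-envelope simulation: a merit-only screener applied to an
# imbalanced candidate pool reproduces the pool's imbalance rather than
# evening out the ratios. All figures here are illustrative assumptions.

import random

random.seed(0)

POOL_SIZE = 10_000
SHARE_MALE = 0.915        # assumption taken from the Statista figure above
SHORTLIST_SIZE = 100

pool = [
    {
        "gender": "male" if random.random() < SHARE_MALE else "female",
        "merit": random.gauss(0, 1),   # identical merit distribution for everyone
    }
    for _ in range(POOL_SIZE)
]

# A "perfectly unbiased" screener: rank by merit alone, keep the top 100.
shortlist = sorted(pool, key=lambda c: c["merit"], reverse=True)[:SHORTLIST_SIZE]

share_female = sum(c["gender"] == "female" for c in shortlist) / SHORTLIST_SIZE
print(f"Female share of shortlist: {share_female:.1%}")
```

Run it and the female share of the shortlist hovers around the same 8–9% as the pool itself. In other words, even a genuinely bias-free screener cannot conjure a balanced workforce out of an unbalanced pipeline; it simply mirrors what it is fed.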

The last assumption I mentioned was that the lack of diversity is a problem of the hiring process’s systemic integrity; I would argue this is not necessarily true. If a corporation’s executives are inherently and subconsciously biased, then introducing a system that gives them a way out of confronting those biases will not do the job. Sure, candidates from a wider array of ethnicities might get interviews, or even jobs, but that does not mean the workplace will be more diverse. Interviews are still conducted by human beings who might not be the fairest; and even if interviews are skipped and the “unbiased” AI system picks the top candidates to hire, those hires may still encounter a hostile or uncomfortable working environment that increases employee turnover and thus erodes diversity. Jennifer Tardy, CEO of JTC, discussed this in her interview with Unicast Entertainment, noting that some organizations end up using such technologies to replace the need to train recruiters and executives in the fundamentals of recognizing and eliminating bias. After all, AI systems have the potential to exponentially increase an organization’s productivity and efficiency, but they should never be a way out of solving the roots of our deep-lying issues.

To hear more of Jennifer Tardy’s thoughts on AI in the HR market and the labor industry overall, you can watch her full interview with Unicast here: https://youtu.be/jBx902d2nCs

Unicast Entertainment

Unicast Entertainment is a global digital platform celebrating and gaining insight into the success of remarkable individuals across the professional spectrum.
