What are the problems with introducing AI into recruitment tests and how can it be implemented safely?



The idea is that by using AI, corporate recruiters can review resumes without being influenced by bias or personal preferences, increasing the fairness and consistency of decision-making. Researchers from Massey University examine whether using AI in recruitment is actually a fair and desirable practice, and what to watch out for when using it.

What will a robot make of your resume? The bias problem with using AI in job recruitment
https://theconversation.com/what-will-a-robot-make-of-your-resume-the-bias-problem-with-using-ai-in-job-recruitment-231174



A paper published in the International Business Journal in June 2023, titled 'The Power of Artificial Intelligence (AI) in Recruiting,' showed that AI-based recruitment strategies such as resume screening, video interview assistance, predictive analysis, and social media analysis can improve efficiency, reduce costs, and result in higher-quality hires. The underlying belief is that AI in recruitment can eliminate human bias and increase the fairness and consistency of decision-making, thereby improving objectivity and efficiency in the hiring process.

On the other hand, there is also the view that AI itself reinforces prejudice: computer science experts point out that 'image generation models tend to promote stereotypical prejudice.' Some papers state that image generation AI, which produces images from short prompts, 'amplifies demographic stereotypes on a large scale' regarding race, gender, occupation, and so on. Research has also shown that AI can misinterpret data and exclude minorities when it is retrained on content that AI itself generated.

Researchers warn that the surge in AI-generated output has created a 'loop in which AI learns from AI-generated content,' causing 'model collapse' - GIGAZINE



By interviewing 22 human resources personnel, Merika Soleimani, a senior data analyst at Massey University, and Ali Intezari, a business lecturer at the University of Queensland, identified two types of bias in hiring. The first is 'stereotype bias,' where decisions are influenced by stereotypes about a particular group, such as assuming that women should be prioritized for a certain job type, or that someone with a particular background would be a good fit for a given team. In many cases these stereotypes embody discriminatory beliefs, whether conscious or unconscious.

The second is 'similarity bias': recruiters tend to favor candidates who share their own backgrounds, hobbies, and interests. These biases are thought to have a significant impact on the fairness of the hiring process, and incorporating AI into hiring is expected to eliminate them. However, Soleimani points out that the past hiring data used to train AI contains these same biases, so the trained AI will be biased as well. 'These biases are deeply rooted in society. To ensure fairness in both human-driven and AI-driven hiring processes, careful planning and monitoring are needed to mitigate deep-rooted biases,' Soleimani said.
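As a minimal illustration of how this happens, consider the following toy sketch (synthetic data and a deliberately naive model, not the researchers' data or any real hiring system): equally skilled candidates from two hypothetical groups, where past recruiters applied a higher bar to group "B".

```python
import random

random.seed(0)

# Hypothetical historical bias: group "B" faced a higher hiring bar
# than group "A", even though both groups' skills are identically distributed.
HIRE_THRESHOLD = {"A": 0.4, "B": 0.7}

def make_history(n=1000):
    """Generate synthetic past hiring records: (group, skill, hired)."""
    history = []
    for _ in range(n):
        group = random.choice(["A", "B"])
        skill = random.random()                 # same distribution for both groups
        hired = skill > HIRE_THRESHOLD[group]   # biased historical decision
        history.append((group, skill, hired))
    return history

def train(history):
    # Deliberately naive "model": it learns each group's historical hire
    # rate, so it reproduces whatever bias is baked into the training data.
    counts, hires = {}, {}
    for group, _skill, hired in history:
        counts[group] = counts.get(group, 0) + 1
        hires[group] = hires.get(group, 0) + int(hired)
    return {g: hires[g] / counts[g] for g in counts}

learned_rates = train(make_history())
print(learned_rates)
```

Because the model learns only from biased outcomes, its learned hire rates mirror the historical disparity (roughly 0.6 for "A" versus 0.3 for "B") even though both groups' skills were drawn from the same distribution.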



The researchers also interviewed 17 AI developers and asked them about ideas for developing an AI hiring system that could mitigate bias in hiring rather than exacerbate it. The model that emerged was one in which human resource experts and AI programmers exchange information to challenge preconceived notions as they explore data sets and develop algorithms.

However, Soleimani points out that this model is difficult to realize. 'Our findings reveal that the difficulty in implementing such a model lies in the educational, occupational and demographic differences that exist between HR professionals and AI developers. HR professionals are traditionally trained in human resource management and organizational behavior, while AI developers have data science and technology skills. These differences hinder effective communication and mutual understanding, and can lead to misunderstandings and disagreements when the two groups work together,' Soleimani explains.

Therefore, setting up an AI recruitment system first requires training programs that teach HR professionals about system development and AI, while also educating AI developers on recruitment practices, so that the two groups can bridge the gap in their collaboration and develop strategies to identify and mitigate bias.



Developing 'appropriate, bias-mitigated datasets' and training AI from them will also be essential for future systems. HR professionals and AI developers must work together to ensure that the data used in AI-driven recruiting processes is diverse and representative of different demographic groups.
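One common concrete check for whether a screening process treats demographic groups comparably is the 'four-fifths rule' from US employee-selection guidance: no group's selection rate should fall below 80% of the highest group's rate. The sketch below is a hypothetical audit helper, not part of the researchers' proposal:

```python
def selection_rates(records):
    """records: iterable of (group, was_selected) pairs -> rate per group."""
    counts, selected = {}, {}
    for group, was_selected in records:
        counts[group] = counts.get(group, 0) + 1
        selected[group] = selected.get(group, 0) + int(was_selected)
    return {g: selected[g] / counts[g] for g in counts}

def passes_four_fifths(rates):
    # Each group's selection rate must be at least 80% of the
    # best-treated group's rate.
    top = max(rates.values())
    return all(rate / top >= 0.8 for rate in rates.values())

# Hypothetical screening outcome: group A selected at 60%, group B at 30%.
sample = ([("A", True)] * 60 + [("A", False)] * 40
          + [("B", True)] * 30 + [("B", False)] * 70)
rates = selection_rates(sample)
print(rates, passes_four_fifths(rates))  # 0.30 / 0.60 = 0.5, below 0.8
```

Running such a check on both the training data and the model's outputs gives HR professionals and AI developers a shared, quantitative signal for the bias monitoring the researchers call for.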

Finally, countries must establish guidelines and ethical standards for the use of AI in recruitment that help build trust and ensure fairness, and organizations should articulate policies that promote transparency and accountability in AI-driven decision-making processes and make them visible to those participating in recruitment.

'Taking these steps can help create a more inclusive and fair hiring system that leverages the strengths of both HR professionals and AI developers,' the researchers conclude.

in Software, Web Service, Posted by log1e_dh