Promise and Perils of Using AI for Hiring: Guard Against Data Bias

by SkillAiNest


By the AI Trends Staff

While AI in hiring is now widely used for writing job descriptions, screening candidates, and automating interviews, it poses a risk of broad discrimination if not implemented carefully.

Keith Sonderling, Commissioner, US Equal Employment Opportunity Commission

That was the message from Keith Sonderling, Commissioner with the US Equal Employment Opportunity Commission, speaking at the AI World Government event held live and virtually in Alexandria, Va., last week. Sonderling is responsible for enforcing federal laws that prohibit discrimination against job applicants because of race, color, religion, sex, national origin, age, or disability.

“The notion that AI would become mainstream in HR departments was closer to science fiction two years ago, but the pandemic has accelerated the rate at which AI is being used by employers,” he said. “Virtual recruiting is now here to stay.”

It is a busy time for HR professionals. “The great resignation is leading to the great rehiring, and AI will play a role in that like we have not seen before,” Sonderling said.

AI has been employed in hiring for years. “In short, AI is now making all the decisions once made by HR personnel,” he said, without characterizing that as good or bad.

“Carefully designed and properly used, AI has the potential to make the workplace more fair,” Sonderling said. “But implemented carelessly, AI can discriminate on a scale we have never seen before by an HR professional.”

Training Datasets for AI Models Used for Hiring Need to Reflect Diversity

This is because AI models rely on training data. If a company’s current workforce is used as the basis for training, “it will replicate the status quo. If it’s one gender or one race primarily, it will replicate that,” he said. Conversely, AI can help mitigate the risk of hiring bias by race, ethnic background, or disability status. “I want to see AI improve on workplace discrimination,” he said.

Amazon began building a hiring application in 2014 and found over time that it discriminated against women in its recommendations, because the AI model had been trained on the company’s own hiring records from the previous 10 years, which were primarily of men. Amazon’s developers tried to correct it, but ultimately scrapped the system in 2017.
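The dynamic Sonderling describes, a model trained on a skewed workforce replicating the status quo, can be sketched in a few lines. This is an illustrative toy, not any vendor's system: a naive screening "model" that memorizes historical hire rates per group simply reproduces the historical imbalance when scoring new candidates. The group labels and numbers are invented for illustration.

```python
from collections import defaultdict

# Hypothetical historical hiring records: (group, was_hired).
# Men were hired at 60%, women at 20% in this invented history.
history = (
    [("men", True)] * 60 + [("men", False)] * 40
    + [("women", True)] * 20 + [("women", False)] * 80
)

def fit_hire_rates(records):
    """'Train' by memorizing each group's historical hire rate."""
    hired = defaultdict(int)
    total = defaultdict(int)
    for group, was_hired in records:
        total[group] += 1
        hired[group] += was_hired  # True counts as 1
    return {g: hired[g] / total[g] for g in total}

model = fit_hire_rates(history)
# The "model" now scores women at one third the rate of men,
# replicating the status quo rather than correcting it.
print(model)
```

Any real system is far more complex, but the failure mode is the same: if the training labels encode past bias, the learned scores carry it forward.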

Facebook recently agreed to pay $14.25 million to settle civil claims by the US government that the social media company discriminated against American workers and violated federal recruitment rules, according to an account from Reuters. The case centered on Facebook’s use of what it called its PERM program for labor certification. The government found that Facebook declined to hire American workers for jobs that had been reserved for temporary visa holders under the program.

“Excluding people from the hiring pool is a violation,” Sonderling said. If the AI program “withholds the existence of the job opportunity from that class, so they cannot exercise their rights, or if it downgrades a protected class, it is within our domain,” he said.

Employment assessments, which became more common after World War II, have provided high value to HR managers, and with help from AI they have the potential to minimize bias in hiring. “At the same time, they are vulnerable to claims of discrimination, so employers need to be careful and cannot take a hands-off approach,” Sonderling said. “Inaccurate data will amplify bias in decision-making. Employers must be vigilant against discriminatory hiring.”

He recommended researching solutions from vendors who vet data for risks of bias on the basis of race, sex, and other factors.

One example is HireVue of South Jordan, Utah, which has built a hiring platform predicated on the US Equal Employment Opportunity Commission’s Uniform Guidelines, designed specifically to mitigate unfair hiring practices, according to one account.

A post on the company’s website on AI ethical principles states: “Because HireVue uses AI technology in our products, we actively work to prevent the introduction or propagation of bias against any group or individual. We continue to carefully review the datasets used in our work, and strive to build teams with diverse knowledge, experiences, and perspectives to best serve the people our systems assess.”

Also: “Our data scientists and IO psychologists develop HireVue’s assessment algorithms in a way that removes from the algorithm’s consideration data that helps predict adverse impact, without significantly affecting the assessment’s predictive validity. This helps produce a highly valid, bias-mitigated assessment.”
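The Uniform Guidelines that such platforms are built around are commonly operationalized through the "four-fifths rule": adverse impact is indicated when a protected group's selection rate falls below 80% of the highest group's rate. The sketch below is a minimal illustration of that check, with invented numbers; the function names are assumptions, not part of any regulation or product.

```python
def selection_rates(outcomes):
    """outcomes: {group: (selected, applicants)} -> {group: rate}."""
    return {g: sel / apps for g, (sel, apps) in outcomes.items()}

def adverse_impact(outcomes, threshold=0.8):
    """Flag groups whose selection rate falls below `threshold`
    (the four-fifths rule) times the highest group's rate.
    Returns {group: impact_ratio} for flagged groups."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: r / top for g, r in rates.items() if r / top < threshold}

# 50 of 100 men selected vs. 30 of 100 women:
# women's impact ratio is 0.3 / 0.5 = 0.6, below 0.8, so flagged.
flagged = adverse_impact({"men": (50, 100), "women": (30, 100)})
print(flagged)
```

A ratio below 0.8 is evidence of adverse impact under the guidelines, not proof of intent; it is the trigger for closer scrutiny of the selection procedure.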

Dr. Ed Ikeguchi, CEO, AiCure

The issue of bias in datasets used to train AI models is not confined to hiring. Dr. Ed Ikeguchi, CEO of AiCure, an AI analytics company working in the life sciences industry, stated in a recent account in HealthcareITNews: “AI is only as strong as the data it’s fed, and lately that data backbone’s credibility is being increasingly called into question. Today’s AI developers lack access to the large, diverse datasets needed to train and validate new tools.”

He added: “They often need to leverage open-source datasets, but many of these were built using computer programmer volunteers, which is a predominantly white population. Because algorithms are often trained on data samples with limited diversity, they can prove unreliable when applied in the real world to populations of different races, genders, ages, and geographies.”

Also: “There needs to be an element of governance and peer review for all algorithms, as even the most solid and tested algorithm is bound to produce unexpected results at times. An algorithm is never done learning; it must be constantly developed and fed more data to improve.”

And: “As an industry, we need to become more skeptical of AI’s conclusions and encourage transparency in the industry. Companies should readily answer basic questions, such as: How was the algorithm trained? On what basis did it draw this conclusion?”
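One lightweight way to make those basic questions answerable is to ship a provenance record, often called a model card, alongside each model. The sketch below is purely illustrative: the field names and example values are assumptions, not a standard schema or any company's actual practice.

```python
def make_model_card(name, trained_on, population, decision_basis):
    """Build a minimal provenance record for a deployed model,
    covering the transparency questions quoted above."""
    return {
        "model": name,
        "trained_on": trained_on,          # How was the algorithm trained?
        "population": population,          # Who is represented in the data?
        "decision_basis": decision_basis,  # On what basis does it conclude?
    }

# Hypothetical example values for a hiring model.
card = make_model_card(
    name="resume-screener-v1",
    trained_on="internal hiring records, 2014-2024",
    population="historical applicant pool (audit for skew)",
    decision_basis="similarity to previously hired candidates",
)
print(card["model"], "-", card["decision_basis"])
```

Keeping such a record current as the model is retrained gives governance and peer reviewers something concrete to audit.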

Read the source articles and information at AI World Government, from Reuters, and from HealthcareITNews.
