Opinion: A resume, a cover letter and access to your brain? The scary race to read employees’ minds

Modern workers are increasingly finding that companies are no longer satisfied with considering their resumes, cover letters and job performance. More and more employers want to assess their brains as well.

Companies screen potential job candidates with technology-enhanced cognitive and personality tests, use wearable technology to monitor brain activity at work, and rely on artificial intelligence to make hiring, promotion and firing decisions. The brain is becoming the ultimate sorting hat in the workplace, a technological version of the magical device that sorts young wizards into Hogwarts houses in the Harry Potter series.

Businesses selling technology tools designed to assess applicants’ brains promise to “dramatically increase your hiring quality” by measuring the “basic building blocks of how we think and behave.” They even claim their tools can reduce bias in hiring by “relying only on cognitive skills.”

However, research has shown that such assessments can produce racial disparities “three to five times larger than other predictors of job performance.” If social and emotional tests are part of the battery, they can also screen out people with autism and other neurodiverse candidates. And applicants may be expected to disclose their thoughts and emotions through AI-based gamified recruitment tools without fully understanding the implications of the data being collected. With ongoing studies indicating that more than 40% of companies use cognitive ability assessments in hiring, the practice is rightly drawing scrutiny from federal labor regulators.

Once employees are hired, new wearables are bringing brain monitoring into workplaces worldwide, tracking attention and scoring productivity on the job. The SmartCap detects employee fatigue, Neurable’s Enten headphones promote concentration, and Emotiv’s MN8 earbuds promise to “monitor your employees’ stress and alertness levels with … proprietary machine learning algorithms,” though the company assures that they “cannot read minds or emotions.”

The increasing use of brain-based wearables in the workplace will undoubtedly put pressure on managers to use the insights gained in hiring and promotion decisions. We are susceptible to the seductive appeal of neuroscientific explanations for complex human phenomena, and to measurement even when we don’t know exactly what we are measuring.

Reliance on AI-based cognitive and personality tests can lead to oversimplified explanations of human behavior that ignore the broader social and cultural factors that shape the human experience and predict success in the workplace. A cognitive assessment for a software developer may test spatial and analytical skills but ignore the ability to work with people from different backgrounds. There is a great temptation to reduce human thought and feeling to puzzle pieces that can be sorted into their proper place.

The US Equal Employment Opportunity Commission seems to have recognized these potential problems. It recently released draft guidelines on “technology-related discrimination in the workplace,” including the use of technology in “recruitment, selection or production and performance management tools.”

While the commission has yet to clarify how employers can comply with anti-discrimination laws when using technology assessments, it must ensure that cognitive and personality tests are limited to work-related skills so they don’t invade employees’ intellectual privacy.

The growing power of these tools may tempt employers to “hack” candidates’ brains and screen them for beliefs and biases, so long as such decisions are not based directly on protected characteristics and therefore not unlawfully discriminatory. Facebook “likes” can already be used to infer sexual orientation and race with considerable accuracy. Political affiliations and religious beliefs are also easy to discern. As wearables and brain wellness programs begin to track mental processes over time, age-related cognitive decline will become detectable as well.

All of this points to the urgent need for regulators to develop specific rules for the use of cognitive and personality testing in the workplace. Employers should be required to obtain informed consent from applicants before they undergo cognitive and personality assessments, including clear disclosure of how applicant data is collected, stored, shared and used. Regulators should also require that assessments be regularly tested for validity and reliability to ensure they are accurate, repeatable and related to job performance and outcomes, and not overly sensitive to factors such as fatigue, stress, mood or medication.

Assessment tools should also be reviewed regularly to ensure that candidates are not discriminated against based on age, gender, race, ethnicity, disability, thoughts or emotions. And companies that develop and administer these tests must regularly update them to reflect changing contextual and cultural factors.

More generally, we need to examine whether these methods of evaluating applicants encourage overly reductionist views of human ability. This is especially true as the skills of human workers are increasingly compared with those of generative AI.

While the use of cognitive and personality assessments is not new, the increasing sophistication of neurotechnology and AI-based tools to decipher the human brain raises important ethical and legal questions about cognitive freedom.

The mind and personality of the employees must enjoy the strictest protection. While these new tests may offer employers some benefits, they should not come at the expense of employees’ privacy, dignity and freedom of thought.

Nita Farahany is a professor of law and philosophy at Duke University and author of The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology.

Author: Nita Farahany

Source: LA Times
