How your boss might be using tech to peer into your brain

Modern workers are finding that companies no longer consider only their résumés, cover letters and job performance. Increasingly, employers want to assess their brains.

Companies screen potential candidates with technology-assisted cognitive and personality tests, deploy wearable technology to monitor brain activity at work, and use artificial intelligence to make decisions about hiring, promoting and firing people. The brain is becoming the ultimate workplace sorting hat – a technological version of the magical device that assigns young wizards to houses at Hogwarts in the “Harry Potter” series.

Companies touting tech tools to assess candidates’ brains promise to “significantly increase the quality of your hires” by measuring the “basic elements of how we think and act.” They claim their tools can even reduce hiring bias by “relying solely on cognitive abilities.”

But research has shown that such assessments can produce racial disparities “three to five times more than other predictors of job performance.” When social and emotional testing is part of the battery, it can also screen out autistic and otherwise neurodivergent candidates. And applicants may be required to reveal their thoughts and emotions through gamified, AI-based recruiting tools without fully understanding the implications of the data being collected. With recent surveys showing that more than 40% of companies use cognitive ability assessments in hiring, federal employment regulators have begun to pay attention.

Once workers are hired, new wearable devices are bringing brain assessment into workplaces worldwide, monitoring attention and rating productivity on the job. The SmartCap monitors worker fatigue, Neurable Enten headphones promote concentration, and Emotiv MN8 earbuds promise to monitor “your employees’ stress and attention levels using proprietary machine-learning algorithms” – though, the company assures, they “can’t read thoughts or feelings.”

The growing use of brain-focused wearable devices in the workplace will inevitably pressure managers to use the information they gather to inform hiring and promotion decisions. We are vulnerable to the allure of neuroscientific explanations for complex human phenomena, drawn to measurement even when we don’t know what we should be measuring.

Relying on AI-based cognitive and personality tests can lead to simplistic explanations of human behavior that ignore the broader social and cultural factors that shape human experience and predict workplace success. A cognitive assessment for a software engineer may test spatial and analytical skills but ignore the ability to collaborate with people from diverse backgrounds. The temptation is to reduce human thoughts and feelings to puzzle pieces that can be sorted into the right slots.

The US Equal Employment Opportunity Commission seems to be aware of these potential problems. It recently published draft enforcement guidelines addressing “technology-related employment discrimination,” including the use of technology in “recruitment, selection, or production and performance management tools.”

Although the commission has not yet clarified how employers can comply with anti-discrimination laws when using technology-based assessments, it should ensure that cognitive and personality tests are limited to job-related skills, lest they intrude on workers’ mental privacy.

The growing power of these tools may tempt employers to “hack” candidates’ brains and filter applicants based on beliefs and biases, on the assumption that such decisions are not unlawfully discriminatory because they are not directly based on protected characteristics. Facebook “likes” can already be used to infer sexual orientation and race with considerable precision. Political affiliation and religious beliefs are just as easily identifiable. As wearable devices and brain wellness programs begin to track mental processes over time, age-related cognitive decline will also become detectable.

All of this points to an urgent need for regulators to develop specific rules governing the use of cognitive and personality tests in the workplace. Employers should be required to obtain informed consent from candidates before they undergo cognitive and personality assessments, including clear disclosure of how candidate data is collected, stored, shared and used. Regulators should also require that assessments be regularly tested for validity and reliability, to ensure they are accurate, reproducible and linked to job performance and outcomes – and that they are not unduly sensitive to factors such as fatigue, stress, mood or medication.

Assessment tools should also be regularly audited to ensure they do not discriminate against applicants based on age, gender, race, ethnicity, disability, thoughts or emotions. And the companies that develop and administer these tests need to update them regularly to account for changing contextual and cultural factors.

More broadly, it is worth asking whether these methods of evaluating job candidates promote overly simplistic views of human capabilities. That question is especially pressing as the capabilities of human workers are increasingly compared with those of generative AI.

Although the use of cognitive and personality assessments is not new, the growing sophistication of neurotechnology and AI-based tools to decode the human brain raises important ethical and legal questions about cognitive freedom.

Workers’ minds and personalities deserve the strongest protection. While these new tests may offer some benefits to employers, they should not come at the expense of workers’ privacy, dignity and freedom of thought.

Nita Farahany is a professor of law and philosophy at Duke University and author of “The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology.”
