The head of the US agency charged with enforcing civil rights in the workplace says artificial intelligence-driven “bossware” tools that closely track the location, keystrokes and productivity of workers could also violate discrimination laws.
Charlotte Burrows, chair of the Equal Employment Opportunity Commission, told The Associated Press that the agency is working to educate employers and technology providers about their use of these surveillance tools, as well as AI tools that streamline the work of evaluating job prospects.
And if they are not careful with, say, draconian schedule-monitoring algorithms that penalize breaks for pregnant women or Muslims taking time to pray, or with flawed software that screens out graduates of women’s or historically Black colleges, they can’t blame AI when the EEOC comes calling.
“I’m not shy about using our enforcement authority when necessary,” Burrows said. “We want to work with employers, but there’s certainly no exception to civil rights laws because you’re engaging in discrimination in a high-tech way.”
The federal agency released its latest set of guidance Thursday on the use of automated systems in employment decisions such as who to hire or promote. It explains how to interpret a key provision of the Civil Rights Act of 1964 known as Title VII that prohibits employment discrimination based on race, color, national origin, religion or sex, which includes bias against gay, lesbian and transgender workers.
Burrows said a key example involves widely used resume screeners and whether they can produce biased results if they are built on biased data.
“What happens is that there’s an algorithm that looks for patterns that it’s already familiar with,” she said. “It’s going to be trained on data that comes from its current employees. And if you have a non-diverse set of current employees, you’re likely to end up unintentionally screening out people who are not like your current employees.”
Amazon, for example, abandoned its own resume-scanning tool for recruiting top talent after finding it favored men for technical roles — in part because it compared job candidates against the company’s own tech workforce.
Other agencies, including the Department of Justice, sent similar warnings last year, along with previous sets of guidance about how some AI tools could discriminate against people with disabilities and violate the Americans with Disabilities Act.
In some cases, the EEOC has taken action. In March, the operator of the tech job-search website Dice.com settled with the agency to end an investigation into allegations that it allowed job posters to exclude workers of US national origin in favor of immigrants seeking work visas. To settle the case, the parent company, DHI Group, agreed to rewrite its programming to “scrape” for discriminatory language such as “H-1Bs Only,” a reference to a type of work visa.
Much of the EEOC’s work involves investigating complaints filed by employees who believe they have been discriminated against. And while it is difficult for job applicants to determine whether a biased hiring tool caused them to be denied a job, Burrows said there is “generally more knowledge” among workers about the tools increasingly used to monitor their productivity.
These tools range from radio-frequency devices that track nurses, to systems that monitor the minute-by-minute, tightly controlled schedules of warehouse workers and delivery drivers, to software that tracks keystrokes or computer mouse clicks as many office workers have shifted to working from home during the pandemic. Some may violate civil rights laws, depending on how they are used.
Burrows noted that the National Labor Relations Board is also looking into such AI tools. The NLRB sent a memo last year warning that overly intrusive surveillance and management tools could harm workers’ rights to communicate with one another about union activity or unsafe conditions.
“I think the best practice out there — I’m not saying don’t use it, it’s not illegal — is to really think about what employers are looking to measure and maybe measure that directly,” Burrows said. “If you’re trying to see if the work is getting done, maybe check whether the work is getting done.”