So far, over 3,000 people have applied to one open data science vacancy at a US health tech company this year. The top candidates are given a lengthy and difficult task assessment, which very few pass, says a recruiter at the company, who asked to remain anonymous because they are not authorized to speak publicly.
The recruiter says they believe some of those who did pass may have used artificial intelligence to solve the problem. Some submissions contained odd wording, the recruiter explains, other applicants disclosed using AI, and in one case a candidate who moved on to the next interview couldn’t answer questions about the task. “Not only have they wasted their time, but they wasted my time,” says the recruiter. “It’s really frustrating.”
It’s not uncommon for tech roles to now receive hundreds or thousands of applicants. Round after round of layoffs since late 2022 have sent a mass of skilled tech workers job hunting, and the wide adoption of generative AI has also upended the recruitment process, allowing people to bulk apply to roles. All of those eager for work are hitting a wall: overwhelmed recruiters and hiring managers.
WIRED spoke with seven recruiters and hiring managers across tech and other industries, who expressed trepidation about the new tech: for now, much is still unknown about how and why AI makes the choices it does, and it has a history of making biased decisions. Before embracing it, they want to understand the reasoning behind those decisions and to have more room for nuance. Not all qualified applicants are going to fit into a role perfectly, one recruiter tells WIRED.
Recruiters say they are met with droves of résumés sent through tools like LinkedIn’s Easy Apply feature, which lets people apply for jobs quickly within the site’s platform. Then there are third-party tools that write résumés or cover letters, and generative AI built into the sites of major players like LinkedIn and Indeed, some of it for job seekers, some for recruiters. These come alongside a growing number of tools that automate the recruiting process, leaving some workers wondering whether a person or a bot is looking at their résumé.
“To a job seeker and a recruiter, the AI is a little bit of a black box,” says Hilke Schellmann, whose book The Algorithm looks at software that automates résumé screening and human resources. “What exactly are the criteria of why people are suggested to a recruiter? We don’t know.”
Still, generative AI tools for both recruiters and job seekers are becoming more common. LinkedIn launched a new AI chatbot earlier this year, meant to help people navigate job hunting. The hope was that it would help people better gauge whether they are a good fit for a job, or tailor their résumé to it, peeling back the curtain that separates job seekers from the hiring process.
That came after LinkedIn began rolling out a new set of generative AI tools for recruiters to source candidates in October. With the sourcing tool, recruiters can search a phrase like “I want to hire engineers in Texas,” and profiles of people who may meet those criteria appear, along with other specific skills that may be related to the role. Recruiters can also send messages written with generative AI and set automatic follow-ups. LinkedIn’s data shows that AI-generated messages are accepted about 40 percent more frequently than one-off messages written only by a recruiter.
“We’re really focused on helping to make recruiters’ lives more efficient,” says Peter Rigano, director of product management at LinkedIn. By handing the tedious parts of the job to generative AI, the company hopes recruiters can focus on “more rewarding aspects of their job,” like actually connecting and talking to job seekers.
Indeed also announced new AI tools in April. The company says its Smart Sourcing tool recommends candidate profiles based on an employer’s needs. It uses AI to read the résumés of “active” profiles (those who have searched for jobs in the last 30 days or updated their profiles) and summarize why a person might be a good fit, and Indeed says the tool will also note when a gap could be overlooked, such as a person having four years of experience when a job description asks for five. As with LinkedIn, employers can send AI-generated messages to candidates. Both companies built their generative features on OpenAI tools, as well as their own internal data or models.
But the changes may not fix everything recruiters are dealing with. The recruiter from the health tech company says their company rarely posts jobs to LinkedIn; the job platform’s Easy Apply feature sends too many unqualified applicants their way. And with so many people out of work and applying, they’re relying more on inbound candidates, and have little need to source for more options.
Another recruiter, at a second health tech company, who requested anonymity because they are not authorized to speak to the press, says the inbound candidates on LinkedIn often aren’t good matches or high-quality candidates, but the site remains their “bread and butter” for sourcing. They pulled up the site’s generative messaging tool and had it draft potential outreach to candidates. The recruiter told WIRED it wasn’t a bad first attempt, but they would still add details to personalize it further.
While LinkedIn’s tool shows candidates in an order that can feel preferential, Rigano says the company has programmed it to be representative of people in the category: it shows men and women in proportion to their presence in the industry, and it highlights other relevant skills a person could search that might surface candidates who would slip through the cracks of an initial job search. Indeed’s head of responsible AI, Trey Causey, tells WIRED the company has engineers, scientists, and researchers who evaluate the system’s fairness, and that it takes feedback from people about ways to improve its generative AI systems. “However, no system can ever be completely unbiased, as there isn’t a single definition of bias and definitions often conflict,” Causey says.
These tools may favor more active profiles on the sites, which makes sense for recruiters hoping to reach people who are actually checking their messages. But that could also exclude people who have been less active, or who have stepped away from the workforce for reasons like illness or caregiving.
Bias is a top concern in automated hiring. HR tools have been found to make rash, negative judgments about applicants with Black-sounding names, to prefer men, and to skip over candidates who don’t check every box or who have employment gaps on their résumés.
Sim Bhatia, people operations manager at Reality Defender, a company that specializes in deepfake detection, says she doesn’t use any AI tools to evaluate candidates during hiring; at this stage, she says, the risks outweigh the benefits. Bhatia can filter applicants by location, such as New York, where the company is based, without sophisticated generative tools, and she worries about data security if the emerging technology were brought into a company’s systems and affected either potential candidates or existing employees.
Bhatia, who handles applicant reviews, spends about 10 hours a week going through résumés and conducting phone screenings. She’s among the many recruiters who haven’t ruled out using the technology in the future. “I’m excited to see it evolve, as with any technology that’s arisen,” she says, though she adds that AI isn’t quite ready yet.
Leanne Getz, vice president of delivery channels at the IT staffing firm Experis, says that as more generative tools are built into the systems recruiters use, there’s still a steep learning curve in figuring out where AI works and where it doesn’t. She’s confident AI will add real value to recruiting, but she doesn’t believe it can fully automate hiring. “We’re a people organization. The AI can’t replace what our recruiters can do day to day,” she says.
Some people with hiring power have become more cautious about career-focused social platforms. Krysten Copeland, founder of the PR firm KC & Co Communications, doesn’t plan to post new openings on LinkedIn. A public relations manager role she advertised in the fall drew 600 applicants. Some were high-quality candidates, but others were peculiar; she suspected one of lying about their work history, a suspicion confirmed after she consulted colleagues familiar with that workplace. That candidate was among LinkedIn’s top recommendations. Copeland acknowledges that LinkedIn can’t validate every claim, and that the site does offer a profile verification feature and a web of professional connections that tends to foster trust among recruiters and hiring managers.
Ultimately, there’s a disconnect in the online job hunt, Copeland says: “Everyone has a job they’re offering. Everyone is looking for a job. No one is getting it.”