With AI growing rapidly and becoming an everyday part of our lives, especially at work, our team members always relish the chance to have meaningful discussions about it with industry leaders.
Recently, our chief product officer, Jason Fournier, joined Mike Wolford, director of analytics at Claro Analytics, and Adam Kovacs, CEO and founder of The Talent Foundation, in a lively webcast about the intersection of AI and talent acquisition (TA).
Hosted by ERE Media, “The AI Revolution in Talent Acquisition” covered a range of pressing issues for TA professionals — including what it means for TA jobs, how hiring teams can become more efficient with AI, and tactics for using AI to recruit.
In this post, we cover five takeaways from the webcast. Readers are also encouraged to watch the entire webcast here:
As our recent survey of TA professionals revealed, there’s growing concern that AI will replace recruiter jobs. The webcast panelists suggested that, while the long-term future of artificial general intelligence (AGI) remains to be seen, current AI can augment the role of recruiters and enable them to focus on tasks that human recruiters are simply better at.
In particular, AI can offload many administrative duties, including paperwork and cold emailing, that most recruiters probably don’t want to do anyway.
“For those of us who lean in and embrace (AI), we get to spend more time prepping our candidates, one on one,” Mike Wolford said. “We get to spend our time informing our hiring managers. We get to spend our time at strategy meetings. … Because that's what recruiting should actually be doing instead of sending 500 emails. We're just doing that because that's what it takes in order to get enough candidates into the pipeline right now.”
In talent acquisition, as in many sectors, the panelists agreed that using tools such as ChatGPT can almost immediately increase efficiency and productivity. In fact, a recent MIT study found that workers were 37% more efficient when using ChatGPT.
It’s important to note that no AI should be making any final decisions about a candidate. Hiring platforms that use AI, like Filtered, are meant to help you make informed decisions in much less time. More on this later in the post.
Without humans, it’s not fair hiring. So you do the hiring, while AI provides the insights to help select the right-fit candidate.
The term “prompt engineering” has rapidly emerged as an important skill for today’s technical worker, as developers engage with generative AI tools to write and debug code, and to help identify technical solutions to different challenges.
To get the right AI output, users must experiment with writing “prompts” — including multiline prompts — to fine-tune the AI’s response.
But prompt engineering isn’t just for engineers. Understanding how AI is likely to respond to the context you provide, and, to a lesser degree, knowing how large language models (LLMs) work, is key to unlocking AI as a tool for many professionals, including recruiters.
“Prompt generation is important,” Jason Fournier said. “Evaluations of (AI) responses is another important skill. … You can think of these models as great at creating a first draft and you become the editor to re-draft or to re-prompt and improve the result that you’re getting.”
So the next time you’re using ChatGPT to help you draft a candidate outreach email or write a job description, be sure to tell it, “You’re a recruiter,” first. Here are other prompts for recruiters to make recruitment life a little easier.
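For recruiters (or the ops teammates supporting them) who want to go beyond the chat window, the same role-prompting idea carries over to calling a model programmatically. Below is a minimal sketch using the OpenAI Python SDK; the model name and prompt wording are illustrative assumptions, not something covered in the webcast.

```python
# Minimal sketch of role-based prompting with the OpenAI Python SDK.
# The model name and prompt text are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any chat-capable model works here
    messages=[
        # The system message carries the "You're a recruiter" framing.
        {"role": "system", "content": "You are an experienced technical recruiter."},
        {
            "role": "user",
            "content": (
                "Draft a short, friendly outreach email to a senior data "
                "engineer about a remote role at a fintech startup."
            ),
        },
    ],
)

print(response.choices[0].message.content)
```

Setting the role in the system message, rather than burying it in the request itself, tends to keep the tone consistent across every draft the model produces.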
The panelists offered several tangible suggestions for how talent acquisition professionals can apply generative AI tools today. Among the most beneficial ways are:
One of the biggest problems with today’s AI tools, the panelists agreed, is the lack of transparency around their training data.
Given that most DEI programs center on removing implicit and unconscious bias, the application of AI in DEI programs is still limited. Machine learning models have a sordid history of perpetuating demographic bias in hiring.
However, human bias is one reason why more TA leaders are moving toward skills-based hiring processes. Machine learning and ethical AI can be applied in platforms such as Filtered purely to identify and assess skills, without being influenced by bias.
And, as Jason Fournier noted, tools like ChatGPT can be used to proactively search for cultural sensitivities. For example, you could prompt it to review a job description or an HR website for language that might be insensitive. “It can help cover blind spots,” he said.
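As a sketch of what that kind of blind-spot check might look like in practice, here is one hypothetical way to wrap it in a small helper. The prompt wording and function below are assumptions for illustration, not something the panelists shared.

```python
# Hypothetical sketch: asking a model to flag potentially insensitive
# language in a job description. Prompt wording is an assumption.
from openai import OpenAI

client = OpenAI()

def review_job_description(jd_text: str) -> str:
    """Return the model's notes on potentially exclusionary language."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "You are a DEI-aware hiring consultant."},
            {
                "role": "user",
                "content": (
                    "Review the following job description and list any wording "
                    "that could be exclusionary, gender-coded, or culturally "
                    "insensitive, with a suggested rewrite for each:\n\n" + jd_text
                ),
            },
        ],
    )
    return response.choices[0].message.content

print(review_job_description(open("job_description.txt").read()))
```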
In the meantime, the panelists suggested, industry and regulators need to ensure that the builders of AI tools offer more transparency.
That means putting the onus back on the vendor to provide information on how they ensure training data is both multicultural and multilingual. “While training data comes from all over the world, it’s mostly in English,” Adam Kovacs pointed out.
We wrote recently about why some companies and their technical workers have been slow to adopt AI because of concerns over security.
Companies in which information security and privacy are particularly sensitive — for example, financial institutions — may prohibit employees from using generative AI tools because of the immaturity of those tools.
However, the panelists argued that generative AI is following the path of many new technologies. Eventually, even in sensitive sectors, AI will be more readily adopted as it matures.
“It's why we don't have Blockbuster anymore,” Mike Wolford said, “and why we don't buy our computers from IBM, and why we don't shop at Sears.”
As Jason Fournier put it, “These tools are going to generate too much productivity gain. And you won't be able to exist if you're not embracing them. Because your competitors will be.
“And you'll be left behind if you don't.”