'Give preference to candidates using AI' - recruitment expert Bill Boorman

July 2025

Bill Boorman, an adviser to talent technology companies with more than 44 years in the recruitment industry, talks to Prospects Luminate about why AI is a must-have skill for jobseekers and why employers and universities need to catch up fast.

We know that entry-level roles are vulnerable to automation. These roles are often a key route into the workforce for young people. Do you think the way we get young people into the workplace will change in the future?

We're in a bit of a figuring-it-out phase right now. Traditionally, a lot of graduates would go into big firms like KPMG, EY, Accenture, the banks, especially if they weren't sure what they wanted to do. These companies would take on large intakes and put people through three-to-five-year programmes where they'd do a lot of heavy lifting - audits, number crunching, reports.

Now, AI is doing a lot of that. It's faster, better, and you can get it done in a few days instead of months. You don't need to send people around the country and put them in hotels to do that work. So the work that used to train people is disappearing. That creates two big challenges. First, do we still need the same number of people? And second, how do we make sure we're still developing future leaders who'll be ready in five years to take on senior roles?

Some firms, like KPMG, are adapting. They're giving new hires more complex work from day one, work they might previously have done in year three. That's great for engagement and learning, but it also means fewer people are needed overall.

The nature of work is changing too. It's less about what you do and more about what you know. You're not crunching numbers anymore, you're managing the AI that does. You need to understand the output, prompt it correctly, and check for errors. So even though the job titles might look the same, the actual work is very different from even a year ago. That's why I keep saying - the first thing you've got to do is go and look at your workflows and rethink every single job you hire for. Ask 'how does AI impact this role now, and how will it in the near future?'.

You've got to forecast. Because the job you're hiring for today is going to look quite different in terms of responsibilities and capabilities even a year or two down the line. 

Employers need to rethink every job they hire for: how AI impacts it now and how it will in the future. And academia needs to catch up too. It's still wrestling with how to handle AI, what's acceptable and what's not. So we're not really preparing young people properly yet. But we need to, because the way we bring them into the workplace is already changing.

Do you think academia is keeping pace with AI and what would you like to see universities doing to help prepare students for the evolving job market?

Academia's really struggling to keep up with AI. It's still wrestling with what's acceptable, what students can and can't use it for. We're not preparing students properly because universities are still arguing about whether AI should even be used, rather than figuring out how to teach people to use it well. It's the same debate we've had before when calculators came in, or when Google replaced the need to go to a library to find answers.

But AI is different. It doesn't just help you find information - it can do the work for you, even interpret it. That changes everything. The skill now isn't just research, it's verifying what's true, what's reliable. And academia hasn't caught up with that.

Curriculums were designed before AI was widely available. Changing them takes three to five years, and by the time a course is written, approved, marketed, and taught, it could be nearly a decade out of date. I saw this first-hand when I was asked to help write a degree on social media and recruitment. By the time it would've launched, the platforms and practices would've changed completely. In the technology world, anything older than a year becomes history.

Meanwhile, students are already using tools like ChatGPT to apply for jobs faster than employers can adapt their hiring processes. But universities aren't teaching them how to use these tools effectively, how to prompt, how to verify, how to work alongside AI. And that's what the jobs will involve.

The bigger disruption coming is robotics. That's going to change the workforce even more than AI, because robotics is going to take away a lot of entry-level jobs - roles that people used to have in factories, doing tasks that went all the way up to quite a technically skilled level. Take something like being a car mechanic. Ten years ago, if you were training for that, you'd spend three years learning how to physically fix cars, taking engines apart, replacing parts, understanding how everything worked mechanically. You'd get your hands dirty, and that's how you learned the trade.

Now, with electric cars, it's completely different. You don't need to know how to strip down an engine. You plug a computer into the dashboard, it runs a diagnostic, and it tells you what's wrong. Then you either reboot the system or replace a component.

And that's just one example. If you apply that shift across every industry, where diagnostics, automation, and robotics are replacing the traditional hands-on work, then what does the workforce of the future actually look like? We're still telling universities to prepare people for work, but we don't even know what those jobs are going to be. So we've got to start speculating, start preparing for what we think the future might look like, because the old model just doesn't fit anymore.

With so many students now using AI in their job applications, do you think that should be encouraged, and how should universities and employers be responding to that shift?

Students are using AI in their job applications, and that's not going to slow down. So the key now is teaching them how to do it well, because these are exactly the skills they'll need in the workplace.

Roughly 76% of applicants are now using AI in their application process for entry-level roles. For some, that's just writing a cover letter or tweaking their CV. For others, it's finding the job, applying for it or doing the whole thing with AI. And that's only increasing.

But universities aren't preparing students for that, and employability teams are saying 'Don't do that', like it's cheating. At the same time, employers are saying, 'People are cheating with AI'. And I'm saying, hang on - don't you want to hire someone who knows how to use AI efficiently? Someone who knows when to override the machine and when to let it do the work? Someone who can produce the best output?

That's why I said we should give preference to candidates who are using AI in the application process. We should be asking them 'How did you use it? What tools did you use? What did you do?'.

If someone says, 'I didn't use it at all,' they're either lying, or they've been told they'll be penalised for it. And that's a problem. Because they're avoiding the best freely available technology to do their job more efficiently and present themselves better.

What I want is someone who says, 'Yeah, I used this tool, I checked it, I edited it, and here's how I did it'. Because that's exactly what I'll be asking them to do in the job - write prompts, supervise outputs, check for errors, revise the work. If they've done that in their application, I'm thinking, great, you're already working the way I need you to work.

So if I were advising universities, and I've told them this, I'd say make this a mandatory part of employability from year one. Not just writing a LinkedIn profile or a CV. Teach students how to use AI in their job search and in work. How to communicate that use, how to be transparent, how to do it ethically and sensibly. But also teach them how to do it well - what tools to use, what the risks are, when not to use it. That should be a mandatory unit for every student in the country.

And we should be starting even earlier in schools, with kids who are 14, 15, 16. The challenge is, they often know the tech better than the people teaching them. And that's part of the problem.

But if we do that, how do we shift that mindset some employers have that it's cheating?

That one's pretty easy for me. This is the rule - unless you explicitly tell someone they can't do something, it's not cheating. That applies across the board. If I tell you, 'You cannot use ChatGPT to write your CV' and you do it anyway then fine, that's breaking a rule. But if there's no rule, then it's not cheating.

So when someone goes to apply for a job, there should be a clear, explicit policy on the use of AI in the application process. It should be right there on the careers site saying 'This is our acceptable use policy'. That's how we start shifting the mindset.

And we need to get to employers quickly. Everyone involved in entry-level careers has a responsibility here. We need to be saying 'This is the world now, these are the jobs you want people to do, and these are the skills you're going to need'. If we don't do that, we're the ones cheating young people.

We can't wait a year to write a research paper about it. We need to act now. That means employers need to rethink their jobs, look at workflows, rethink what entry-level roles actually are, and what skills they're hiring for.

For a long time, it was simple. You hired for a fixed skill set, based on a fixed job profile. You did a degree, you learned certain things, and that matched a job. But that model doesn't work anymore. I remember a Stack Overflow survey from years ago where first-year developers said 80% of what they learned at university was already redundant.

We've got to recognise how fast these environments change. And the truth is, we're still operating in analogue. Even some of the biggest, most progressive employers I speak to are not thinking digitally yet. But we need to go from analogue to digital to AI. And we're not even fully digital.

It's a big shift, yeah. But it's exciting. And if we don't get entry-level right, everything else falls apart. This should be the model we build from. Entry-level should be the part of the market where we accept that change happens fastest and where we move fastest to keep up. We'll never be fully up to date, but we've got to get closer to the curve.

In your speech at the ISE Student Recruitment Conference in June 2025, you mentioned that 98% of people said they wanted to be assessed by AI at that moment when applying for a job. That really surprised me. Could you share some of the reasons why people preferred AI in that situation?

Yeah, so to give some context, that 98% figure came from a real-world test, not a hypothetical survey. I sit on the board of a company called Vonq and it was part of a proof-of-concept project with Adecco that ran for a decade, and it's been replicated across different countries and sectors. You give people two big buttons on a site. One says, 'Book an appointment with a human for an assessment', even if that's just booking in to speak to someone later. The other says, 'Do an AI assessment now'. And 98% of people choose the AI.

Now, if you ask people in a survey, 'Would you rather be assessed by a human or a machine?' most will say human. But in context, when you're applying for a job and you just want to know if you're eligible or not, people overwhelmingly choose the AI. Why? Because most job applications are determined by black-and-white disqualifiers - do you have a driving licence, do you live in the right area, do you meet the minimum qualifications? People don't want to spend hours applying only to be rejected for something that could've been flagged up front. They want to know early, so they can decide whether to invest their time.

And AI gives you transparency. If you're rejected, it tells you exactly why. There's no mystery, no bias, no 'gut feeling' from a recruiter. Everyone is treated the same. That's what candidates want - fairness, consistency, and a timely process. They're not asking for special treatment, just a fair shot. Especially for younger people, applying for jobs is a painful, time-consuming process. If you give them the option to get screened immediately and move forward, they'll take it. 
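The kind of screening described here is essentially a short list of explicit pass/fail rules, with every failed rule reported back to the candidate. A minimal sketch in Python of that idea (the rules, field names, and messages below are invented for illustration, not any vendor's real criteria):

```python
# Rule-based eligibility screening with transparent rejection reasons.
# Each rule is a black-and-white disqualifier; a rejected candidate is
# told exactly which rules they failed. All criteria here are hypothetical.

def screen(candidate: dict) -> tuple[bool, list[str]]:
    """Return (eligible, reasons_for_rejection)."""
    rules = [
        (lambda c: c.get("has_driving_licence"),
         "A full driving licence is required for this role."),
        (lambda c: c.get("postcode", "").startswith("M"),
         "Applicants must live in the Manchester area."),
        (lambda c: c.get("qualification_level", 0) >= 4,
         "A minimum level 4 qualification is required."),
    ]
    reasons = [msg for check, msg in rules if not check(candidate)]
    return (not reasons, reasons)

eligible, reasons = screen({
    "has_driving_licence": True,
    "postcode": "M1 4BT",
    "qualification_level": 3,
})
print(eligible)  # False
print(reasons)   # only the qualification rule failed, and the message says why
```

Because every rejection maps to a named rule, the candidate gets the immediate, specific feedback the interview describes, rather than a silent rejection weeks later.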

Interestingly, when we tested the use of chatbots in the application process years ago, we assumed younger people would prefer them. But it turned out older, more experienced candidates were even more likely to choose the chatbot because their questions were more specific and they wanted quick, black-and-white answers.

So really, the surprising thing isn't that 98% chose AI, it's that 2% didn't. That's what we should be asking: why didn't those people want speed, clarity, and fairness?
