3 Hiring Lessons Learned from 2,600 Engineering Candidates

15 Mar 2022 3:00am

[Image: Michael Stahnke presenting "What we learned about hiring engineers after 6,000 technical interviews in one year"]

When CircleCI was looking to double its engineering team over the last year, it seemed like a good time to run some experiments. After all, there was room for improvement. The whole team already felt an uneven strain of pressure to interview candidates on top of regular workloads. And there was a lot of inconsistency.

The company, maker of a continuous integration and continuous delivery (CI/CD) platform, was looking to scale from 120 to 250 engineers. The whopping 6,000 technical interviews run from February 2021 to January 2022 made for a solid sample size for experimenting with ways to improve the usually overwhelming tech interview process.

Above all else, said Michael Stahnke, CircleCI’s vice president of platform, at his Wednesday talk at Eficode’s DevOps Conference, the company was looking to answer, “How do we do this correctly and how do we make this consistent across the board?”

What Makes Tech Interviews Challenging?

The tech interview is an obvious area for continuous improvement. There’s this awkward principal-student vibe where the candidate must be on their best behavior while under extreme pressure. And then there’s the standing in front of class to code hypothetical situations on a whiteboard — solving puzzles that often have nothing to do with the role.

In fact, a 2020 study from North Carolina State University found that job interviews in the tech industry actually “assess anxiety, not software skill.” So how can we flip that?

Stahnke’s team started by asking interviewers to reflect:

  • Are you getting the signal that this is the person capable of doing the work? Of collaborating with those involved?
  • Are you being fair and consistent throughout the whole thing?
  • How are you addressing bias? “Bias is consistent,” Stahnke said. “I’d like to say we remove it, but that’s impossible, so I’d like us to understand it as much as possible.”
  • Are you treating each candidate with compassion and empathy?
  • Are you any good at interviewing?
  • Are we overusing the same good interviewers, instead of training more?
  • Are interview processes going on too long, with scheduling conflicts?

In their reflection and rework of their interviewing process, the engineering hiring managers focused on how to make tech interviews more like a partnership between employer and potential employee. “We want to make this as comfortable and problem free as possible for our candidates,” Stahnke said.

The baseline interview process they were working with includes a welcome letter attached to a take-home test. It includes a reminder that this part of the interview process is totally self-paced. The hiring managers emphasize they aren’t looking for “gotcha moments” — they want people to bring their best. And they acknowledge bias and outline their efforts to counter it.

In the artificial interview environment, Stahnke said, they wanted to find a way to answer: “How do you work together to solve problems and do that in a job-like situation?”

So they settled on this original hypothesis: We can improve our technical interviewing to better detect candidate fit and to ensure consistency across engineering. CircleCI’s hiring managers then identified constraints to work through and tested modifications to their technical interviewing process.

Most importantly, the team reviewed its experiments quarterly.

Experiment No. 1: Shorten the Interview Process

One problem from the start was, like most tech interview processes, CircleCI’s was just too long. This meant several interviewers and reviewers were being used too frequently, “which can be a problem when they burn out and you haven’t ramped up somebody else,” Stahnke said.

The CircleCI process lasted about three and a half weeks, which was enough time for many engineering candidates to receive offers from other companies. “The shorter your interview process, the more likely you will continue that loop to the end,” he said.

The team moved to scheduling interviews in blocks, condensing into three steps — HR screening, technical validation and team interview — instead of seven.

Another thing that held the process up was that the hiring managers offered seven different coding language options for the technical take-homes, and they didn’t always have enough teammates to review the tests swiftly. Sometimes they were also hiring for a skill that was new to the team, which meant figuring out how to evaluate candidates for it.

The engineering division also recognized that being more consistent across different hiring managers would not only be fairer but time-saving. The hiring managers created interview scorecards to aim for more consistent results.
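A shared scorecard can be modeled as a simple structured rubric that forces every interviewer to rate the same criteria on the same scale. The criteria and the 1–4 scale below are hypothetical illustrations, not CircleCI’s actual rubric:

```python
from dataclasses import dataclass, field

# Hypothetical rubric: these criteria and the 1-4 scale are
# illustrative, not CircleCI's actual scorecard.
CRITERIA = ["problem_solving", "code_quality", "collaboration", "communication"]

@dataclass
class Scorecard:
    candidate: str
    interviewer: str
    scores: dict = field(default_factory=dict)  # criterion -> 1..4

    def record(self, criterion: str, score: int) -> None:
        # Reject criteria outside the shared rubric so every
        # hiring manager evaluates the same dimensions.
        if criterion not in CRITERIA:
            raise ValueError(f"unknown criterion: {criterion}")
        if not 1 <= score <= 4:
            raise ValueError("scores use a fixed 1-4 scale")
        self.scores[criterion] = score

    def complete(self) -> bool:
        # Every criterion must be rated before the scorecard counts,
        # keeping results comparable across interviewers.
        return all(c in self.scores for c in CRITERIA)

card = Scorecard(candidate="A. Candidate", interviewer="M. Manager")
for criterion in CRITERIA:
    card.record(criterion, 3)
print(card.complete())  # True once all criteria are rated
```

The fixed criteria list and bounded scale are what make results comparable across different hiring managers: free-form notes can’t be aggregated, but identical rubrics can.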

The first experiment definitely saw an improvement in cutting down the arduous technical interview workflow, but some issues remained. The hiring managers still struggled to find the right reviewers for take-home exercises, which meant some folks remained in heavy rotation.

They also uncovered that the take-homes weren’t exactly consistent. There was a bug in the Python version that wasn’t in the Java version, which meant Python folks had an edge if they caught it. Some, but not all, reviewers were grading things on a curve — like squashing commits into clean, discrete sets of changes.

Inconsistency led to some engineering candidates being unnecessarily scrutinized. “I don’t expect everyone to put forth their best work in this stressful situation,” Stahnke said.

The team still hadn’t come up with answers to persisting questions:

  • How to hire for new areas of expertise, including security and data science?
  • How to make sure the interview panels were as diverse as the team itself?
  • How to further reduce bias, beyond the referral bias they had already eliminated?

Experiment No. 2: Outsource Candidate Screening

In order to fill those gaps, CircleCI’s hiring managers decided to move the technical screening to a third-party specialist. This aimed to reduce the scheduling load on the company’s existing engineers.

CircleCI chose the interview specialist Karat because its process was consistent, covered a wider range of skills and languages, and let candidates redo the interview if they felt they hadn’t done their best work.

Plus, each interview was recorded, with annotations, so CircleCI managers could review for auditing or training purposes, or just to check out a candidate. Stahnke said this last functionality was really for him, because he could watch really experienced interviewers in action.

Still, CircleCI’s hiring managers had some concerns from the start. They did not like the candidate moving from an all-CircleCI process to a third party’s, and then back again. And Karat wouldn’t reveal the diversity of the interviewing pool. The CircleCI team also expected to have more input on the interview questions.

Finally, Stahnke said, they felt disconnected from the process earlier on, not being able to sell the company throughout the whole loop. “‘I really wish I could have worked at CircleCI’ is what I want people to leave with, no matter what the outcome,” he said. “They tell friends and maybe they themselves will want to apply for a more appropriate role” in the future.

In the end, CircleCI outsourced 1,081 technical interviews to Karat. The experiment cut the process down to about seven to 10 days, which seems to be the limit before losing engineering candidates.

The outcome of this second experiment was much more candidate-influenced: throughout the process, regardless of whether candidates were hired, CircleCI sent them feedback surveys. Candidates definitely moved faster through the process, and they loved the flexibility to be interviewed when they wanted, even on nights and weekends. They also valued the transparency of the flow and the opportunity to redo the interview within 24 hours.

But most engineering candidates did not like moving from CircleCI’s to a third party’s process — it made them feel like the hiring company didn’t care enough. “Some candidates would drop out from the start and put us on blast on Twitter,” Stahnke said.

CircleCI decided to drop the external partnership because they found the engineering team was still watching the videos, which, despite the annotations, took almost as much time as if they were conducting the interviews themselves.

The company’s hiring managers also felt restricted in their ability to modify questions. For example, they were hiring for a new hybrid site reliability engineering developer role, and the outsourced technical interview process came off too dev-focused, with not enough operations or infrastructure questions.

Experiment No. 3: Use Pair Programming

Finally, the CircleCI team decided to move from technical screenings to pair programming. This more easily simulates working on a team and shows how engineering candidates can — or cannot — get work done with others.

It’s still an ongoing transition, but it already involves fewer algorithms and more “this is the job” type questions. The new process also intentionally avoids classic whiteboard questions.

The interview process has been moved all in-house again. “More control means we can sell CircleCI through the process,” Stahnke said.

Measuring Success

While some of the company’s engineers are still relied on too heavily for technical evaluations, CircleCI looks at the changes as an overall success. This was measured by the engineering team’s output, answering: Did more engineers equate to more output? Yes.

[Image: Michael Stahnke talking in front of a screen of graphs showing an average of 15 hires per month over the last year, hundreds of engineering candidates per month, about 20 hiring offers per month and, over the full year, consistent weekly deployments per engineer]

The engineering team looks at average throughput as a measure of success. At CircleCI, that’s an average of 3.5 deployments per engineer per week, with a deployment equaling a completed amount of work. Couldn’t people game that result by breaking work into smaller chunks? That would be great, Stahnke said, because he’s always trying to get engineers to break up their work.
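The metric itself is straightforward arithmetic: total deployments divided by team size. The per-engineer counts below are made up for illustration, not CircleCI’s real data:

```python
# Hypothetical weekly deployment counts per engineer; the numbers
# are invented to illustrate the metric, not CircleCI's real data.
weekly_deployments = {
    "engineer_a": 4,
    "engineer_b": 3,
    "engineer_c": 5,
    "engineer_d": 2,
}

def deployments_per_engineer_per_week(counts: dict) -> float:
    """Average throughput: total deployments divided by team size."""
    return sum(counts.values()) / len(counts)

print(deployments_per_engineer_per_week(weekly_deployments))  # 3.5
```

Because the numerator counts completed units of work, an engineer who splits one large change into several small deployments raises the metric — which is exactly the behavior Stahnke says he wants to encourage.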

Despite having more than doubled the company’s engineering team, he said, “Our aggregate throughput is still the same — so that means we have about twice as much work for twice as many people.”

Over the course of a year, the CircleCI interview process included:

  • More than 2,600 engineering candidates.
  • More than 6,000 interviews.
  • 114 engineers hired.
  • More than 5,800 hours of interviews conducted by CircleCI’s engineers.
  • 1,081 third-party technical assessment interviews.
  • An average of 22.2 hours per hire spent on interviews. (A great candidate’s process may only take six person-hours.)

Most importantly, candidates told CircleCI in the end-of-process feedback forms that the overall interview experience has gotten better.

“Whether it works or whether it doesn’t, I just want you to leave with a positive experience,” Stahnke said. “Interviews are not combat, so let’s just make sure we treat people with respect and empathy.”

Check out more about hiring and retaining engineers in this episode of The New Stack Makers podcast.