How to Interview Software Engineers

Why this article? "The engineer interviewing process is broken." One reason is that the most commonly used assessments are far less predictive than the best available ones. This post discusses common assessments and their better alternatives.

When interviewing someone, you want to:

  • predict if they would be a fit,
  • predict if they meet your hiring bar, and
  • give them a positive experience.

Giving them a positive experience is important even if you decide against offering them a position. However, this article focuses on how to predict if they meet your hiring bar.

Broadly speaking, you want to assess these:[2]

  • soft skills:
    • communication
    • leadership
    • conscientiousness
    • other skills needed for the position
  • technical skills

Common, less predictive assessments

You probably use one of these less predictive assessments:

  • whiteboard challenges that are less predictive than they could be. See "How to conduct predictive "whiteboard" coding challenges".
  • unstructured behavioral interviews. "Structured interview" is a technical term. Your behavioral interviews probably have structure, but they are only about half as predictive because they are not "structured" in the technical sense.[TODO] Not only are unstructured interviews about half as predictive, they are also heavily influenced by interviewers' social and technical biases.

How to conduct predictive "whiteboard" coding challenges

"Whiteboard" is in quotes because you want to mimic the real-world:

  • provide the option of using a laptop instead of just a whiteboard. Unfortunately, this might not be up to you, because all candidates must be offered the same options.
  • frame the challenge in real-world terms (see the short sketch after this list).
    • This isn't fluff to make the challenge fun or interesting (although it may do that); it's about assessing the skill of translating real-world problems into coding problems.
    • You might use a scenario from your actual work, but a made-up scenario usually lets you assess more skills in less time.
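For illustration, here is a minimal sketch of how the same underlying exercise might be framed in real-world terms. The active-user scenario and function name are invented for this sketch, not taken from the article:

```python
# Hypothetical framing. Instead of asking "return the intersection of two arrays",
# the same exercise can be posed as: "given the user IDs active yesterday and the
# user IDs active today, return the IDs of users active on both days."
def users_active_both_days(yesterday_ids, today_ids):
    """Return the set of user IDs that appear in both lists."""
    return set(yesterday_ids) & set(today_ids)

# Example: users_active_both_days([1, 2, 3], [2, 3, 4]) == {2, 3}
```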

Avoid these common mistakes:

  • assessing familiarity with irrelevant algorithms instead of relevant ones, on the assumption that this is the best proxy for relevant skills. Good coding challenges assess relevant skills. See "Which algorithms to use in coding challenges".
  • using a personal favorite question, even when it's an ineffective assessment.
  • using faux IQ tests like brainteasers, puzzles, and so on.
  • wasting time assessing trivial knowledge instead of more relevant skills.[1]
  • wasting time writing too many lines of code (unless you're assessing the skill of tedious coding under pressure).
    • Some problems shouldn't be used at all for this reason.
    • You can instruct the candidate to provide only the function signatures of uninteresting subroutines (see the sketch after this list).
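To make the stubbing idea concrete, here is a rough sketch: the candidate writes the interesting logic in full and, by agreement with the interviewer, leaves routine helpers as signatures. The de-duplication scenario and helper names are invented for this sketch, and the stubs are intentionally unimplemented:

```python
# Hypothetical illustration: the interesting logic is written out, while routine
# helpers are left as agreed-upon stubs (signature and docstring only).
def dedupe_signups(signups):
    """Return sign-ups with duplicate emails removed, keeping the earliest one."""
    seen = set()
    kept = []
    for signup in sorted(signups, key=signup_timestamp):
        email = normalize_email(signup["email"])
        if email not in seen:
            seen.add(email)
            kept.append(signup)
    return kept

def normalize_email(email):
    """Stub: lowercase and trim the address (signature only, by agreement)."""
    ...

def signup_timestamp(signup):
    """Stub: parse the sign-up's timestamp for sorting (signature only, by agreement)."""
    ...
```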

Which algorithms to use in coding challenges

tl;dr

Ultimately, coding challenges should assess not familiarity with algorithms but the ability to do well on the position's real-world coding problems.

Which algorithms to use in coding challenges:

  • only those required by the position. An exception might be when the position requires uncommon algorithms, such as graph algorithms, but you decide instead to assess general algorithmic ability so you don't screen out engineers who could learn them on the job.

Details

For many positions, this means that the only algorithm skill you'll assess is the skill of using hashes to avoid iterating through arrays. For many positions, even this fundamental algorithm skill is useless.
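As a minimal sketch of that skill (the unshipped-orders scenario is invented for illustration), compare a repeated linear scan against a single hash-based lookup:

```python
# Hypothetical scenario: find orders that have not shipped yet.

def unshipped_orders_scan(order_ids, shipped_ids):
    # `oid not in shipped_ids` re-scans the list for every order:
    # roughly O(len(order_ids) * len(shipped_ids)).
    return [oid for oid in order_ids if oid not in shipped_ids]

def unshipped_orders_hashed(order_ids, shipped_ids):
    # Build a hash set once; each membership check is then O(1) on average.
    shipped = set(shipped_ids)
    return [oid for oid in order_ids if oid not in shipped]

# Example: unshipped_orders_hashed([1, 2, 3, 4], [2, 4]) == [1, 3]
```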

A common anti-pattern is conflating familiarity with algorithm X with the ability to do well in the position:

  • This is especially common with interviewers who are unaware that many candidates practice interview questions for hundreds or even thousands of hours.
    • These interviewers think, "Oh, wow! This candidate solved this problem very quickly; they must be able to do well in the position!" without realizing that the candidate solved it quickly not because of general ability but because of familiarity with the problem or with similar problems. I've heard multiple interviewers voice this conflation, and you may have been influenced by it too.
  • Not only are good candidates screened out for lacking an irrelevant skill; the time spent assessing familiarity with algorithm X would also be better spent assessing other skills.
  • Just because someone on the team needs to be familiar with algorithm X doesn't mean everyone needs to be familiar with algorithm X.
  • Just because everyone needs to be familiar with algorithm X doesn't mean candidates must be familiar with algorithm X before being hired.

Even if an algorithm is required by the position, you might want to assess the ability to learn the algorithm rather than current familiarity with the algorithm:

  • For example: if the position requires familiarity with graph algorithms, the brute-force approach to interviewing is to assess familiarity with graph algorithms. But because most engineers do not use graph algorithms, this would screen out many candidates who could learn them on the job. Instead, you might want to assess general algorithmic ability.

Footnotes
