
@JasonTrue
Last active May 9, 2024 07:59

How to succeed in hiring without really hazing

Tech job interviews suck. Make them suck less to attract better people.

You aren't a big fan of the typical tech interview process. Why not?

Mainly because success or failure in the interview is often poorly correlated with success in the actual role. Objective-sounding criteria are more capricious than they appear at first glance, can easily filter out excellent engineers, and are often a better indicator of who did the best interview prep than of anything else.

In many mature industries, job interviews are business conversations, a mutual discussion to discover the ways the employer's needs and the interviewee's experience and goals converge. When I've done software consulting, the conversations that lead to contracts are usually like that too. But many full-time and agency temp software job interviews are more like hazing rituals, alienating experiences that vaguely feel like alpha geek peacocking displays, built on an implicit assumption that a bunch of people are faking their way through the industry and must be exposed.

Curiosity, tenacity and persistence have been better predictors of success in most of the teams I've worked in or hired for than whiteboard coding exercises, credentials, or confidence. So I optimize my interview process to identify those attributes.

I also try to steer the conversation to the actual work we're doing and our business objectives, not a bunch of abstract problems. A good candidate will recognize overlap between things they've worked on in the past and talk about how that context could help them make sense of the work in your organization.

What questions should I ask?

I try to be as open-ended as possible. But I do want to get a sense of what makes a person tick.

Of course, the natural starting point is to just ask a candidate to tell me about themselves, which I hope will get the candidate to sell me a bit on what they are most proud of. I'll sometimes literally ask them to "sell me on you."

Mostly I want candidates to talk about work that they're proud of, especially when it wasn't a straight line to get it done. Talking about both the challenges and the easy bits, the missteps and the flashes of insight, the hallway conversations with coworkers, the research and technical spike tangents they might have taken, will give me a good sense of both their approach to problems and their experience. Just as importantly, it demonstrates communication skills. A person who can explain how something works almost certainly has the capacity to do it; unless you're a musician, it's usually harder to explain something well than it is to do it. I can ask probing questions to improve my own understanding of their solution, which has the added benefit of demonstrating their proficiency.

I might ask questions like these:

  • Walk me through the approach you took to solving a tough technical problem.
  • What's your biggest frustration about your favorite programming language or framework?
  • Tell me about a time that you had to build out a feature whose requirements were still ambiguous or incompletely specified.
  • Is there a skill you'd like to improve right now? How would you go about learning and developing it?

I've got experience in a lot of different programming languages and frameworks, so occasionally I'll ask a candidate if they've ever experienced a problem that I've encountered in the past with the tooling they're most comfortable in, and then I'll probe them about their approach to solving those issues. I'm not going to ask trick questions or anything, but in an average case we can commiserate a bit, and in a great situation I might learn something.

I also want to know how candidates deal with the everyday negotiations that are part of team software development. Tell me about a time you had to convince your colleagues to go a particular direction they were resistant to. What did you do to convince them? Did it work? What would you do differently if faced with the same situation?

You brought up the idea of the “everyday negotiations” of a software team. What traits are you trying to identify with those questions?

In the most effective teams I've worked in, people develop a theory of mind for their coworkers and try to explain things in terms of what each person thinks is important. It's pretty important to me to find people who are comfortable making a case for things in terms of consequences rather than preferences (even when arguing for a thing that is fundamentally a preference). It's also hard to have a real discussion with people who mostly appeal to external authorities, or who fall back on handwavy phrases like "best practices."

I need people to think in terms of the project as it actually exists and what its goals are, grounded in an understanding of the team's dynamics.

It’s not foolproof, but asking questions about how someone has tried to sell their team on an idea, or even asking them to sell me on a specific architectural or tooling change, can shine a light on their approach to collaboration.

But how will I know someone can code if they don't write code in the interview?

I've heard people tell stories about a mythical person who somehow sent all the right signals in an interview but couldn't write so much as a simple variable assignment or loop on the job, but I've never actually seen it. I often suspect people who think they've experienced those sorts of cases just worked on teams with terrible onboarding processes, ineffective collaboration, or pathologically dysfunctional project management, and just didn't do their part to make their new team members productive.

On the other hand, I've seen plenty of people who have built successful careers struggle to write simple code on a whiteboard, because they're faced with a problem sufficiently removed from their day-to-day routine that they can't get into the right headspace for it under the pressure of an interview.

Perfectionists can get flustered when they notice a small mistake or a suboptimal solution. Introverts may find it hard to explain their solution to the interviewer while they're still trying to solve it. People with even a tiny bit of anxiety can have it overwhelm them in an interview context, because they know they're being evaluated, not just having a friendly chat with trusted peers. I've had essentially the same coding question go well at one company and fall flat at another, just because I was thrown off my game: asked to answer in a language I hadn't worked in recently, say, or redirected into explaining a language or runtime feature my interviewer wasn't familiar with instead of working the problem.

So whiteboard exercises are more capricious than they sound.

Anyone I've interviewed who could describe their previous work in detail (at least the bits that aren't so confidential they can't be shared), could answer follow up questions, and demonstrated curiosity and tenacity has, with sufficient onboarding and direction, been able to do the work of software development.

How can I objectively compare candidates to each other?

You probably can't. For me that's not even a goal. I'm trying to figure out what each person I talk to could contribute to the team. It's often quite a pleasure for me when an interview moves into a direction that I didn't expect, as I'll sometimes discover that someone has an entire range of skills and experience I didn't realize would help me until we started exploring.

This is where the magic is: finding out what the candidate is actually good at, and getting a sense of their metacognitive awareness. You can only do that effectively if you try to meet candidates where they are, instead of trying to measure everyone with a universal yardstick of skills.

Every person has gaps in their knowledge. The people on my current team most suited to our image processing and discrete math problems came from a Windows background and are sometimes flummoxed by things that the long-time Unix users in our team find trivial. The relatively recent bootcamp graduate has a good intuition for UI and mobile design but would probably struggle if I suddenly assigned a task to improve our deployment process. I'm pretty comfortable in most of the layers of our system but wouldn't be efficient writing a machine learning algorithm. Instead of trying to do a direct comparison I try to find out what each candidate could bring to the team, vs. what we'd have to do to make them productive with our tooling and infrastructure. Then my CEO and I will go stew about what we think is most critical right now.

What are the key differences between how you interview people and a typical tech interview?

A typical tech interview attempts to filter people based on how adept they are at responding to quiz-like problems. I want to discover what they actually know. If I just pick from a favorite set of programming questions, I'm not going to find out what that candidate is good at; I'm going to learn how they fared, today, under pressure, on a toy problem.

But if I meet them where they are, I'll learn what they think they're good at, how they see their accomplishments, and what they think they want to improve on.

So I do what I can to make candidates comfortable. I point out that we don't do a hazing ritual, and the goal is to have a conversation about the role and to get to know each other. I generally schedule a 30 minute initial slot, and if all goes well, we'll either continue the conversation if their schedule allows, or reconvene on a different day. If it's clear to me (or the candidate) there's not a great mutual fit, we'll try to end the call on a high note but we'll move on.

One of my colleagues said she'd only take interviews where the company is willing to provide most of the interview questions, and a clear statement of the assessment criteria, in advance. I haven't been asked to do that, but I think it's a reasonable accommodation. I'd point out that I might ask some probing questions based on the responses, but with the goal of gaining more insight into the candidate's thought process and experience, not of stressing them out.

Each round of hiring, we've ended up with a few very different candidates at the top of our list. Each of them would add totally different skills to our team. Since our team was and remains quite small, whoever we pick could easily alter the direction of our company.

One person had machine learning expertise that we were looking for, but no production software development experience. Someone else had image processing experience. Another person was trying to break into programming, but had a network of industrial contacts that made them a potential asset to our sales process.

I've also hired straight-from-bootcamp and fresh CS graduates. Again, the core values I'm looking for are curiosity, persistence, and tenacity. A willingness to ask questions, even if they might reveal ignorance, is a good sign to me. So is the ability to describe what was hard about the most recent projects they worked on; I expect this from more experienced candidates too, though a new grad's answer will naturally be more basic than an industry hire's. But I consider it part of my team's responsibility, when working with a new-to-the-industry hire, to provide a lot of mentorship and guidance, and to do a mix of patient exploration and firehose-of-information teaching that brings them up to where they need to be. If your team invests in developing new talent, you can take almost anyone who is driven to learn and mold them into an effective contributor.

What if I need specific algorithmic knowledge for my problem space?

Then you should say that pretty clearly in the job description.

It's fair, if relevant to the specific role you're trying to fill, to prefer candidates who have experience designing or implementing specific algorithms.

For most general-purpose developers, being able to pattern match a problem description to a class of algorithm, along the lines of the "War stories" in Steven Skiena's The Algorithm Design Manual, is a sufficient signal that you can do the necessary work to solve algorithmically-demanding problems.

But if you know that you have a bunch of specific algorithmic knowledge needed to be successful, it's totally fair to make that clear when you're hiring. There are certainly a number of domains where you absolutely, unavoidably need deep discrete math skills, experience with problems that lean on graph theory, or knowledge of 3D or signal processing algorithms.

So how do you assess those skills? With the lost art of conversation. As coworkers, we'd likely approach a design problem that way; why not do the same thing in an interview?

I think it can be reasonable to ask "When would you use algorithm X vs algorithm Y?" if it applies to the role you're trying to fill. You can also ask "Would your approach differ when trying to solve this problem for 20 items, vs 50,000, 1 million, or a billion items?"
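To make the scaling question concrete, here's a minimal sketch (my own illustration, not taken from any particular interview) of the trade-off a good answer might touch on: for a couple dozen items, a linear scan over a list is simple and fast enough; at fifty thousand or a million items, a hash-based set turns each membership lookup from O(n) into roughly O(1).

```python
# Illustrative only: the kind of trade-off the "20 items vs. a billion" question probes.

def contains_linear(items, target):
    """Fine for ~20 items: no setup cost, trivially simple."""
    for item in items:
        if item == target:
            return True
    return False

def contains_hashed(item_set, target):
    """Pays off once the collection is large or lookups are frequent;
    requires building the set (O(n)) up front."""
    return target in item_set

items = list(range(50_000))
lookup = set(items)  # one-time O(n) cost, then ~O(1) per lookup
```

The right answer depends on access patterns, not just size: a single lookup over a million items may still be cheaper as a scan than building a set first.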

As for me, I'll attack problems differently depending on just how key it is to the business. Is this an experimental feature? Then can we cut some corners on performance and reliability until we know it's going to become an important part of our offering? It's worthwhile to talk about when to do the simplest thing that could possibly work, versus spending a week reading research papers before writing a line of code.

What would you do if you need to dig into specific skills needed for a job?

If I need to assess the ability to take an abstract problem and turn it into a defensible software design, I might ask a question from a domain closely related to what I'm actually working on.

When I worked on a product with some shipping logistics issues as a subdomain, I gave candidates a scenario, asking how they'd model a system for retrieving schedules and pricing for air couriers, ground-based freight, and private couriers, leading with examples of some of the scenarios the system would need to handle.
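A hypothetical sketch of one defensible answer to that kind of scenario (not the actual system, and all names and rate structures here are invented for illustration) is to put each carrier type behind a common interface, so schedule and pricing retrieval doesn't have to branch on carrier specifics:

```python
# Hypothetical data model: carriers share a quote() interface.
from dataclasses import dataclass
from typing import Protocol

@dataclass
class Quote:
    carrier: str
    transit_days: int
    price_cents: int

class Carrier(Protocol):
    def quote(self, weight_kg: float, distance_km: float) -> Quote: ...

@dataclass
class GroundFreight:
    name: str = "ground"
    def quote(self, weight_kg, distance_km):
        # Invented rate structure, purely illustrative.
        return Quote(self.name, int(distance_km // 800) + 1,
                     int(weight_kg * 5 + distance_km * 2))

@dataclass
class AirCourier:
    name: str = "air"
    def quote(self, weight_kg, distance_km):
        return Quote(self.name, 1,
                     int(weight_kg * 40 + distance_km * 1))

def cheapest(carriers, weight_kg, distance_km):
    """Compare quotes across heterogeneous carriers uniformly."""
    return min((c.quote(weight_kg, distance_km) for c in carriers),
               key=lambda q: q.price_cents)
```

A candidate who sketches something like this, then asks which scenarios break the abstraction (couriers with pickup windows, freight with per-pallet pricing), is demonstrating exactly the listening and modeling skills the question is after.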

What I'm really assessing in questions like this is the ability to listen, reflect on the problem, maybe quickly sketch a defensible solution, and then explain it, either relying on the language of commonly known design patterns, or with sufficient clarity that even if they didn't know the names of those patterns they were painting a reasonably clear picture. I want them to ask questions about aspects of the problem statement they find ambiguous, and hopefully demonstrate at least an adequate degree of curiosity for the problems my team has to work on.

Listening and communication skills are sufficiently predictive of application design skills in my experience that I don't need to ask candidates to write code.

Do you have any examples of your own experience of being the interviewee and what it uncovered?

I've been asked how I'd design a system that had to communicate with external systems that limited direct network access, but needed to transfer large files on demand. I asked follow-up questions about the nature of the network boundaries, and described two options: a "they call us" push model, or a "we regularly check in with them" pull model. The interviewer was mainly looking for an awareness of what the options were; we drilled down on the details, and my answer mapped neatly onto two actual subsystems the company uses. Although I didn't know all of the constraints they had to solve, I presented a defensible set of options that demonstrated I was listening and had some idea of what the realistic options were.
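The heart of the pull model can be sketched in a few lines (a rough sketch of the general pattern under my own assumptions, not the company's actual subsystem): the client behind the restrictive network boundary initiates every connection, polling the remote side for pending files instead of requiring any inbound access.

```python
# Minimal pull-model sketch: all connections are outbound from the client.
import time

def poll_for_files(fetch_pending, download, interval_s=60, max_cycles=None):
    """fetch_pending() -> list of pending file ids (outbound request);
    download(file_id) retrieves one file (also outbound).
    max_cycles=None polls forever; a number limits cycles (useful for testing)."""
    cycles = 0
    while max_cycles is None or cycles < max_cycles:
        for file_id in fetch_pending():
            download(file_id)
        cycles += 1
        if max_cycles is None or cycles < max_cycles:
            time.sleep(interval_s)
```

The push model is the mirror image: the remote side calls an endpoint we expose, which is simpler and lower-latency but requires the network boundary to admit inbound connections.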

Asking a candidate "how does a web browser work?" is useful both as a design question and as a way to assess technical communication, assuming the candidate has some experience with web applications, because you can move up and down layers of abstraction and get a sense of how deep their mental model is of the thing they're building applications on top of.

Ultimately, software design is an act of communication. I might write code on my own, but it's almost never done in isolation; someone almost always needs to integrate with it.


JasonTrue commented Jul 14, 2023

A truncated version of this article in a more interview-like format appeared on Hacker Noon.
