@bcantrill
Last active Sep 28, 2019

Assessing software engineering candidates

How does one assess candidates for software engineering positions? This is an age-old question without a formulaic answer: software engineering is itself too varied to admit a single archetype.

Most obviously, software engineering is intellectually challenging; it demands minds that not only enjoy the thrill of solving puzzles, but can also stay afloat in a sea of numbing abstraction. This raw capacity, however, is insufficient; there are many more nuanced skills that successful software engineers must possess. For example, software engineering is an almost paradoxical juxtaposition of collaboration and isolation: successful software engineers are able to work well with (and understand the needs of!) others, but are also able to focus intensely on their own. This contrast extends to the conveyance of ideas, where they must be able to express their own ideas well enough to persuade others, but also be able to understand and be persuaded by the ideas of others -- and be able to implement all of these on their own. They must be able to build castles of imagination, and yet still understand the constraints of a grimy reality: they must be arrogant enough to see the world as it isn't, but humble enough to see the world as it is. Each of these is a balance, and for each, long-practicing software engineers will cite colleagues who have been ineffective because they erred too greatly on one side or another.

The challenge is therefore to assess prospective software engineers, without the luxury of firm criteria. This document is an attempt to pull together accumulated best practices; while it shouldn't be inferred to be overly prescriptive, where it is rigid, there is often a painful lesson behind it.

In terms of evaluation mechanism: using in-person interviewing alone can be highly unreliable and can select predominantly for surface aspects of a candidate's personality. While we advocate (and indeed, insist upon) interviews, they should come relatively late in the process; as much assessment as possible should be done by allowing the candidate to show themselves as software engineers truly work: on their own, in writing.

Traits to evaluate

How does one select for something so nuanced as balance, especially when the road ahead is unknown? We must look at a wide variety of traits, presented here in the order in which they are traditionally assessed:

  • Aptitude
  • Education
  • Motivation
  • Values
  • Integrity


Aptitude

As the ordering implies, there is a temptation in traditional software engineering hiring to focus on aptitude exclusively: to use an interview exclusively to assess a candidate's pure technical pulling power. While this might seem to be a reasonable course, it in fact leads down the primrose path to pop quizzes about algorithms seen primarily in interview questions. (Red-black trees and circular linked list detection: looking at you.) These assessments of aptitude are misplaced: software engineering is not, in fact, a spelling bee, and one's ability to perform during an arbitrary oral exam may or may not correlate to one's ability to actually develop production software. We believe that aptitude is better assessed where software engineers are forced to exercise it: based on the work that they do on their own. As such, candidates should be asked to provide three samples of their work: a code sample, a writing sample, and an analysis sample.

Code sample

Software engineers are ultimately responsible for the artifacts that they create, and as such, a code sample can be the truest way to assess a candidate's ability.

Candidates should be guided to present code that they believe best reflects them as a software engineer. If this seems too broad, it can be easily focused: what is some code that you're proud of and/or code that took you a while to get working?

If candidates do not have any code samples because all of their code is proprietary, they should write some: they should pick something that they have always wanted to write but have lacked an excuse to -- and they should go write it! On such a project, the guideline to the candidate should be to spend at least (say) eight hours on it, but no more than twenty-four -- and over no longer than a two-week period.

If the candidate is to write something de novo and/or there is a new or interesting technology that the organization is using, it may be worth guiding the candidate to use it (e.g., to write it in a language that the team has started to use, or using a component that the team is broadly using). This constraint should be uplifting to the candidate (e.g., "You may have wanted to explore this technology; here's your chance!"). At Joyent in the early days of node.js, this was what we called "the node test", and it yielded many fun little projects -- and many great engineers.

Writing sample

Writing good code and writing good prose seem to be found together in the most capable software engineers. That these skills are related is perhaps unsurprising: both types of writing are difficult; both require one to create wholly new material from a blank page; both demand the ability to revise and polish.

To assess a candidate's writing ability, they should be asked to provide a writing sample. Ideally, this will be technical writing, e.g.:

  • A block comment in source code
  • A blog entry or other long-form post on a technical issue
  • A technical architectural document, whitepaper or academic paper
  • A comment on a mailing list or open source issue or other technical comment on social media

If a candidate has all of these, they should be asked to provide one of each; if a candidate has none of them, they should be asked to provide a writing sample on something else entirely, e.g. a thesis, dissertation or other academic paper.

Analysis sample

Part of the challenge of software engineering is dealing with software when it doesn't, in fact, work correctly. At this moment, a software engineer must flip their disposition: instead of an artist creating something new, they must become a scientist, attempting to reason about a foreign world. In having candidates only write code, analytical skills are often left unexplored. And while these skills can be explored conversationally (e.g., asking for "debugging war stories" is a classic -- and often effective -- interview question), an oral description of recalled analysis doesn't necessarily allow the true depths of a candidate's analytical ability to be plumbed. For this, candidates should be asked to provide an analysis sample: a written analysis of software behavior. This may be difficult for many candidates: for many engineers, these analyses may be most often found in defect reports, which may not be public. If the candidate doesn't have such an analysis sample, the scope should be deliberately broadened to any analytical work they have done on any system (academic or otherwise). If this broader scope still doesn't yield an analysis sample, the candidate should be asked to generate one to the best of their ability by writing down their analysis of some aspect of system behavior. (This can be as simple as asking them to write down the debugging story that would be their answer to the interview question -- giving the candidate the time and space to answer the question once, and completely.)


Education

We are all born uneducated -- and our own development is a result of the informal education of experience and curiosity, as well as a more structured and formal education. To assess a candidate's education, both the formal and informal aspects should be considered.

Formal education

Formal education is easier to assess by its very formality: a candidate's education is relatively easily evaluated if they had the good fortune of discovering their interest and aptitude at a young age, had the opportunity to pursue and complete their formal education in computer science, and had the further good luck of attending an institution that one knows and has confidence in.

But one should not be bigoted by familiarity: there are many terrific software engineers who attended little-known schools or who took otherwise unconventional paths. The completion of a formal education in computer science is much more important than the institution: the strongest candidate from a little-known school is almost assuredly stronger than the weakest candidate from a well-known school.

In other cases, it's even more nuanced: there have been many later-in-life converts to the beauty and joy of software engineering, and such candidates should emphatically not be excluded merely because they discovered software later than others. For those that concentrated in entirely non-technical disciplines, further probing will likely be required, with greater emphasis on their technical artifacts.

The most important aspect of one's formal education may not be its substance so much as its completion. Like software engineering, there are many aspects of completing a formal education that aren't necessarily fun: classes that must be taken to meet requirements; professors that must be endured rather than enjoyed; subject matter that resists quick understanding or appeal. In this regard, completion of a formal education represents the completion of a significant task. Inversely, the failure to complete one's formal education may constitute an area of concern. There are, of course, plausible life reasons to abandon one's education prematurely (especially in an era when higher education is so expensive), but there are also many paths and opportunities to resume and complete it. The failure to complete formal education may indicate deeper problems, and should be understood.

Informal education

Learning is a life-long endeavor, and much of one's education will be informal in nature. Assessing this informal education is less clear, especially because (by its nature) there is little formally to show for it -- but candidates should have a track record of being able to learn on their own, even when this self-education is arduous. One way to probe this may be with a simple question: what is an example of something that you learned that was a struggle for you? As with other questions posed here, the question should have a written answer.


Motivation

Motivation is often not assessed in the interview process, which is unfortunate because it dictates so much of what we do and why. For many companies, it will be important to find those who are intrinsically motivated -- those who do what they do primarily for the value of doing it.

Selecting for motivation can be a challenge, and defies formula. Here, open source and open development can be a tremendous asset: it allows others to see what is being done, and, if they are excited by the work, to join the effort and to make their motivation clear.


Values

Values are often not evaluated formally at all in the software engineering hiring process, but they can be critical in determining the "fit" of a candidate. To differentiate values from principles: values represent relative importance, where principles represent absolute importance. Values are important in a software engineering context because we so frequently make tradeoffs in which our values dictate our disposition. (For example, consider the relative importance of speed of development versus rigor: both are clearly important and positive attributes, but there is often a tradeoff to be made between them.) Different engineering organizations may have different values at different times or for different projects, but it's also true that individuals tend to develop their own values over their careers -- and it's essential that the values of a candidate do not clash with the values of the team that they are to join.

But how to assess one's values? Many will speak to values that they don't necessarily hold (e.g., rigor), so simply asking someone what's important to them may or may not yield their true values. One observation is that one's values -- and the adherence or divergence from those values -- will often be reflected in happiness and satisfaction with work. When work strongly reflects one's values, one is much more likely to find it satisfying; when values are compromised (even if for a good reason), work is likely to be unsatisfying. As such, the specifics of one's values may be ascertained by asking candidates some probing questions, e.g.:

  • What work have you done that you are particularly proud of and why?
  • What mistakes have you made that you particularly regret and why?
  • When have you been happiest in your professional career and why?
  • When have you been unhappiest in your professional career and why?

Our values can also be seen in the way we interact with others. As such, here are some questions that may have revealing answers:

  • Who is someone who has mentored you, and what did you learn from them?
  • Who is someone you have mentored, and what did you learn from them?
  • What qualities do you most admire in other software engineers?

The answers to these questions should be written down to allow them to be answered thoughtfully and in advance -- and then to serve as a starting point for conversation in an interview.

Some questions, however, are more amenable to a live interview. For example, it may be worth asking some situational questions like:

  • What are some times that you have felt values come into conflict? How did you resolve the conflict?

  • What are some times when you have diverged from your own values and how did you rectify it? For example, if you value robustness, how do you deal with having introduced a defect that should have been caught?


Integrity

In an ideal world, integrity would not be something we would need to assess in a candidate: we could trust that everyone is honest and trustworthy. This view, unfortunately, is naïve with respect to how malicious bad actors can be; for any organization -- but especially for one that is biased towards trust and transparency -- it is essential that candidates be of high integrity: an employee who operates outside of the bounds of integrity can do nearly unbounded damage to an organization that assumes positive intent.

There is no easy or single way to assess integrity for people with whom one hasn't endured difficult times. By far the most accurate way of assessing integrity in a candidate is for them to already be in the circle of one's trust: for them to have worked deeply with (and be trusted by) someone that is themselves deeply trusted. But even in these cases where the candidate is trusted, some basic verification is prudent.

Criminal background check

The most basic integrity check involves a criminal background check. While local law dictates how these checks are used, the check should be performed for a simple reason: it verifies that the candidate is who they say they are. If someone has made criminal mistakes, these mistakes may or may not disqualify them (much will depend on the details of the mistakes, and on local law on how background checks can be used), but if a candidate fails to be honest or remorseful about those mistakes, it is a clear indicator of untrustworthiness.

Credential check

A hidden criminal background in software engineering candidates is unusual; much more common is a slight "fudging" of credentials or other elements of one's past: degrees that were not in fact earned; grades or scores that have been exaggerated; awards that were not in fact bestowed; gaps in employment history that are quietly covered up by changing the time that one was at a previous employer. These transgressions may seem slight, but they can point to something quite serious: a candidate's willingness or desire to mislead others to advance themselves. To protect against this, a basic credential check should be performed. This can be confined to degrees, honors, and employment.


Reference checks

References can be very tricky, especially for someone coming from a difficult situation (e.g., fleeing poor management). Ideally, a candidate is well known by someone inside the company who is trusted -- but even this poses challenges: sometimes we don't truly know people until they are in difficult situations, and someone "known" may not, in fact, be known at all.
Worse, references are most likely to break down when they are most needed: dishonest, manipulative people are, after all, dishonest and manipulative; they can easily fool people -- and even references -- into thinking that they are something that they are not. So while references can provide value (and shouldn't be eliminated as a tool), they should also be used carefully and kept in perspective.


In-person assessment

For individuals outside of that circle of trust, checking integrity is probably still best done in person. There are several potential mechanisms here:

  • A very broad interview schedule that includes some people clearly subordinate to the candidate. Some people will treat people differently depending on the status that they perceive.

  • A very broad interview schedule that includes some people with a talent for reading others. For example, someone who is effective at sales often has a knack for picking up on subtle body language cues that others will miss.

  • Interviews that deliberately probe, e.g., asking candidates to describe a time that preserving integrity necessitated taking a more difficult path.

  • Interviews that set up role playing, e.g., asking candidates how they would handle a co-worker approaching them privately asking them to do something that they perceived as wrong.

Mechanics of evaluation

Interviews should begin with phone screens to assess the most basic viability, especially with respect to motivation. This initial conversation might include some basic (and unstructured) homework to gauge that motivation. The candidate should be pointed to material about the company and sources that describe methods of work and specifics about what that work entails. The candidate should be encouraged to review some of this material and send written thoughts as a quick test of motivation. If one is not motivated enough to learn about a potential employer, it's hard to see how they will suddenly gain the motivation to see them through difficult problems.

If and when a candidate is interested in deeper interviews, everyone should be expected to provide the same written material.

Candidate-submitted material

The candidate should submit the following:

  • Code samples (no more than three)
  • Code project, if deemed applicable/appropriate
  • Writing samples (no more than one per category)
  • Analysis samples (no more than three)
  • Written answers to eight questions:
    1. What work have you done that you are particularly proud of and why?
    2. What mistakes have you made that you particularly regret and why?
    3. What is an example of something that you learned that was a struggle for you?
    4. When have you been happiest in your professional career and why?
    5. When have you been unhappiest in your professional career and why?
    6. Who is someone who has mentored you, and what did you learn from them?
    7. Who is someone you have mentored, and what did you learn from them?
    8. What qualities do you most admire in other software engineers?

Candidate-submitted material should be collected and distributed to everyone on the interview list.

Before the interview

Everyone on the interview schedule should read the candidate-submitted material, and a pre-meeting should then be held to discuss approach: based on the written material, what are the things that the team wishes to better understand? And who will do what?

Pre-interview job talk

For senior candidates, it can be effective to ask them to start the day by giving a technical presentation to those who will interview them. On the one hand, it may seem cruel to ask a candidate to present to a roomful of people who will be later interviewing them, but to the candidate this should be a relief: this allows them to start the day with a home game, where they are talking about something that they know well and can prepare for arbitrarily. The candidate should be allowed to present on anything technical that they've worked on, and it should be made clear that:

  1. Confidentiality will be respected (that is, they can present on proprietary work)

  2. The presentation needn't be novel -- it is fine for the candidate to give a talk that they have given before

  3. Slides are fine but not required

  4. The candidate should assume that the audience is technical, but not necessarily familiar with the domain that they are presenting

  5. The candidate should assume about 30 minutes for presentation and 15 minutes for questions.

The aim here is severalfold.

First, this lets everyone get the same information at once: it is not unreasonable that the talk that a candidate would give would be similar to a conversation that they would have otherwise had several times over the day as they are asked about their experience; this minimizes that repetition.

Second, it shows how well the candidate teaches. Assuming that the candidate is presenting on a domain that isn't intimately known by every member of the audience, the candidate will be required to instruct. Teaching requires both technical mastery and empathy -- and a pathological inability to teach may point to deeper problems in a candidate.

Third, it shows how well the candidate fields questions about their work. It should go without saying that the questions themselves shouldn't be trying to find flaws with the work, but should be entirely in earnest; seeing how a candidate answers such questions can be very revealing about character.

All of that said: a job talk likely isn't appropriate for every candidate -- and shouldn't be imposed on (for example) those still in school. One guideline may be: those with more than seven years of experience are expected to give a talk; those with fewer than three are not expected to give a talk (but may do so); those in between can use their own judgement.


Interviews

Interviews shouldn't necessarily take one form; interviewers should feel free to take a variety of styles and approaches -- but should generally refrain from "gotcha" questions and/or questions that may conflate surface aspects of intellect with deeper qualities (e.g., Microsoft's infamous "why are manhole covers round?"). Mixing interview styles over the course of the day can also be helpful for the candidate.

After the interview

After the interview (usually the next day), the candidate should be discussed by those who interviewed them. The objective isn't necessarily to get to consensus (though that too, ultimately), but rather to surface areas of concern. In this regard, the post-interview conversation must be handled carefully: the interview is deliberately constructed to allow broad contact with the candidate, and it is possible that someone relatively junior or otherwise inexperienced will see something that others will miss. The meeting should be constructed to assure that this important data isn't suppressed; bad hires can happen when reservations aren't shared out of fear of disappointing a larger group!

One way to do this is to structure the meeting this way:

  1. All participants are told to come in with one of three decisions: Hire, Do not hire, Insufficient information. All participants should have one of these positions and they should not change their initial position. (That is, one's position on a candidate may change over the course of the meeting, but the initial position shouldn't be retroactively changed.) If it helps, this position can be privately recorded before the meeting starts.

  2. The meeting starts with everyone who believes Do not hire explaining their position. While starting with the Do not hire positions may seem to give the meeting a negative disposition, it is extremely important that the meeting start with the reservations lest they be silenced -- especially when and where they are so great that someone believes a candidate should not be hired.

  3. Next, those who believe Insufficient information should explain their position. This position may be relatively common, and it means that the interview left the interviewer with unanswered questions. By presenting these unanswered questions, there is a possibility that others can provide answers that they may have learned in their interactions with the candidate.

  4. Finally, those who believe Hire should explain their position, perhaps filling in missing information for others who are less certain.

If there are any Do not hire positions, these should be treated very seriously, for it is saying that the aptitude, education, motivation, values and/or integrity of the candidate are in serious doubt or are otherwise unacceptable. Those who believe Do not hire should be asked for the dimensions that most substantiate their position. Especially where these reservations are around values or integrity, a single Do not hire should raise serious doubts about a candidate: the risks of bad hires around values or integrity are far too great to ignore someone's judgement in this regard!

Ideally, however, no one has the position of Do not hire, and through a combination of screening and candidate self-selection, everyone believes Hire and the discussion can be brief, positive and forward-looking!

If, as is perhaps most likely, there is some mix of Hire and Insufficient information, the discussion should focus on the information that is missing about the candidate. If other interviewers cannot fill in the information about the candidate (and if it can't be answered by the corpus of material provided by the candidate), the group should together brainstorm about how to ascertain it. Should a follow-up conversation be scheduled? Should the candidate be asked to provide some missing information? Should some aspect of the candidate's background be explored? The collective decision should not move to Hire as long as there remain unanswered questions preventing everyone from reaching the same decision.


danmcd commented Apr 24, 2018

  • s/pursuade/persuade/g

  • For the new-graduate, have them submit a code sample from a class project?

  • Hinted at above in education, but sometimes (like everything else here, it's difficult to discern) lacking "the further good luck of attending an institution that one knows and has confidence in" can sometimes disqualify someone unfairly. Nexenta's Matt Barden, without one knowing his contributions, might slip under the radar because he's (was?) a UMass Lowell student. We mightn't have this problem as badly as others, but it's a problem that needs at least a little bit of acknowledgement, even as a tradeoff. (e.g. Prolific quality contributions may outweigh a so-called "lesser" school.)



joshwilsdon commented Apr 24, 2018

"unbunded" did you mean unbounded?



KodyKantor commented Apr 24, 2018

I like this! It would be nice if some of this were organization-wide. We've seen that it can be difficult to have groups within the company that don't share the same values. It might be fine if different groups within a company have different values as long as there's a massive silo in place. But if a reorg happens and the folks in the silo are distributed then everybody is in for a bad time. That's probably out of the scope of this effort, but I just thought I would put it out there, and I'm sure we're aware of this.

coding exercises:
It might be nice to have an idea of the criteria that we are looking for these things to fulfill. Depending on the role it might not make sense to have someone do a large coding exercise and instead have them describe a system that they built.

I think what we should be looking for is the ability to design a maintainable system, the ability to maintain a system, and the ability to acknowledge problems with the system. If they don't acknowledge system problems (even after probing, maybe asking if they had unlimited time and budget what they would fix) that shows some amount of arrogance and possibly inadequate understanding of the system (stopping at 'hey, it works'). Maybe this bleeds over into the 'analysis sample.' It seems difficult to write decent code that you haven't also analyzed to understand how it works.

For example, maybe someone built a home-lab monitoring system or over-engineered their blog (and admit it was over-engineered). Maybe they like doing data analysis and have an example of that. All of these things probably have some artifact (an architecture diagram can be written up, infrastructure as code, scripts, etc.), but it might not be ‘code’ in the classic sense. If we're hiring someone who wants to work on our monitoring systems, then maybe it's fine if their coding exercise is to build a monitoring system. Asking someone to solve a Project Euler puzzle seems equivalent to asking them to solve a theoretical data structure problem (circular linked list) in my opinion. It's an interesting puzzle, but doesn't really seem to answer our question of 'can they build a good system?' At the same time, asking someone to build a system for an interview is ridiculous, so maybe there's a middle ground for folks that don't have project/code samples readily available.

I assume that we would accept the code, analysis, and writing samples all as one artifact too. An example would be a postmortem analysis, where they describe the system, make a tool to root cause a problem, and code a fix.

informal education
I don't know many folks who don't have a formal education. The stereotype in my head is someone that is good at writing code, but doesn't have the underlying knowledge of the system. I'm guessing this leads to designs that fail in frustrating ways. For these people, should we be focused more on hearing their war stories, and more interested in the quality/depth of a coding/project sample?

figuring out a candidate's values
The questions suggested for in-person interviews would catch me off guard if I were interviewing. These questions seem pretty abstract. I expect the interviewer would be able to get the answer they're searching for without asking these questions verbatim. When someone is talking about when they were unhappiest in their career they might mention a manager that they didn't get along with. We could follow up on this by asking what contributed to the poor management relationship - maybe it was micromanagement (lack of two-way trust), harassment (lack of integrity), or hiding long-term plans (lack of transparency, power play). Maybe it's the opposite - the employee was frustrated that their manager wouldn't follow their orders, or the employee was frustrated that the manager wouldn't fund a project if the manager didn't know the long term plans.

reference checks
Reference checks sound great in the abstract. However, I would be surprised if a candidate pointed us to a 'bad' reference. Even terrible people can find three people who really like them and think they’re awesome. Maybe we could check references recursively until we find someone in the circle of trust? :) I really don't know what to think about reference checks at this point.

presentation sample
I don't know what questions this would answer aside from what is already covered by the written sample and analysis samples. I don't see anything written about it (yet), so I'll withhold further judgement.

This looks good. This is a difficult problem to solve. I'm anxious to see the rest of the document.



KodyKantor commented Apr 24, 2018

I haven't been able to phrase this concisely, but I do wonder whether there's even a way to prevent folks with poor integrity from entering an organization. The worst will get through despite our best efforts because they're willing to do whatever it takes to get their way, like lying through their teeth when we ask about their values because they know the 'right' answer. I think it's possible to screen out most bad actors, but at some point the screening becomes too burdensome for our organization and would turn away good candidates. I've heard some orgs invite candidates in to play complicated board games with their team. I'm sure it's a good way to assess someone's personality, but bad actors could get through even that 'challenge' by faking their personality.


nickziv commented Apr 24, 2018

Nit: last paragraph in the 'Code Samples' section, misspelled opportunity as 'oppportunity'. First paragraph of 'Integrity' misspelled unbounded as 'unbunded'.


bcantrill commented Apr 25, 2018

@KodyKantor, @nickziv, @joshwilsdon: Feedback integrated!


bcantrill commented Apr 25, 2018

@danmcd: Feedback integrated!


princesspretzel commented Apr 25, 2018


  • be persuaded the ideas of others - missing "by"
  • handle a co-working approaching - ing/er

A response to "Why do you want to work here?", whether written or verbal (or both), will cover a lot of ground. It is also distinct enough from written thoughts about the company based on pre-screen research that it is worth asking in its own right.


jaredmorrow commented Apr 25, 2018

Nit: In Values section sotware engineering should be software engineering

Thanks for this btw, great read


bcantrill commented Apr 26, 2018

@princesspretzel: Fixed both typos. On the "why do you want to work here" question, I left that somewhat implicit in the phone screen; let me know if you think it should be more explicit.

@jaredmorrow: Thanks -- fixed!


timfoster commented Apr 26, 2018

This looks great. I had a few minor suggestions for other things we may want to include:

I'm not sure where exactly this fits (possibly an interview question, possibly writing samples) but seeing examples of an engineer's argument style can be a useful indicator of what they'd be like to work with.

You touch upon this in the opening paragraphs around "conveyance of ideas", but more specifically:

  • Did the engineer argue their case effectively?
  • Were they wrong, and did they graciously admit defeat, or did it turn into a raging flamewar?
  • When both sides of the argument were valid solutions, did the engineer acknowledge this and move quickly towards a consensus?

Similarly, how an engineer deals with screwing up can be another good indicator of their integrity.

We used to test this all the time in ON when someone broke the gate: seeing gatelings put their hands up, apologise, and do everything they could to get the gate back to a happy place was good; seeing them try to shift blame or downplay the seriousness of the breakage was not. This might fall into the "What work have you done that you are particularly embarrassed of and why?" written answer, but asking a question like "Describe a recent breakage that was your fault and how you dealt with it" might be useful, though admittedly, nobody's going to respond with a negative-sounding answer.

Finally, is it worth suggesting that interviewers look for information about a candidate that the candidate may not have submitted formally? (I'm thinking blog posts, mailing list discussions, social media posts etc.) Seeing how a candidate conducts themselves in public could be another indicator of how well they'd fit in. This might happen some time after the interview when there's reasonable consensus that a candidate shows promise, so as not to overwhelm the hiring process.


IanWyszynski commented Apr 26, 2018

I think the analysis sample is a great idea. As I understand it, its purpose is to demonstrate the candidate's ability to reason about systems and, perhaps to some extent, their debugging ability. For new grads, would a sample that discusses experience with school projects be acceptable? The followup:

these analyses are often found in defect reports, which may not be public

makes it seem to me like what is being asked for is a detailed bug report from a previous employer or open-source organization, which many candidates fresh out of school may not be able to provide. If the scope of the analysis sample is wider and can encompass school projects, then I think it's important that the question be framed that way, so that candidates understand that academic software-related experience, which may be their most effective means of demonstrating aptitude, is fair game.

What follows is less feedback than it is a dream for the future.

Another possible avenue for demonstrating debugging and analysis skills might be an exercise like the one proposed in RFD 111. Having the candidate participate in a time- and scope-limited debugging/software-extension exercise that requires them to parse logs or use post-mortem debugging tools like mdb may provide a good read on their ability to reason about software they haven't seen before. An interviewer could periodically check in with the candidate for brief status reports, and the candidate could also be asked to provide a succinct technical write-up of the exercise after completing it. I think there are a couple of advantages to this sort of interview:

  • It can demonstrate verbal and written communication skills
  • It allows candidates who aren't able to provide analysis samples (or who are time-constrained) to demonstrate something of comparable value
  • It mimics a situation that a Joyent employee will find themselves in
  • It is time-limited, and even if the end goal is not accomplished, there is likely to be plenty of material to write and talk about.
  • It demonstrates the candidate's receptiveness to feedback or guidance from the interviewer
  • It provides an opportunity for the candidate to leave the interview with a concrete sense that they've learned and accomplished something (my personal opinion is that this is key for retaining qualified applicants fresh out of university)

I wonder whether a single-server Manta deployment could provide a good venue for such an interview task. Of course, I realize it's easy to ask for something like this and implementing it may be something that we just don't have the bandwidth for right now. It also may be that Manta is too complicated a system to expect an applicant with little background to be able to debug and/or extend it. However, I'm a proponent of such a task being included in the Joyent interview process -- one of my most memorable and productive interview experiences was one where I got to work with software during the interview.


kellymclaughlin commented Apr 26, 2018

I like the idea of the Pre-interview talk. In my most recent interviewing experience, I had several cases where a recruiter would tell me how much my skills and experience made me a great fit for a job. Then I'd get to the actual interview, where all the questions were geared toward the domain of the company's business, and there was rarely an opportunity to actually present my experience or skills and talk about how I might apply them or fit in with what the company was doing. I found it disappointing at best and often just plain irritating. I think the Pre-interview talk could be a good solution to this problem (in addition to the other benefits you highlight). Then even if we end up not extending an offer to the candidate, they at least hopefully leave more satisfied, having been able to put their best foot forward. I know I would have appreciated this in a few cases.


dekobon commented Apr 30, 2018

I would like to see the document structured so that criteria specific to engineering are broken apart from criteria common to any employee. This would be nice because we could use it as guidance for non-engineering roles.


dekobon commented Apr 30, 2018

processional should be professional.


bcantrill commented May 7, 2018

@timfoster: I tried to address this by changing the phrasing slightly from "What work have you done that you are particularly embarrassed of and why?" to "What mistakes have you made that you particularly regret and why?" This should open the door to asking people how they dealt with their mistakes during the interview.

@IanWyszynski: I tried to deliberately broaden the scope of the analysis sample to incorporate your feedback.

@kellymclaughlin: I agree! ;)

@dekobon: Fixed the typo -- and agreed about breaking them out, but sticking just with the stuff I know for now. ;)
