Lessons from 3,000 technical interviews… or how what you do after graduation matters way more than where you went to school

By Aline Lerner

The first blog post I published that got any real attention was called “Lessons from a year’s worth of hiring data”. It was my attempt to understand which attributes of someone’s resume actually mattered for getting a software engineering job. Surprisingly, where someone went to school didn’t matter at all; far and away, the strongest signal came from the number of typos and grammatical errors on their resume.

Since then, I’ve discovered (and written about) how useless resumes are, but ever since writing that first post, I’ve been itching to do something similar with interviewing.io’s data. For context, interviewing.io is a platform where people can practice technical interviewing anonymously and, in the process, find jobs — do well in practice, and you get guaranteed (and anonymous!) technical interviews at companies like Uber, Twitch, Lyft, and more. Over the course of our existence, we’ve amassed performance data from thousands of real and practice interviews. Data from these interviews sets us up nicely to look at what signals from an interviewee’s background might matter when it comes to performance.

As often happens, what we found was surprising, and some of it runs counter to things I’ve said and written on the subject. More on that in a bit.

The setup

When an interviewer and an interviewee match on our platform, they meet in a collaborative coding environment with voice, text chat, and a whiteboard and jump right into a technical question (check out our recordings page to see this in action). Interview questions on the platform tend to fall into the category of what you’d encounter at a phone screen for a back-end software engineering role, and interviewers typically come from a mix of large companies like Google, Facebook, and Uber, as well as engineering-focused startups like Asana, Mattermark, KeepSafe, and more.

After every interview, interviewers rate interviewees on a few different dimensions, including technical ability. Technical ability gets rated on a scale of 1 to 4, where 1 is “poor” and 4 is “amazing!”. On our platform, a score of 3 or above has generally meant that the person was good enough to move forward. You can see what our feedback form looks like below:

Screenshot of the interview feedback form

To run the analysis for this post, we cross-referenced interviewees’ average technical scores (circled in red in the feedback form above) with the attributes below to see which ones mattered most.[1]

  • Attended a top computer science school
  • Worked at a top company
  • Took classes on Udacity/Coursera[2]
  • Founded a startup
  • Master’s degree
  • Years of experience

Of all of these, only three attributes emerged as statistically significant: top school, top company, and classes on Udacity/Coursera. Apparently, as the fine gentlemen of Metallica once said, nothing else matters. In the graph below, you can see the effect size of each of the significant attributes (attributes that didn’t achieve significance don’t have bars).
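
If you’re curious what this kind of analysis looks like mechanically, here’s a minimal sketch of one way to do it: regress average technical score on the attributes and read off which coefficients reach significance. To be clear, this is illustrative rather than our actual pipeline; the data file and column names (top_school, mooc, and so on) are hypothetical, and it assumes Python with pandas and statsmodels.

```python
# A minimal sketch of the kind of significance analysis described above --
# NOT the actual interviewing.io pipeline. The CSV and column names
# (top_school, mooc, etc.) are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# One row per interviewee: average technical score (1-4) plus
# background attributes (binary flags, plus years of experience).
df = pd.read_csv("interviewees.csv")

model = smf.ols(
    "avg_technical_score ~ top_school + top_company + mooc"
    " + founder + masters + years_experience",
    data=df,
).fit()

print(model.summary())

# Attributes with p < 0.05 are the ones we'd call significant;
# their coefficients estimate the effect size on the 1-4 score scale.
print(model.pvalues[model.pvalues < 0.05])
```

(Treating a 1-to-4 rating as a continuous outcome is a simplification; an ordinal model would be more rigorous, but OLS is fine for a first look at which attributes matter.)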

As I said at the outset, these results were quite surprising, and I’ll take a stab at explaining each of the outcomes below.

Top school & top company

Going into this, I expected top company to matter but not top school. The company thing makes sense — you’re selecting people who’ve successfully been through at least one interview gauntlet, so the odds of them succeeding at future ones should be higher.

Top school is a bit muddier, and it was indeed the least impactful of the significant attributes. Why did schooling matter in this iteration of the data but not when I was looking at resumes? I expect the answer lies in the disparity between performance in an isolated technical phone screen and what happens when a candidate actually goes on site. With the right preparation, the technical phone interview is manageable, and top schools often have rigorous algorithms classes and a culture of preparing for technical phone screens (to see why this culture matters and how it might create an unfair advantage for those immersed in it, see my post about how we need to rethink the technical interview). Whether passing an algorithmic technical phone screen means you’re a great engineer is another matter entirely, and hopefully the subject of a future post.

Udacity/Coursera

MOOC participation (Udacity and Coursera in particular, as those were the ones interviewing.io users gravitated to most) mattering as much as it did (and mattering way more than pedigree) was probably the most surprising finding here, and so it merited some additional digging.

In particular, I was curious about the interplay between MOOCs and top schools, so I partitioned MOOC participants into people who had attended top schools vs. people who hadn’t. When I did that, something startling emerged. For people who attended top schools, completing Udacity or Coursera courses didn’t appear to matter. However, for people who did not, the effect was huge, so huge, in fact, that it dominated the board. Moreover, interviewees who attended top schools performed significantly worse than interviewees who had not attended top schools but HAD taken a Udacity or Coursera course.
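
Mechanically, that partitioning is just a two-way group-by, or equivalently an interaction term in the regression. Here’s a hypothetical sketch, using the same assumed column names as before:

```python
# Sketch of the partitioned comparison described above -- hypothetical
# column names, not the actual interviewing.io analysis.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("interviewees.csv")

# Mean technical score (1-4) and sample size for each cell of the
# 2x2 partition: (top school? yes/no) x (took a MOOC? yes/no).
cells = (
    df.groupby(["top_school", "mooc"])["avg_technical_score"]
      .agg(["mean", "count"])
)
print(cells)

# The same question as a regression: the top_school:mooc interaction
# term asks whether MOOCs help non-top-school interviewees more.
fit = smf.ols("avg_technical_score ~ top_school * mooc", data=df).fit()
print(fit.params)
```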

So, what does this mean? Of course (as you’re probably thinking to yourself while you read this), correlation doesn’t imply causation. As such, rather than MOOCs being a magic pill, I expect that people who gravitate toward online courses (and especially those who might have a chip on their shoulder about their undergrad pedigree and end up drinking from the MOOC firehose) already tend to be abnormally driven. But, even with that, I’d be hard-pressed to say that completing great online CS classes isn’t going to help you become a better interviewee, especially if you didn’t have the benefit of a rigorous algorithms class up until then. Indeed, a lot of the courses we saw people take focused on algorithms, so it’s no surprise that supplementing your preparation with courses like this could be tremendously useful. Some of the most popular courses we saw were:

Udacity

Coursera

Founder status

Having been a founder didn’t matter at all when it came to technical interview performance. This, too, isn’t that surprising. The things that make one a good founder are not necessarily the things that make one a good engineer, and if you just came out of running a startup and are looking to get back into an individual contributor role, odds are, your interview skills will be a bit rusty. This is, of course, true of folks who’ve been in industry but out of interviewing for some time, as you’ll see below.

Master’s degree & years of experience

No surprises here. I’ve ranted quite a bit about the disutility of master’s degrees, so I won’t belabor the point.

Years of experience, too, shouldn’t be that surprising. For context, our average user has about 5 years of experience, with most having between 2 and 10. I think we’ve all anecdotally observed that the time spent away from your schooling doesn’t do you any favors when it comes to interview prep. You can see a scatter plot of interview performance vs. years of experience below as well as my attempt to fit a line through it (as you can see, the R^2 is piss poor, meaning that there isn’t a relationship to speak of).
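
(If you want to run this kind of fit on your own data, it’s a one-liner; the sketch below uses the same hypothetical column names as the earlier snippets.)

```python
# Sketch of the years-of-experience fit -- hypothetical column names.
import pandas as pd
from scipy import stats

df = pd.read_csv("interviewees.csv")
fit = stats.linregress(df["years_experience"], df["avg_technical_score"])
print(f"slope = {fit.slope:.3f}, R^2 = {fit.rvalue ** 2:.3f}")
# An R^2 near zero means years of experience explains almost none
# of the variance in interview performance.
```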

Closing Thoughts

If you know me, or even if you’ve read some of my writing, you know that, in the past, I’ve been quite loudly opposed to the concept of pedigree as a useful hiring signal. With that in mind, I feel I owe it to you to clearly acknowledge, up front, that what we found this time runs counter to my stance. But that’s the whole point, isn’t it? You live, you get some data, you make some graphs, you learn, you make new graphs, and you adjust. Even with this new data, I’m excited to see that what mattered way more than pedigree was the actions people took to better themselves (in this case, rounding out their existing knowledge with MOOCs), regardless of their background.

Most importantly, these findings have done nothing to change interviewing.io’s core mission. We’re creating an efficient and meritocratic way for candidates and companies to find each other, and as long as you can code, we couldn’t care less about who you are or where you come from. In our ideal world, all these conversations about which proxies matter more than others would be moot because coding ability would stand for, well, coding ability. And that’s the world we’re building.

Thanks to Roman Rivilis for his help with data annotation for this post.

Footnotes

  1. For fun, we tried relating browser and operating system choice to interview performance, (smugly) expecting Chrome users to dominate. Not so. Browser choice didn’t matter, nor did what OS people used while interviewing.

  2. We got this data from looking at interviewees' LinkedIn profiles.
