interviewing.io blog: better interviewing through data

Lessons from 3,000 technical interviews… or how what you do after graduation matters way more than where you went to school

Introduction

Aline Lerner


Aline is the CEO and founder of interviewing.io.




The first blog post I published that got any real attention was called “Lessons from a year’s worth of hiring data”. It was my attempt to understand what attributes of someone’s resume actually mattered for getting a software engineering job. Surprisingly, as it turned out, where someone went to school didn’t matter at all, and far and away, the strongest signal came from the number of typos and grammatical errors on their resume.

Since then, I’ve discovered (and written about) how useless resumes are, but ever since writing that first post, I’ve been itching to do something similar with interviewing.io’s data. For context, interviewing.io is a platform where people can practice technical interviewing anonymously and, in the process, find jobs — do well in practice, and you get guaranteed (and anonymous!) technical interviews at companies like Uber, Twitch, Lyft, and more. Over the course of our existence, we’ve amassed performance data from thousands of real and practice interviews. Data from these interviews sets us up nicely to look at what signals from an interviewee’s background might matter when it comes to performance.

As often happens, what we found was surprising, and some of it runs counter to things I’ve said and written on the subject. More on that in a bit.

The setup

When an interviewer and an interviewee match on our platform, they meet in a collaborative coding environment with voice, text chat, and a whiteboard and jump right into a technical question. Interview questions on the platform tend to fall into the category of what you’d encounter at a phone screen for a back-end software engineering role, and interviewers typically come from a mix of large companies like Google, Facebook, and Uber, as well as engineering-focused startups like Asana, Mattermark, KeepSafe, and more.

After every interview, interviewers rate interviewees on a few different dimensions, including technical ability. Technical ability gets rated on a scale of 1 to 4, where 1 is “poor” and 4 is “amazing!”. On our platform, a score of 3 or above has generally meant that the person was good enough to move forward. You can see what our feedback form looks like below:

[Screenshot: the interviewer feedback form, with the technical ability score circled in red]

The results

To run the analysis for this post, we cross-referenced interviewees’ average technical scores (circled in red in the feedback form above) with the attributes below to see which ones mattered most. Here’s the full attribute list[1]:

  • Attended a top computer science school
  • Worked at a top company
  • Took classes on Udacity/Coursera[2]
  • Founded a startup
  • Master’s degree
  • Years of experience

Of all of these, only three attributes emerged as statistically significant: top school, top company, and classes on Udacity/Coursera. Apparently, as the fine gentlemen of Metallica once said, nothing else matters. In the graph below, you can see the effect size of each of the significant attributes (attributes that didn’t achieve significance don’t have bars).
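To make “statistically significant” concrete, here’s a minimal, self-contained sketch of the kind of comparison involved: Welch’s t-test on the average technical scores of two cohorts that differ in one binary attribute. The numbers below are fabricated for illustration; they are not our data.

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t-statistic for two independent samples with unequal variances."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    se = math.sqrt(va / na + vb / nb)                # standard error of the difference
    return (mean(sample_a) - mean(sample_b)) / se

# Hypothetical average technical scores (1-4 scale) for two cohorts,
# e.g. "has attribute X" vs. "does not"
with_attr    = [3.2, 3.5, 3.1, 3.8, 3.4, 3.6, 3.0, 3.7]
without_attr = [2.8, 3.0, 2.6, 3.1, 2.9, 2.7, 3.2, 2.5]

t = welch_t(with_attr, without_attr)
```

A |t| well above ~2 (with these sample sizes) is roughly what “significant” means here; in practice you’d compute a p-value and, with six attributes tested, correct for multiple comparisons.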

As I said at the outset, these results were quite surprising, and I’ll take a stab at explaining each of the outcomes below.

Top school & top company

Going into this, I expected top company to matter but not top school. The company thing makes sense — you’re selecting people who’ve successfully been through at least one interview gauntlet, so the odds of them succeeding at future ones should be higher.

Top school is a bit muddier, and it was indeed the least impactful of the significant attributes. Why did schooling matter in this iteration of the data but not when I was looking at resumes? I expect the answer lies in the disparity between performance in an isolated technical phone screen versus what happens when a candidate actually goes on site. With the right preparation, the technical phone interview is manageable, and top schools often have rigorous algorithms classes and a culture of preparing for technical phone screens (to see why this culture matters and how it might create an unfair advantage for those immersed in it, see my post about how we need to rethink the technical interview). Whether passing an algorithmic technical phone screen means you’re a great engineer is another matter entirely and hopefully the subject of a future post.

Udacity/Coursera

MOOC participation (Udacity and Coursera in particular, as those were the ones interviewing.io users gravitated to most) mattering as much as it did (and mattering way more than pedigree) was probably the most surprising finding here, and so it merited some additional digging.

In particular, I was curious about the interplay between MOOCs and top schools, so I partitioned MOOC participants into people who had attended top schools vs. people who hadn’t. When I did that, something startling emerged. For people who attended top schools, completing Udacity or Coursera courses didn’t appear to matter. However, for people who did not, the effect was huge, so huge, in fact, that it dominated the board. Moreover, interviewees who attended top schools performed significantly worse than interviewees who had not attended top schools but HAD taken a Udacity or Coursera course.
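The partitioning above amounts to grouping interviewees by (top school, MOOC) status and comparing cohort means. Here’s a toy sketch with fabricated records (not our dataset) showing the shape of that analysis:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical records: (attended_top_school, took_mooc, avg_technical_score on 1-4 scale)
interviews = [
    (True,  False, 3.1), (True,  True,  3.2), (True,  False, 3.0),
    (False, True,  3.6), (False, True,  3.5), (False, False, 2.7),
    (False, True,  3.7), (True,  True,  3.1), (False, False, 2.8),
]

# Partition scores into the four (top_school, mooc) cohorts
cohorts = defaultdict(list)
for top_school, mooc, score in interviews:
    cohorts[(top_school, mooc)].append(score)

cohort_means = {key: mean(scores) for key, scores in cohorts.items()}
```

In this made-up sample, as in the finding described above, the (no top school, MOOC) cohort comes out ahead of both the (top school, MOOC) and (no top school, no MOOC) cohorts.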

So, what does this mean? Of course (as you’re probably thinking to yourself while you read this), correlation doesn’t imply causation. As such, rather than MOOCs being a magic pill, I expect that people who gravitate toward online courses (and especially those who might have a chip on their shoulder about their undergrad pedigree and end up drinking from the MOOC firehose) already tend to be abnormally driven. But, even with that, I’d be hard-pressed to say that completing great online CS classes isn’t going to help you become a better interviewee, especially if you didn’t have the benefit of a rigorous algorithms class up until then. Indeed, a lot of the courses we saw people take focused on algorithms, so it’s no surprise that supplementing your preparation with courses like this could be tremendously useful. Some of the most popular courses we saw were:

Udacity
Design of Computer Programs
Intro to Algorithms
Computability, Complexity & Algorithms

Coursera
Algorithms Specialization
Functional Programming Principles in Scala
Machine Learning
Algorithms on Graphs

Founder status

Having been a founder didn’t matter at all when it came to technical interview performance. This, too, isn’t that surprising. The things that make one a good founder are not necessarily the things that make one a good engineer, and if you just came out of running a startup and are looking to get back into an individual contributor role, odds are, your interview skills will be a bit rusty. This is, of course, true of folks who’ve been in industry but out of interviewing for some time, as you’ll see below.

Master’s degree & years of experience

No surprises here. I’ve ranted quite a bit about the disutility of master’s degrees, so I won’t belabor the point.

Years of experience, too, shouldn’t be that surprising. For context, our average user has about 5 years of experience, with most having between 2 and 10. I think we’ve all anecdotally observed that the time spent away from your schooling doesn’t do you any favors when it comes to interview prep. You can see a scatter plot of interview performance vs. years of experience below as well as my attempt to fit a line through it (as you can see, the R^2 is piss poor, meaning that there isn’t a relationship to speak of).
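For the curious, fitting that line and computing R² is just ordinary least squares. Here’s a self-contained sketch with made-up (years of experience, score) pairs that, like our data, have essentially no trend, so the slope and R² both come out near zero:

```python
from statistics import mean

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (intercept, slope, r_squared)."""
    x_bar, y_bar = mean(xs), mean(ys)
    sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    sxx = sum((x - x_bar) ** 2 for x in xs)
    b = sxy / sxx                      # slope
    a = y_bar - b * x_bar              # intercept
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - y_bar) ** 2 for y in ys)
    return a, b, 1 - ss_res / ss_tot   # R^2: fraction of variance explained

# Hypothetical (years of experience, avg technical score) pairs with no real trend
years  = [0, 1, 2, 3, 4, 5, 6, 7, 8, 10]
scores = [2.9, 3.4, 2.6, 3.1, 3.5, 2.8, 3.2, 2.7, 3.3, 3.0]

a, b, r2 = fit_line(years, scores)
```

An R² this close to zero means the fitted line explains almost none of the score variance, which is exactly the “no relationship to speak of” situation described above.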

Closing thoughts

If you know me, or even if you’ve read some of my writing, you know that, in the past, I’ve been quite loudly opposed to the concept of pedigree as a useful hiring signal. With that in mind, I feel I owe it to you to acknowledge, up front, that what we found this time runs counter to my stance. But that’s the whole point, isn’t it? You live, you get some data, you make some graphs, you learn, you make new graphs, and you adjust. Even with this new data, I’m excited to see that what mattered way more than pedigree was the actions people took to better themselves (in this case, rounding out their existing knowledge with MOOCs), regardless of their background.

Most importantly, these findings have done nothing to change interviewing.io’s core mission. We’re creating an efficient and meritocratic way for candidates and companies to find each other, and as long as you can code, we couldn’t care less about who you are or where you come from. In our ideal world, all these conversations about which proxies matter more than others would be moot because coding ability would stand for, well, coding ability. And that’s the world we’re building.

Thanks to Roman Rivilis for his help with data annotation for this post.

[1] For fun, we tried relating browser and operating system choice to interview performance, (smugly) expecting Chrome users to dominate. Not so. Browser choice didn’t matter, nor did what OS people used while interviewing.

[2] We got this data from looking at interviewees’ LinkedIn profiles.


Comments
  • Peter (6:42 am, January 25, 2017):

    You have grammatical error in this sentence “meritocratic way to candidates and companies to find each other”

  • Jeff (11:31 am, January 13, 2017):

    Why did you leave the other two questions out of the analysis (problem solving and communication)? Those are important aspects to being able to do a job well. It would be interesting to see how they correlate to the experiences of candidates.

  • Danny (11:07 pm, January 7, 2017):

I guess my takeaway is similar to Marc’s. If experience doesn’t matter when comparing results of a technical test, maybe the technical test is not accurately testing real-life challenges. Common sense and anecdotal proof would clearly show that having experience makes a better developer. I think everyone would agree that being able to solve computer science riddles and being successful in a company are two very different things. I guess the impossible challenge is figuring out how to test for grit, creativity, and humility in addition to not just knowing technical skills, but how to use them effectively.

  • Eliot (9:40 pm, December 30, 2016):

    Excellent article! Thank you for compiling and presenting this data. You specifically mention Coursera and Udacity. Do you think that edX MOOCs would yield similar results?

  • Thomas (10:31 pm, December 29, 2016):

    I’m not sure what this part means, “With that in mind, I feel like I owe clearly acknowledge, up front, that we found this time…”

  • Jack Lingwood (5:11 pm, December 29, 2016):

    Is there any business relationship between interviewing.io and/or the author of this blog and Coursera/Udacity ?

      • Aline Lerner (10:32 pm, December 29, 2016), in reply:

      There is not. I used to run hiring for Udacity before I started interviewing.io though.

  • John (12:00 pm, December 29, 2016):

    But surely, people who were using IE must have had a worse score? Did you check that?

  • Mario Hesles (11:41 am, December 29, 2016):

    ‘…these interviews sets up us nicely…’

    should read:

    ‘…these interviews sets us up nicely…’

  • AV (10:23 am, December 29, 2016):

You should not believe your straight line because it’s not real. You should fit a Gaussian instead, because it can be seen with the bare eye that people with zero experience fail most.

  • Brent Baisley (10:19 am, December 29, 2016):

    My impression is that people interview for what is taught in Udacity/Coursera. Thus the strong connection. Most people don’t know how to interview for experience. I’d be curious what the average years of experience of the interviewer is. Does the interviewer lack interviewing experience and thus asks Udacity/Coursera type questions?
    A company should care about who you are (not where you came from). If who you are doesn’t matter, then you’re just a coder and not contributing to company culture. Can you code is just one aspect, not the only one that needs to be evaluated.
    I love that you are strongly addressing the coding aspect.

  • Johnny (9:13 am, December 29, 2016):

    “how what you do”: Is that even English?

  • Josh Suich (7:43 am, December 29, 2016):

    Great article! As a graduate of a 9 week coding program (Dev Bootcamp, Chicago – Fall ’13) I directly observed that my fellow classmates almost universally possessed motivation/drive, discipline, a vision to improve career path, and the attitude of humility and curiosity as lifetime-learners. I’d be curious to see how attendance at a program like Dev Bootcamp or Code Academy factors into these data sets.

    One technical note, however; please revise this sentence: “With that in mind, I feel like I owe clearly acknowledge, up front, that we found this time runs counter to my stance.”

  • Sreejith (4:45 am, December 29, 2016):

    I’d say you missed adding open source contributions in the mix. I know from a common sense POV that open source contributions speak a lot about the candidate and his capacity to do great work. But, is it true in your data?

  • Brendan Hart (3:53 am, December 29, 2016):

    The article text is extremely difficult to read. Light grey on white is not a good contrast choice.

    Please consider making accessibility for all users a priority in your design decisions.

    webaim.org/resources/contrastchecker/

  • Erlend (2:15 am, December 29, 2016):

    Can we really conclude from the bottom graph that past experience is worthless? Or should we conclude that the interviewing.io process didn’t at all reflect the candidate’s experience?

      • Jeff (3:09 pm, January 2, 2017), in reply:

      My thoughts, exactly. I can understand that years of experience may not be a “silver bullet” indicator of somebody’s “technical ability”. But it’s really hard to believe that it doesn’t count for anything. This leads me to believe that the definition of “technical ability” is flawed here.

  • Pawan (10:35 pm, December 28, 2016):

    In our hiring process, we always ask candidates what their area of interest is, what projects have they done outside work, the last tech book they read or last programming language /tech skill they learnt. MOOC makes this concept more formal and it should because there are so many terrific courses on Udacity/Coursera etc that they are as good or better than the best text available on the subject.

  • Amit N (8:05 pm, December 28, 2016):

    Great write up!! Curious though what’s the precise definition here of a top computer science school and top company?

  • Andrew (7:46 pm, December 28, 2016):

    Your conclusion that “what mattered way more than pedigree was the actions people took to better themselves (in this case, rounding out their existing knowledge with MOOCs), *regardless of their background*.” seems to explicitly contradict what you say earlier, that “For people who attended top schools, completing Udacity or Coursera courses didn’t appear to matter.”

    Rather, that seems to indicate that the value of MOOC’s (at least in this context) is very much dependent on background.

  • Marc (6:45 pm, December 28, 2016):

    >observed that the time spent away from your schooling doesn’t do you any favors when it comes to interview prep
    This is a sad truth. Companies end up hiring bad workers, but ones with good memory that remember algos by hearth. It creates a non-autonomous work force with sky rocking PEP quantity. Recently went back to look for a job after working 5 years, was asked a ton of questions about general algos, analyse them, compare them etc. Nothing about what I built or done in the past. For sure a fresh grad looked better than me, until you ask him to create/build something by himself, alone. The fresh grad is lost. Where to start? What is a requirement? How do you validate? You have to tell them each step one after another. In the end when you have 10 of those guys it creates a code review & administrative burden where instead of working, you take care of the kids.

  • Mark Bullock (6:27 pm, December 28, 2016):

    What is a top company?

  • Amy (1:38 pm, December 28, 2016):

    Thanks for writing this! I also have the opinion that a college pedigree isn’t a great hiring indicator, but you hit the nail on the head. Hiring practices and interview questions tend to be biased towards those who have the degrees and have been trained to solve algorithm questions. Until there’s a large shift in trend to ask more practical questions relevant to the job, I think that this is an unfortunate correlation you will see.

  • ActualEngineer (1:28 pm, December 28, 2016):

    This shows me that you are not interviewing for technical skill but for cargo cult fit.

  • Neal Fultz (1:05 pm, December 28, 2016):

    This sounds like an interesting data set; would you be willing to share an anonymized CSV?

  • Jason (12:51 pm, December 28, 2016):

    Interested from an interviewer’s perspective (we’re in a hiring kick right now):

    How did you control for job title in the interview? Specifically that a Jr. Developer w/ 0 years of experience might have gotten a 4* rating on a junior level technical screen, where a dev lead with 10 years experience might have scored low comparatively to his or her peers. Or perhaps a dev management role had a request to include the tech screen so the powers that be can assess all strengths and weaknesses?

    Also, it feels like there’s a bit of subjective nature of “solid developer, would interview again” type responses that have little meaning in the comparison of a developer fresh out of college to a seasoned developer? Some interviewers may have less rigorous standards than others and depending on how you weighted the data that could be open to interpretation as well.

    Maybe I’m missing something in the data analysis that addressed this – it doesn’t feel wrong, just inconclusive? While I subjectively and anecdotally agree with the results, I don’t know that I feel comfortable using the data as a supporting reference to why I feel that way.

  • Aaron (12:35 pm, December 28, 2016):

    I work with someone who is a terrible coder, contributes next to nothing, and spends most of his days taking coursera courses. I hope he interviews with you, I’m sure he will do amazing.
