interviewing.io blog
better interviewing through data

We looked at how a thousand college students performed in technical interviews to see if where they went to school mattered. It didn’t.

Samantha Jordan


interviewing.io is a platform where engineers practice technical interviewing anonymously. If things go well, they can unlock the ability to participate in real, still anonymous, interviews with top companies like Twitch, Lyft and more. Earlier this year, we launched an offering specifically for university students, with the intent of helping level the playing field right at the start of people’s careers. The sad truth is that with the state of college recruiting today, if you don’t attend one of very few top schools, your chances of interacting with companies on campus are slim. It’s not fair, and it sucks, but university recruiting is still dominated by career fairs. Companies pragmatically choose to visit the same few schools every year, and despite the career fair being one of the most antiquated, biased forms of recruiting that there is, the format persists, likely due to the fact that there doesn’t seem to be a better way to quickly connect with students at scale. So, despite the increasingly loud conversation about diversity, campus recruiting marches on, and companies keep doing the same thing expecting different results.

In a previous blog post, we explained why companies should stop courting students from the same five schools. Regardless of your opinion on how important that idea is (for altruistic reasons, perhaps), you may have been left skeptical about the value and practicality of broadening the college recruiting effort, and you probably concede that it's rational to visit top schools, given limited resources — while people are often willing to agree that there are perfectly qualified students coming out of non-top colleges, they maintain that such students are relatively rare. We're here to show you, with some nifty data from our university platform, that this is not true.

To be fair, this isn't the first time we've looked at whether where you went to school matters. In a previous post, we found that taking Udacity and Coursera programming classes mattered way more than where you went to school. And way back when, one of our founders figured out that where you went to school didn't matter at all but that the number of typos and grammatical errors on your resume did. So, what's different this time? The big, exciting thing is that these prior analyses were focused mostly on engineers who had been working for at least a few years already, making it possible to argue that a few years of work experience smooths out any performance disparity that comes from having attended (or not attended) a top school. In fact, the good people at Google found that while GPA didn't really matter after a few years of work, it did matter for college students. So, we wanted to face this question head-on and look specifically at college juniors and seniors while they're still in school. Even more pragmatically, we wanted to see whether companies limiting their hiring efforts to just top schools are actually getting a higher caliber of candidate.

Before delving into the numbers, here’s a quick rundown of how our university platform works and the data we collect.

The setup

For students who want to practice on interviewing.io, the first step is a brief (~15-minute) coding assessment on Qualified to test basic programming competency. Students who pass this assessment, i.e. those who are ready to code while another human being breathes down their neck, get to start booking practice interviews.

When an interviewer and an interviewee match on our platform, they meet in a collaborative coding environment with voice, text chat, and a whiteboard and jump right into a technical question. Interview questions on the platform tend to fall into the category of what you’d encounter at a phone screen for a back-end software engineering role, and interviewers typically come from top companies like Google, Facebook, Dropbox, Airbnb, and more.

After every interview, interviewers rate interviewees on a few different dimensions, including technical ability. Technical ability gets rated on a scale of 1 to 4, where 1 is “poor” and 4 is “amazing!”. On our platform, a score of 3 or above has generally meant that the person was good enough to move forward. You can see what our feedback form looks like below:

[Screenshot: interviewer feedback form, with the technical ability rating circled]

On our platform, we’re fortunate to have thousands of students from all over the U.S., spanning over 200 universities. We thought this presented a unique opportunity to look at the relationship between school tier and interview performance for both juniors (interns) and seniors (new grads). To study this relationship, we first split schools into the following four tiers, based on rankings from U.S. News & World Report:

  • “Elite” schools (e.g. MIT, Stanford, Carnegie Mellon, UC-Berkeley)
  • Top 15 schools (not including the elite tier, e.g. University of Wisconsin, Cornell, Columbia)
  • Top 50 schools (not including top 15, e.g. Ohio State University, NYU, Arizona State University)
  • The rest (e.g. Michigan State, Vanderbilt University, Northeastern University, UC-Santa Barbara)
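As a rough illustration, the bucketing above amounts to a simple lookup. The sketch below is hypothetical (the tier labels and the `tier_of` helper are ours, not part of the actual pipeline), and it only includes the example schools named in the list; the real mapping covered 200+ universities.

```python
# Hypothetical tier lookup for the four school tiers described above.
# Only the example schools from the post are included here.
TIERS = {
    "MIT": "elite", "Stanford": "elite",
    "Carnegie Mellon": "elite", "UC-Berkeley": "elite",
    "University of Wisconsin": "top_15", "Cornell": "top_15",
    "Columbia": "top_15",
    "Ohio State University": "top_50", "NYU": "top_50",
    "Arizona State University": "top_50",
}

def tier_of(school: str) -> str:
    # Anything outside the first three tiers falls into "the rest".
    return TIERS.get(school, "the_rest")

print(tier_of("Stanford"))        # elite
print(tier_of("Michigan State"))  # the_rest
```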

Then, we ran some statistical significance testing on interview scores vs. school tier to see if school tier mattered, for both interns (college juniors) and new grads (college seniors), comprising a set of roughly 1000 students.
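The post doesn't name the exact test used, but as an illustration, here is a minimal, self-contained sketch of one such check: a chi-square test of independence between school tier and passing (a technical score of 3 or above). The data below is synthetic, drawn from the same distribution for every tier to mirror the "no difference" finding.

```python
import random

random.seed(0)
tiers = ["elite", "top_15", "top_50", "the_rest"]
# Synthetic 1-4 scores: every tier drawn from the same distribution,
# purely to illustrate the shape of the analysis.
scores = {t: [random.choice([1, 2, 3, 3, 4]) for _ in range(250)]
          for t in tiers}

# Build a 4x2 contingency table: pass (score >= 3) vs. fail, per tier.
observed = [[sum(s >= 3 for s in scores[t]), sum(s < 3 for s in scores[t])]
            for t in tiers]

total = sum(sum(row) for row in observed)
col_totals = [sum(row[j] for row in observed) for j in range(2)]
chi2 = 0.0
for row in observed:
    row_total = sum(row)
    for j in range(2):
        expected = row_total * col_totals[j] / total
        chi2 += (row[j] - expected) ** 2 / expected

# With (4-1)*(2-1) = 3 degrees of freedom, the 5% critical value is 7.815;
# a statistic below that means we can't reject "tier doesn't matter".
print(f"chi-square = {chi2:.2f} (5% critical value, df = 3: 7.815)")
```

In the actual analysis one would of course use the real scores per tier; a rank-based test such as Kruskal-Wallis would also be a reasonable choice for ordinal 1-4 data.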

Does school have anything to do with interview performance?

In the graphs below, you can see technical score distributions for interviews with students in each of the four school tiers. As you'll recall from above, each interview is scored on a scale of 1 to 4, where 1 is the worst and 4 is the best.

First, the college juniors…

[Chart: technical score distributions for interns (college juniors), by school tier]

And then, the seniors…

[Chart: technical score distributions for new grads (college seniors), by school tier]

What’s pretty startling is that the shape of these distributions, for both juniors and seniors, is remarkably similar. Indeed, statistical significance testing revealed no difference between students of any tier when it came to interview performance.1 What this means is that top-tier students are achieving the same results as those in no-name schools. So the question becomes: if the students are comparable in skill, why are companies spending egregious amounts of money attracting only a subset of them?

Okay, so what are companies missing?

Besides missing out on great, cheaper-to-acquire future employees, companies are passing up an opportunity to save time and money. Right now, a ridiculous amount of money is being spent on university recruiting. We've previously cited the $18k price tag just for entry to the MIT career fair. In a study by Lauren Rivera, reported in the Harvard Business Review, one firm budgeted nearly $1M just for social recruiting events on a single campus.

The high price tag of these events also means it makes even less sense for smaller companies or startups to try to compete with high-profile, high-profit tech giants. Most of the top schools being heavily pursued already have plenty of recruiters vying for their students. Unwittingly, this pursuit seems to run contrary to most companies' desires for high diversity and long-term sustainable growth.

Even when companies do believe talent is evenly distributed across school tiers, there are still reasons why they might recruit at top schools. Other factors help elevate certain schools in a recruiter's mind. There are long-standing company-school relationships (for example, the number of alumni who currently work at the company). There are signaling effects, too — companies get Silicon Valley bonus points by saying their eng team is composed of a bunch of ex-Stanford, ex-MIT students.

So what can companies do?

As such, companies may never stop recruiting at top-tier schools entirely, but they ought to at least include schools outside that very small circle in their search for future employees. The takeaway from the data is clear: for good engineers, school matters a lot less than we think. The time and money that companies spend competing for candidates from the same select few schools would be better spent creating opportunities that include everyone, as well as developing tools to vet students more fairly and efficiently.

As you saw above, we used a 15-minute coding assessment to cull our inbound student flow, and just a short challenge leveled the playing field between students from all walks of life. At the very least, we’d recommend employers do the same thing in their process. But, of course, we’d be remiss if we didn’t suggest one other thing.

At interviewing.io, we've proudly built a platform that grants the best-performing students access to top employers, no matter where they went to school or where they come from. Our university program, in particular, lets companies reach a dramatically larger pool of students for the cost of attending one or two career fairs at top target schools. Want diverse, top talent without the chase? Sign up to be an employer on our university platform!

1 Of course, this hinges on everyone completing a quick 15-minute coding challenge first, to ensure they're ready for synchronous technical interviews. We're excited about this because companies can replicate this step in their process as well!

Comments
  • Noone
    Posted on 4:07 am February 14, 2018.

    Your graphs need to have the same Y axis scale or they look really misleading.

  • David Hancock
    Posted on 3:12 am February 14, 2018.

    When comparing graphs, try to avoid varying the vertical axis – the 40% line moves up and down between graphs.

  • Simone
    Posted on 1:46 am February 14, 2018.

    Interesting.
    But please, charts should be consistent in scale; otherwise they are no more useful than plain data in tables.

  • Jordan
    Posted on 11:08 pm February 13, 2018.

    I was one of these students, and I would like to comment that the interview heavily covered standard data structures (freshman/sophomore courses in my program), like BSTs, hashtables, stacks, and queues. Potentially algorithms like Dijkstra's? The fact that my "elite" education covered these data structures in freshman courses, before shifting into heavy big-O/theta/omega analysis and proofs of correctness/language analysis in sophomore/junior year is somewhat overlooked. Interesting data structures like splay trees, red-black trees, augmented data structures for different cost-lookup performance, graph algorithms like Borůvka's and star contraction, convex optimizations, are part of a long list of more advanced skills. It's debatable how useful any of these things are in industry, because maybe the only thing that matters is project experience and memorization of Java standards/shell commands. I don't think it's surprising that everyone looks the same when tested on the things that everyone should know. I wouldn't be surprised to learn that elite institutions are overdrawn, but when I take a math class with a visiting professor who informs us that our "standard" class covered 2 semesters of number theory relative to Purdue and UNC, you have to wonder whether the students who don't go out on the weekends because Saturday and Sunday are homework days aren't learning anything extra.

  • Tommy
    Posted on 8:11 pm February 13, 2018.

    Have you considered clustering schools based on scores to redefine what category a school belongs to?

  • Steve Bennett
    Posted on 8:03 pm February 13, 2018.

    There's an obvious alternative hypothesis here: that the technical interviews aren't generating meaningful data. They're randomly assigning scores of 1-4 in ways that don't correlate with actual skill, and hence don't correlate with what school you went to.

    (Not saying this is true, but it needs to be disproven.)
