
Impostor syndrome strikes men just as hard as women… and other findings from thousands of technical interviews

Posted on October 30th, 2018.

The modern technical interview is a rite of passage for software engineers and (hopefully!) the precursor to a great job. But it’s also a huge source of stress and endless questions for new candidates. Just searching “how do I prepare for a technical interview” turns up millions of Medium posts, coding bootcamp blogs, Quora discussions, and entire books.

Despite all this conversation, people struggle to know how they’re even doing in interviews. In a previous post, we found that a surprisingly large number of interviewing.io’s users consistently underestimate their performance, making them more likely to drop out of the process and ultimately harder to hire. Now, and with considerably more data (over 10k interviews led by real software engineers!), we wanted to go deeper: what seems to make candidates worse at gauging their own performance?

We know some general facts that make accuracy a challenge: people aren’t always great at assessing or even remembering their performance on difficult cognitive tasks like writing code.1 Technical interviews can be particularly hard to judge if candidates don’t have much experience with questions with no single right answer. Since many companies don’t share any kind of detailed post-interview feedback (beyond a yes/no) with candidates for liability reasons, many folks never get any sense of how they did, what they did well, or what could have been better.2, 3 Indeed, pulling back the curtain on interviewing, across the industry, was one of the primary motivators for building interviewing.io!

But to our knowledge, there's little data out there looking specifically at how people feel after real interviews on this scale, across different companies, so we gathered it ourselves, giving us the ability to test interesting industry assumptions about engineers and coding confidence.

One big factor we were interested in was impostor syndrome. Impostor syndrome resonates with a lot of engineers,4 indicating that many wonder whether they truly match up to colleagues and discount even strong evidence of competence as a fluke. Impostor syndrome can make us wonder whether we can count on the positive performance feedback that we’re getting, and how much our opportunities have come from our own effort, versus luck. Of particular interest to us was whether this would show up for women on our platform. There’s a lot of research evidence that candidates from underrepresented backgrounds experience a greater lack of belonging that feeds impostor syndrome,5 and this could show up as inaccuracy about judging your own interview performance.

The setup

interviewing.io is a platform where people can practice technical interviewing anonymously, and if things go well, get jobs at top companies in the process. We started it because resumes suck and because we believe that anyone, regardless of how they look on paper, should have the opportunity to prove their mettle.

When an interviewer and an interviewee match on interviewing.io, they meet in a collaborative coding environment with voice, text chat, and a whiteboard and jump right into a technical question (feel free to watch this process in action on our interview recordings page).  After each interview, people leave one another feedback, and each party can see what the other person said about them once they both submit their reviews.

Here’s an example of an interviewer feedback form:

Feedback form for interviewers

Immediately after the interview, candidates answer a question about how well they think they did, on the same 1-4 scale interviewers use:

Feedback form for interviewees

For this post, we looked at over 10k technical interviews led by real software engineers from top companies. In each interview, a candidate was rated by an interviewer on their problem-solving ability, technical ability, and communication skills, as well as whether the interviewer would advance them to the next round. This gave us a measure of how different someone’s self-rating was from the rating that the interviewer actually gave them, and in which direction. In other words, how skewed was their estimation from their true performance?
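To make the skew measure concrete, here's a minimal sketch of how it could be computed from feedback records. The 1-4 scale comes from this post; the pandas layout and column names are illustrative assumptions, not interviewing.io's actual pipeline.

```python
import pandas as pd

# Hypothetical feedback records; both ratings use the 1-4 scale described above.
interviews = pd.DataFrame({
    "interview_id": [1, 2, 3, 4],
    "self_rating": [2, 3, 4, 1],         # what the candidate thought they scored
    "interviewer_rating": [3, 3, 3, 2],  # what the interviewer actually gave
})

# Skew: 0 means the self-rating matched the interviewer's score, negative means
# the candidate underestimated themselves, positive means they overestimated.
interviews["skew"] = interviews["self_rating"] - interviews["interviewer_rating"]
print(interviews[["interview_id", "skew"]])
```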

Going in, we had some hunches about what might matter:

  • Gender. Would women be harder on their coding performance than men?
  • Having been an interviewer before. It seems reasonable that having been on the other side will pull back the curtain on interviews.
  • Being employed at a top company. Similar to above.
  • Being a top-performing interviewee on interviewing.io — people who are better interviewees overall might have more confidence and awareness of when they’ve gotten things right (or wrong!)
  • Being in the Bay Area or not. Since tech is still so geographically centered on the Bay Area, we considered that folks who live in a more engineering-saturated culture could have greater familiarity with professional norms around interviews.
  • Within the interview itself, question quality and interviewer quality. Presumably, a better interviewer is also a better communicator, whereas a confusing interviewer might throw off a candidate's entire assessment of their performance. We also looked at whether it was a practice interview or an interview for a specific company role.
  • For some candidates, we could also look at a few measures of their personal brand within the industry, like their number of GitHub and Twitter followers. Maybe people with a strong online presence are more sure of themselves when they interview?

So what did we find?

Women are just as accurate as men at assessing their technical ability

Contrary to expectations around gender and confidence, we didn't find a reliable, statistically significant gender difference in accuracy. At first, it looked like female candidates were more likely to underestimate their performance, but when we controlled for other variables, like experience and rated technical ability, the key differentiator turned out to be experience. More experienced engineers are more accurate about their interview performance, and men are more likely to be experienced engineers, but experienced female engineers are just as accurate about their technical ability as their male counterparts.

Based on previous research, we hypothesized that impostor syndrome and a greater lack of belonging could result in female candidates penalizing their interview performance, but we didn't find that pattern.6 However, our finding echoes a research project from the Stanford Clayman Institute for Gender Research, which looked at 1,795 mid-level tech workers from high tech companies. They found that women in tech aren't necessarily less accurate when assessing their own abilities, but do have significantly different ideas about what success requires (e.g., long working hours and risk-taking). In other words, women in tech may not doubt their own abilities but might have different ideas about what's expected. A survey from Harvard Business Review that asked over a thousand professionals about their job application decisions made the same point: gender gaps in evaluation scenarios may have more to do with differing expectations about how scenarios like interviews are judged.

That said, we did find one interesting difference: women went through fewer practice interviews overall than men did. The difference was small but statistically significant, and harkens back to our earlier finding that women leave interviewing.io roughly 7 times as often as men do, after a bad interview.

But in that same earlier post, we also found that masking voices didn’t impact interview outcomes. This whole cluster of findings affirms what we suspected and what the folks doing in-depth studies of gender in tech have found: it’s complicated. Women’s lack of persistence in interviews can’t be explained only by impostor syndrome about their own abilities, but it’s still likely that they’re interpreting negative feedback more severely and making different assumptions about interviews.

Here’s the distribution of accuracy distance for both female and male candidates on our platform (zero indicates a rating that matches the interviewer’s score, negative values indicate an underestimated score, and positive values indicate an overestimated one). The two groups look pretty much identical:

Accuracy by gender

What else didn’t matter?

Another surprise: having been an interviewer didn’t help. Even people who had been interviewers themselves didn’t seem to get an accuracy boost from that experience. Personal brand was another non-finding. People with more GitHub followers weren’t more accurate than people with few to no GitHub followers. Nor did interviewer rating matter (i.e., how well an interviewer was reviewed by their candidates), although to be fair, interviewers are generally rated quite highly on the site.

So what was a statistically significant boost to accurate judgments of interview performance? Mostly, experience.

Experienced engineers have a better sense of how well they did in interviews than engineers earlier in their careers.7 But it doesn’t seem to be simply that better coders are better at gauging their interview performance. There is a small lift from skill, with higher-rated engineers being somewhat more accurate, but even top-performing junior candidates struggled to accurately assess their performance.8

experienced versus juniors

Our data mirrors a trend seen in Stack Overflow’s 2018 Developer survey. They asked respondents several questions about confidence and competition with other developers, and noted that more experienced engineers feel less competitive and more confident.9 This isn’t necessarily surprising: experience is correlated with skill level, after all, and highly skilled people are likely to be more confident. But our analysis let us control for performance and code skill within career groups, and we still found that experienced engineers were better at predicting their interview scores. There are probably multiple factors here: experienced engineers have been through more interviews, have led interviews themselves, and have a stronger sense of belonging, all of which may combat impostor syndrome.

Insider knowledge and context also seem to help: being in the Bay Area and working at a top company both came with small but statistically significant lifts in accuracy. Like experienced engineers, people who are more likely to have contextual industry knowledge are more accurate. However, the lift from working at a top company seems to mostly reflect overall technical ability: being at a top company is essentially a proxy for being a more experienced, higher-quality engineer.

Finally, as you get better at interviewing and move into company interviews, you do get more accurate. People were more accurate about their performance in company interviews than in practice interviews, and their overall ranking on the site also predicted improved accuracy. interviewing.io gives users an overall ranking based on their performance over multiple interviews, weighted toward more recent results, and people who scored in the top 25% were more likely to be accurate about their interview performance.
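The post doesn't spell out how that overall ranking is computed, only that it reflects performance over multiple interviews and leans toward recent ones. Purely as an illustration, a recency-weighted average could look something like the sketch below; the exponential decay and its value are assumptions, not the site's actual formula.

```python
def recency_weighted_score(scores, decay=0.8):
    """Average interview scores, weighting recent ones more heavily.

    `scores` is ordered oldest to newest; `decay` < 1 shrinks the weight of
    older interviews. The decay value here is purely illustrative.
    """
    weights = [decay ** (len(scores) - 1 - i) for i in range(len(scores))]
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

# An early rough interview counts for less than two recent strong ones:
print(recency_weighted_score([2, 3, 4]))  # ~3.15, vs. a plain mean of 3.0
```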

In general, how good are people at gauging their interview performance overall? We’ve looked at this before, with roughly a thousand interviews, and now, with ten thousand, the finding continues to hold up. Candidates were accurate about how they did in only 46% of interviews and underestimated themselves in 35% of interviews (the remaining 19%, of course, are the overestimators). Still, candidates are generally on the right track — it’s not like people who score a 4 are always giving themselves a 1.10 Self-ratings are statistically significant predictors of actual interview scores (and positively correlated), but that relationship is noisy.
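For reference, here's a minimal sketch of the kind of bucketing behind those percentages; the skew values below are made up for illustration, and only the accurate/underestimated/overestimated definitions come from the post.

```python
import pandas as pd

# Hypothetical skew values (self-rating minus interviewer rating).
skews = pd.Series([0, -1, 0, 1, -1, 0, -2, 0, 1, -1])

def bucket(skew):
    if skew == 0:
        return "accurate"
    return "underestimated" if skew < 0 else "overestimated"

# Share of interviews in each bucket; across ~10k real interviews the post
# reports roughly 46% accurate, 35% underestimating, and 19% overestimating.
print(skews.apply(bucket).value_counts(normalize=True))
```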

The implications

Accurately judging your own interview performance is a skill in its own right and one that engineers need to learn from experience and context in the tech industry. But we’ve also learned that many of the assumptions we made about performance accuracy didn’t hold up to scrutiny — female engineers had just as accurate a view of their own skills as male ones, and engineers who had led more interviews or were well known on GitHub weren’t particularly better at gauging their performance.

What does this mean for the industry as a whole? First off, impostor syndrome appears to be the bleary-eyed monster that attacks across gender, and how good you are, where you are, or how famous you are doesn’t offer much protection. Seniority does help mitigate some of the pain, but impostor syndrome affects everyone, regardless of who they are or where they’re from. So, maybe it’s time for a kinder, more empathetic interviewing culture. And a culture that’s kinder to everyone, because though marginalized groups who haven’t been socialized in technical interviewing are hit the hardest by shortcomings in the interview process, no one is immune to self-doubt.

We’ve previously discussed what makes someone a good interviewer, and empathy plays a disproportionately large role. And we’ve seen that providing immediate post-interview feedback is really important for keeping candidates from dropping out. So, whether you’re motivated by kindness and ideology or cold, hard pragmatism, a bit more kindness and understanding toward your candidates is in order.

Cat Hicks, the author of this guest post, is a researcher and data scientist with a focus on learning. She’s published empirical research on learning environments, and led research on the cognitive work of engineering teams at Google and Travr.se. She holds a PhD in Psychology from UC San Diego.

1Self-assessment has been explored in a number of domains, and often used to measure learning. One important criticism is that it’s highly impacted by people’s motivation and emotional state at the time of asking. See: Sitzmann, T., Ely, K., Brown, K. G., & Bauer, K. N. (2010). Self-assessment of knowledge: A cognitive learning or affective measure?. Academy of Management Learning & Education, 9(2), 169-191.

2Designing a good technical interview is no small task on the interviewer side. For an informal discussion of this, see this post.

3For some anecdotal conversation about interview self-assessment, see this one

4E.g., this article and this one.

5Some examples of further reading in social science research:
Good, C., Rattan, A., & Dweck, C. S. (2012). Why do women opt out? Sense of belonging and women’s representation in mathematics. Journal of personality and social psychology, 102(4), 700.
Master, A., Cheryan, S., & Meltzoff, A. N. (2016). Computing whether she belongs: Stereotypes undermine girls’ interest and sense of belonging in computer science. Journal of Educational Psychology, 108(3), 424.

6One complication for our dataset is the representation of experienced female engineers: we simply didn’t have very many, which is true to the demographics of the tech industry, but also means that selection biases in the small group of experienced female engineers we do have are more likely to be present, and this isn’t the be-all and end-all of exploring for group differences. We’d like to continue looking at interviews with female participants to explore this fully.

7These effects and the previous non-findings were all explored in a linear mixed model. Significant results for the individual effects are all p < .05.
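For readers curious what that kind of model looks like in code, here's a minimal sketch using statsmodels; the file, column names, predictors, and the choice of a per-candidate random intercept are illustrative assumptions, not the exact specification behind this post.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-interview data: `skew` is self-rating minus interviewer
# rating, plus candidate-level predictors like those discussed above.
df = pd.read_csv("interview_feedback.csv")  # hypothetical file

# Fixed effects for the predictors of interest, with a random intercept per
# candidate to account for people who did multiple interviews.
model = smf.mixedlm(
    "skew ~ experience_level + gender + in_bay_area + top_company",
    data=df,
    groups=df["candidate_id"],
)
result = model.fit()
print(result.summary())  # individual effects judged significant at p < .05
```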

8Experienced engineers have an average skew of -.14; junior engineers have an average skew of -.22; new grads have an average skew of -.25.

9See also: https://insights.dice.com/2018/03/19/imposter-syndrome-tech-pros-age/

10Another wrinkle with the design behind this data is that there’s a floor and a ceiling on the scale: people who always score a 4, for example, can’t ever overrate themselves, because they’re already at the top of the scale. We dealt with this a couple of ways: by excluding people at the floor and ceiling and re-running analyses on the middle subset, and by binning skew into either accurate or not and looking at that. The findings hold up across this.
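As a rough sketch, those two robustness checks might look like this; the data and column names are made up, and only the exclude-the-extremes and bin-into-accurate/not approaches come from the footnote above.

```python
import pandas as pd

# Hypothetical ratings; both columns use the 1-4 scale.
df = pd.DataFrame({
    "interviewer_rating": [1, 2, 3, 4, 3, 2, 4, 1],
    "self_rating":        [1, 1, 3, 3, 4, 2, 4, 2],
})
df["skew"] = df["self_rating"] - df["interviewer_rating"]

# Check 1: drop interviews at the floor or ceiling, where skew is partly
# constrained by the scale, and re-run the analysis on the middle subset.
middle = df[df["interviewer_rating"].isin([2, 3])]

# Check 2: collapse skew into a binary accurate / not-accurate outcome.
df["accurate"] = (df["skew"] == 0).astype(int)
print(len(middle), df["accurate"].mean())
```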


Exactly what to say when recruiters ask you to name the first number… and other negotiation word-for-words

Posted on August 16th, 2018.

There are a lot of resources out there that talk about salary negotiation but many tend to skew a bit theoretical. In my experience, one of the hardest things about negotiating your salary is knowing what to say in tough, ambiguous situations with a power balance that’s not in your favor. What’s OK? What’s rude? What are the social norms? And so on.

Before I started interviewing.io, I worked as a software engineer, an in-house recruiter, and an agency recruiter, so I’ve literally been on all sides of the negotiating table. For the last few years, I’ve been guest-lecturing in MIT’s 6.UAT, a class about technical communication for computer science majors. Every semester, negotiation is one of the most-requested topics from students. In this post, I’m sharing the content of that lecture, which is targeted toward students but has served seasoned industry folks just as well. You’re never too young or too old to advocate for yourself.

Btw, if you don’t like reading and prefer long, rambly diatribes in front of an unsightly glass wall, I covered most of this material (and other stuff) in a webinar I did with the fine people at Udacity (where I used to run hiring) a few months ago. So, pick your poison.

Why negotiate at all, especially if I’m junior?

If you’re early in your career, you might say that negotiation isn’t worth the hassle — after all, junior roles have pretty narrow salary bands. There are a few reasons this view is short-sighted and wrong. First, though it’s pretty unlikely in the grand scheme of things, if you’re applying to a startup, there might come a magical day when your equity is worth something. This is especially true if you’re an early employee — with a good exit, a delta of a few tenths of a percent might end up being worth a down payment on a home in San Francisco.

But, let’s get real, your equity is likely worthless (except interviewing.io’s equity… that’s totes gonna be worth something), so let me give you a better, more immediate reason to learn to haggle early in your career, precisely because that’s when the stakes are low. Humans are frighteningly adaptable creatures. Scared of public speaking? Give 3 talks. The first one will be gut-wrenchingly horrific, the stuff of nightmares. Your voice will crack, you’ll mumble, and the whole time, you’ll want to vomit. The next one will be nerve-wracking. The last one will mostly be OK. And after that, you’ll be just fine. Same thing applies to approach anxiety, mathematical proofs, sex, and, you guessed it, salary negotiation!

So, make all the awkward, teeth-cringing mistakes now, while it doesn’t matter and failure will only cost you $5K or $10K a year. Because the further along you get in your career, the bigger the upside will be… and the bigger the downside will be for not negotiating. Not only will the salary bands be wider for senior roles, but as you get more senior, more of your comp comes from equity, and equity has an even wider range for negotiating. Negotiating your stock well can make 6-figure differences and beyond (especially if you apply some of these same skills to negotiating with investors over term sheets, should you ever start your own company)… so learn these skills (and fail) while you’re young, because the older you get, the harder it’s going to be to start and the more high-stakes it’s going to be.

So, below, as promised, I’ll give you a few archetypal, stress-inducing situations and what to say, word-for-word in each one. But first, let me address the elephant in the room…

Will my offer be rescinded if I try to negotiate?

As I mentioned earlier, this blog post is coming out of a lecture I give at MIT. Every semester, I start the negotiation portion of the lecture with the unshakeable refrain that no one will ever rescind your offer for negotiating. Last semester was different, though. I was just starting to feel my oats and get into my talk (the negotiation piece comes about halfway through) and smugly recited the bit about offers never being rescinded, followed by my usual caveat… “unless you act like a douche while negotiating.” Then, a hand shot up in the back of the room. Ah ha, I thought to myself, one of the non-believers. Ready to placate him, I called on the gentleman in the back.

“My offer got rescinded for negotiation.”

The class broke out into uproarious laughter. I laughed too. It was kind of funny… but it was also unnerving, and I wanted to get to the bottom of it.

“Were you a giant jerk when you negotiated?”

“Nope.” Shit, OK, what else can I come up with…

“Were you applying at a really small company with maybe one open role?” I asked, praying against hope that he’d say yes.

“Yes.”

“Thank god.”

So, there’s the one exception I’ve found so far to my blanket statement. After working with hundreds and hundreds of candidates back when I was still a recruiter, I had never heard or seen an offer get rescinded (and none of my candidates acted like douches while negotiating, thank god), until then. So, if you’re talking to a super small company with one role that closes as soon as they find someone, yes, then they might rescind the offer.

But, to be honest, and I’m not just saying this because I was wrong in front of hundreds of bloodthirsty undergrads, an early startup punishing a prospective employee for being entrepreneurial is a huge red flag to me.

OK, so, now onto the bit where I tell you exactly what to say.1

What to say when asked to name the first number

There will come a time in every job search where a recruiter will ask you about your compensation expectations. This will likely happen very early in said search, maybe even during the first call you’ll ever have with the company.

I think this is a heinous, illaudable practice fraught with value asymmetry. Companies know their salary ranges and roughly what you’re worth to them before they ever talk to you (barring phenomenal performance in interviews which kicks you into a different band). And they know what cost of living is in your area. So they already have all the info they need about you, while you have none about them or the role or even your market value. Sure, there are some extenuating circumstances where you are too expensive, e.g. you’re like an L6 at Google and are talking to an early stage startup that can only afford to pay you 100K a year in base, but honestly even in that situation, if the job is cool enough and if you have the savings, you might take it anyway.

So, basically, telling them something will only hurt you and never help you. So don’t do it. Now, here’s exactly what to say when asked to name the first number.

At this point, I don’t feel equipped to throw out a number because I’d like to find out more about the opportunity first – right now, I simply don’t have the data to be able to say something concrete. If you end up making me an offer, I would be more than happy to iterate on it if needed and figure out something that works. I also promise not to accept other offers until I have a chance to discuss them with you.

TADA!

What to say when you’re handed an exploding offer

Exploding offers, in my book, are the last bastion of the incompetent. The idea goes something like this… if we give a candidate an aggressive deadline, they’ll have less of a chance to talk to other companies. Game theory for the insipid.

Having been on the other side of the table, I know just how arbitrary offer deadlines often are. Deadlines make sense when there is a limited number of positions and applicants all come in at the same time (e.g., internships). They do not make any sense in this market, where companies are perpetually hiring, which makes any deadline an entirely artificial construct. Joel Spolsky, the creator of Trello and Stack Overflow, had something particularly biting to say on the matter of exploding offers many years ago (the full post, Exploding Offer Season, is really good):

“Here’s what you’re thinking. You’re thinking, well, that’s a good company, not my first choice, but still a good offer, and I’d hate to lose this opportunity. And you don’t know for sure if your number one choice would even hire you. So you accept the offer at your second-choice company and never go to any other interviews. And now, you lost out. You’re going to spend several years of your life in some cold dark cubicle with a crazy boss who couldn’t program a twenty out of an ATM, while some recruiter somewhere gets a $1000 bonus because she was better at negotiating than you were.”

Even in the case of internships, offer deadlines need not be as aggressive as they often are, and I’m happy to report that many college career centers have taken stands against exploding offers. Nevertheless, if you’re not a student or if your school hasn’t outlawed this vile practice, here’s exactly what to say if it ever happens to you.

I would very much appreciate having a bit more time. I’m very excited about Company X. At the same time, choosing where I work is extremely important to me. Of course, I will not drag things out, and I will continue to keep you in the loop, but I hope you can understand my desire to make as informed of a decision as possible. How about I make a decision by…?

The reverse used car salesman… or what to say to always get more

At the end of the day, the best way to get more money is to have other offers. I know, I know, interviewing sucks and is a giant gauntlet-slog, but in many cases, having just one other offer (so, I don’t know, spending a few extra days of your time spread over a few weeks) can get you at least $10K extra. It’s a pretty rational, clear-cut argument for biting the slog-bullet and doing a few more interviews.

One anecdote I’ll share on the subject goes like this. A few years ago, a close friend of mine who’s notoriously bad at negotiation and hates it with a passion was interviewing at one of the big 4 companies. I was trying to talk him into getting out there just a little bit, for the love of god, and talking to at least one more company. I ended up introducing him to a mid-sized startup, where he quickly got an onsite interview. Just mentioning to his recruiter at the bigco that he had an onsite at this company got him an extra $5K in his signing bonus.

Offers are, of course, better than onsites, but in a pinch, even onsites will do… because every onsite increases your odds of not accepting the offer from the company you’re negotiating with. So, let’s say you do have some offers. Do you reveal the details?

The answer is that it depends. If the cash portion of your other offers is worth more than the offer you’re negotiating, then reveal the details. If they’re worth more in total but less in cash, it’s a bit dicier, because equity at smaller companies is kind of worthless… you can still use it as leverage if you tell the story that that equity is worth more to YOU, but that’s going to take a bit more finesse, so if you’ve never negotiated before, you might want to hold off.

If the cash portion of your other offers isn’t worth more, it’s sufficient to say you have offers, and when pressed, you can simply say that you’re not sharing the details (it’s OK not to share the details).

But whether you reveal details or not, here’s the basic formula for getting more. See why I call it the reverse used car salesman?

I have the following onsites/offers, and I’m still interviewing at Company X and Company Y, but I’m really excited about this opportunity and will drop my other stuff and SIGN TODAY if…

So, “if” what? I propose listing 3 things you want, which will typically be:

  • Equity
  • Salary
  • Signing/relocation bonus

The reason I list 3 things above isn’t because I expect you’ll be able to get all 3, but this way, you’re giving the person you’re negotiating with some options. In my experience, you’ll likely get 2 out of the 3.

So, what amounts should you ask for when executing on the reverse used car salesman? It’s usually easier to get equity and bonuses than salary (they’re taxed differently from the company’s perspective, and a signing bonus is a one-off rather than something that repeats every year). Therefore, it’s not crazy to ask for 1.5X-2X the equity and an extra 10-15% in salary. For the bonus portion, a lot depends on the size of the company, but if you’re talking to a company that’s beyond seed stage, you can safely ask for at least 20% of your base salary as a signing bonus.2
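To make those ranges concrete, here's a tiny worked example; the multipliers come from the paragraph above, but the starting numbers are entirely made up.

```python
# Hypothetical starting offer.
base_salary = 120_000   # USD per year
equity_value = 40_000   # rough annualized value of the equity grant

# Ranges suggested above: 1.5x-2x the equity, an extra 10-15% in salary, and
# (beyond seed stage) a signing bonus of at least 20% of base.
equity_ask = (1.5 * equity_value, 2.0 * equity_value)    # 60k-80k
salary_ask = (1.10 * base_salary, 1.15 * base_salary)    # 132k-138k
signing_bonus_ask = 0.20 * base_salary                   # 24k

print(equity_ask, salary_ask, signing_bonus_ask)
```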

What if the company says no to all or most of these and is a big enough brand that you don’t have much of a leg to stand on? You can still get creative. One of our users told me about a sweet deal he came up with — he said he’d sign today if he got to choose the team he joined, and he had a specific team in mind.

Other resources

As I mentioned at the beginning of this post, there are plenty of blog posts and resources on the internets about negotiation, so I’ll just mention two of my favorites. The first is a riveting, first-hand account of negotiation adventures from one of my favorite writers in this space, Haseeb Qureshi. In his post, Haseeb talks about how he negotiated for a 250K (total package) offer with Airbnb and what he learned along the way. It’s one of the most honest and thoughtful accounts of the negotiation process I’ve ever read.

The second post I’ll recommend is a seminal work in salary negotiation by Patrick McKenzie (patio11 on Hacker News, in case that’s more recognizable). I read it back when I was still an engineer, and it was one of those things that indelibly changed how I looked at the world. I still madly link anyone and everyone who asks me about negotiation to this piece of writing, and it’s still bookmarked in my browser.

If you’re an interviewing.io user and have a job offer or five that you’re weighing and want to know exactly what to say when negotiating in your own nuanced, unique situation, please email me, and I’ll whisper sweet, fiscal nothings in your ear like a modern-day Cyrano de Bergerac wooing the sweet mistress that is capitalism.3

1If you’re interviewing at interviewing.io, USE THESE ON ME. IT’LL BE GREAT. And while you’re at it, use these on me as well.

2Some of the larger tech companies offer huge signing bonuses to new grads (~100K-ish). Obviously this advice is not for that situation.

3An increasing number of our customers pay us on subscription, so we don’t get more money if you do.4 And for the ones who don’t, salary and recruiting fees typically come out of a different budget.

4In the early days of interviewing.io, we tried to charge a flat per-hire fee in lieu of a percentage of salary, precisely for this reason — we wanted to set ourselves up as an entirely impartial platform where lining up with our candidates’ best interests was codified into our incentive structure. Companies were pretty weirded out by the flat fee, so we went back to doing percentages, but these days we’re moving over as many of our customers to subscription as possible — it’s cheaper for them, better for candidates, and I won’t lie that I like to see that recurring revenue.


We looked at how a thousand college students performed in technical interviews to see if where they went to school mattered. It didn’t.

Posted on February 13th, 2018.

interviewing.io is a platform where engineers practice technical interviewing anonymously. If things go well, they can unlock the ability to participate in real, still anonymous, interviews with top companies like Twitch, Lyft, and more. Earlier this year, we launched an offering specifically for university students, with the intent of helping level the playing field right at the start of people’s careers. The sad truth is that with the state of college recruiting today, if you don’t attend one of a very few top schools, your chances of interacting with companies on campus are slim. It’s not fair, and it sucks, but university recruiting is still dominated by career fairs. Companies pragmatically choose to visit the same few schools every year, and despite the career fair being one of the most antiquated, biased forms of recruiting there is, the format persists, likely because there doesn’t seem to be a better way to quickly connect with students at scale. So, despite the increasingly loud conversation about diversity, campus recruiting marches on, and companies keep doing the same thing expecting different results.

In a previous blog post, we explained why companies should stop courting students from the same five schools. Regardless of your opinion on how important that idea is (for altruistic reasons, perhaps), you may have been left skeptical about the value and practicality of broadening the college recruiting effort, and you probably concede that it’s rational to visit top schools, given limited resources — while society is often willing to agree that there are perfectly qualified students coming out of non-top colleges, the assumption is that they’re relatively rare. We’re here to show you, with some nifty data from our university platform, that this is not true.

To be fair, this isn’t the first time we’ve looked at whether where you went to school matters. In a previous post, we found that taking Udacity and Coursera programming classes mattered way more than where you went to school. And way back when, one of our founders figured out that where you went to school didn’t matter at all but that the number of typos and grammatical errors on your resume did. So, what’s different this time? The big, exciting thing is that those prior analyses focused mostly on engineers who had been working for at least a few years already, making it possible to argue that a few years of work experience smooths out any performance disparity that comes from having attended (or not attended) a top school. In fact, the good people at Google found that while GPA didn’t really matter after a few years of work, it did matter for college students. So, we wanted to face this question head-on and look specifically at college juniors and seniors while they’re still in school. Even more pragmatically, we wanted to see whether companies limiting their hiring efforts to just top schools means they’re going to get a higher caliber of candidate.

Before delving into the numbers, here’s a quick rundown of how our university platform works and the data we collect.

The setup

For students who want to practice on interviewing.io, the first step is a brief (~15-minute) coding assessment on Qualified to test basic programming competency. Students who pass this assessment, i.e. those who are ready to code while another human being breathes down their neck, get to start booking practice interviews.

When an interviewer and an interviewee match on our platform, they meet in a collaborative coding environment with voice, text chat, and a whiteboard and jump right into a technical question. Check out our recordings page to see this process in action.

Interview questions on the platform tend to fall into the category of what you’d encounter at a phone screen for a back-end software engineering role, and interviewers typically come from top companies like Google, Facebook, Dropbox, Airbnb, and more.

After every interview, interviewers rate interviewees on a few different dimensions, including technical ability. Technical ability gets rated on a scale of 1 to 4, where 1 is “poor” and 4 is “amazing!”. On our platform, a score of 3 or above has generally meant that the person was good enough to move forward. You can see what our feedback form looks like below:

Interviewer feedback form

On our platform, we’re fortunate to have thousands of students from all over the U.S., spanning over 200 universities. We thought this presented a unique opportunity to look at the relationship between school tier and interview performance for both juniors (interns) and seniors (new grads). To study this relationship, we first split schools into the following four tiers, based on rankings from U.S. News & World Report:

  • “Elite” schools (e.g. MIT, Stanford, Carnegie Mellon, UC-Berkeley)
  • Top 15 schools (not including top tier, e.g. University of Wisconsin, Cornell, Columbia)
  • Top 50 schools (not including top 15, e.g. Ohio State University, NYU, Arizona State University)
  • The rest (e.g. Michigan State, Vanderbilt University, Northeastern University, UC-Santa Barbara)

Then, we ran some statistical significance testing on interview scores vs. school tier to see if school tier mattered, for both interns (college juniors) and new grads (college seniors), comprising a set of roughly 1000 students.
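For a sense of what that testing might look like in code, here's a minimal sketch comparing technical-score distributions across the four tiers; the Kruskal-Wallis test, file name, and column names are illustrative assumptions, not necessarily the exact analysis behind this post.

```python
import pandas as pd
from scipy import stats

# Hypothetical per-interview records: school tier plus the 1-4 technical score.
df = pd.read_csv("student_interviews.csv")  # hypothetical file

groups = [
    df.loc[df["school_tier"] == tier, "technical_score"]
    for tier in ["elite", "top_15", "top_50", "rest"]
]

# Kruskal-Wallis compares the four distributions without assuming normality,
# which suits an ordinal 1-4 scale.
statistic, p_value = stats.kruskal(*groups)
print(statistic, p_value)  # a large p-value means no detectable tier difference
```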

Does school have anything to do with interview performance?

In the graphs below, you can see technical score distributions for interviews with students in each of the four school tiers (see legend). As you recall from above, each interview is scored on a scale of 1 to 4, where 1 is the worst and 4 is the best.

First, the college juniors…

interns by tier

And then, the seniors…

New grads by tier

What’s pretty startling is that the shape of these distributions, for both juniors and seniors, is remarkably similar. Indeed, statistical significance testing revealed no difference between students of any tier when it came to interview performance.1 What this means is that top-tier students are achieving the same results as those in no-name schools. So the question becomes: if the students are comparable in skill, why are companies spending egregious amounts of money attracting only a subset of them?

Okay, so what are companies missing?

Besides missing out on great, cheaper-to-acquire future employees, companies are missing out on an opportunity to save time and money. Right now, a ridiculous amount of money is being spent on university recruiting. We’ve previously cited the $18k price tag just for entry to the MIT career fair. In a study done through the Harvard Business Review, Lauren Rivera reveals that one firm budgeted nearly $1M just for social recruiting events on a single campus.

The high price tag of these events also means it makes even less sense for smaller companies or startups to try to compete with high-profile, high-profit tech giants. Most of the top schools being heavily pursued already have plenty of recruiters vying for their students. Unwittingly, this pursuit seems to run contrary to most companies’ desires for high diversity and long-term sustainable growth.

Even when companies do believe talent is evenly distributed across school tiers, there are still reasons why they might recruit at top schools. Other factors help elevate certain schools in a recruiter’s mind. There are long-standing company-school relationships (for example, the number of alumni who currently work at the company). There are signaling effects too — companies get Silicon Valley bonus points by saying their eng team is comprised of a bunch of ex-Stanford, ex-MIT, ex- etc. etc. students.

A quick word about selection bias

Since this post appeared on Hacker News, there’s been some loud, legitimate discussion about how the pool of students on interviewing.io may not be representative of the population at large, because we have a self-selected pool of students who decided to practice interviewing. Certainly, all the blog posts we publish are subject to this (very valid) line of criticism, and this post in particular. As such, selection bias in our user pool might mean that 1) we’re getting only the worst students from top schools (because, presumably, the best ones don’t need the practice), or 2) we’re getting only the best/most motivated students from non-top schools, or both. Any subset of these is entirely possible, but we have a few reasons to believe that what we’ve published here might hold true regardless.

First off, in our experience, regardless of their background or pedigree, everyone is scared of technical interviewing. Case in point… before we started working on interviewing.io, we didn’t really have a product yet. So before investing a lot of time and heartache into this questionable undertaking, we wanted to test the waters to see if interview practice was something engineers really wanted, and more so, who these engineers that wanted practice were. So, we put up a pretty mediocre landing page on Hacker News… and got something like 7,000 signups the first day. Of these 7,000 signups, roughly 25% were senior (4+ years of experience) engineers from companies like Google and Facebook (this isn’t to say that they’re necessarily the best engineers out there… but just that the engineers the market seems to value the most still need our service).

Another data point comes from one of our founders. Every year, Aline does a guest lecture on job search preparedness for a technical communication course at MIT. This course is one way to fulfill the computer science major’s communication requirement, so enrollment tends to span the gamut of computer science students. Before every lecture, she sends out a survey asking students what their biggest pain points are in preparing for their job search. Every year, trepidation about technical interviewing is either at the top of the list or 2nd from the top.

And though this doesn’t directly address the issue of whether we’re only getting the best of the worst or the worst of the best (I hope the above has convinced you there’s more to it than that), here’s the distribution of school tiers among our users, which I expect mirrors the kinds of distributions companies see in their student applicant pool:

Distribution of school tiers among our users

So what can companies do?

Realistically, companies may never stop recruiting at top-tier schools entirely, but they ought to at least include schools outside of that very small circle in the search for future employees. The takeaway from the data is the same: for good engineers, school means a lot less than we think. The time and money that companies put into competing for candidates from the same select few schools would be better spent creating opportunities that include everyone, as well as developing tools to vet students more fairly and efficiently.

As you saw above, we used a 15-minute coding assessment to cull our inbound student flow, and just a short challenge leveled the playing field between students from all walks of life. At the very least, we’d recommend employers do the same thing in their process. But, of course, we’d be remiss if we didn’t suggest one other thing.

At interviewing.io, we’ve proudly built a platform that grants the best-performing students access to top employers, no matter where they went to school or where they come from. Our university program, in particular, lets companies reach an exponentially larger pool of students for the same cost as attending one or two career fairs at top target schools. Want diverse, top talent without the chase? Sign up to be an employer on our university platform!

1Of course, this hinges on everyone completing a quick 15-minute coding challenge first, to ensure they’re ready for synchronous technical interviews. We’re excited about this because companies can replicate this step in their process as well!


What do the best interviewers have in common? We looked at thousands of real interviews to find out.

Posted on November 29th, 2017.

At interviewing.io, we’ve analyzed and written at some depth about what makes for a good interview from the perspective of an interviewee. However, despite the inherent power imbalance, interviewing is a two-way street. I wrote a while ago about how, in this market, recruiting isn’t about vetting as much as it is about selling, and not engaging candidates in the course of talking to them for an hour is a woefully missed opportunity. But, just like solving interview questions is a learned skill that takes time and practice, so, too, is the other side of the table. Being a good interviewer takes time and effort and a fundamental willingness to get out of autopilot and engage meaningfully with the other person.

Of course, everyone and their uncle has strong opinions about what makes someone a good interviewer, so instead of waxing philosophical, we’ll present some data and focus on analytically answering questions like… Does it matter how strong of an engineering brand your company has, for instance? Do the questions you ask actually help get candidates excited? How important is it to give good hints to your candidate? How much should you talk about yourself? And is it true that, at the end of the day, what you say is way less important than how you make people feel?1 And so on.

Before I delve into our findings, I’ll say a few words about interviewing.io and the data we collect.

The setup

interviewing.io is an anonymous technical interviewing platform. On interviewing.io, people can practice technical interviewing anonymously, and if things go well, unlock real (still anonymous) interviews with companies like Lyft, Twitch, Quora, and more.

The cool thing is that both practice and real interviews with companies take place within the interviewing.io ecosystem. As a result, we’re able to collect quite a bit of interview data and analyze it to better understand technical interviewing. One of the most important pieces of data we collect is feedback from both the interviewer and interviewee about how they thought the interview went and what they thought of each other. If you’re curious, you can watch a real interview on our recordings page, and see what the feedback forms for interviewers and interviewees look like below — in addition to one direct yes/no question, we also ask about a few different aspects of interview performance using a 1-4 scale. We also ask interviewees some extra questions that we don’t share with their interviewers, one of which is their own take on how they thought they did.

Feedback form for interviewers

Feedback form for interviewees

In this post, we’ll be analyzing feedback and outcomes of thousands of real interviews with companies to figure out what traits the best interviewers have in common.

Before we get into the nitty-gritty of individual interviewer behaviors, let’s first put the value of a good interviewer in context by looking at the impact of a company’s brand on the outcome. After all, if brand matters a lot, then maybe being a good interviewer isn’t as important as we might think.

Brand strength

So, does brand really matter for interview outcomes? One quick caveat before we get into the data: every interview on the platform is user-initiated. In other words, once you unlock our jobs portal (you have to do really well in practice interviews to do so), you decide who you talk to. So, candidates talking to companies on our platform will be predisposed to move forward because they’ve chosen the company in the first place. And, as should come as no surprise to anyone, companies with a very strong brand have an easier time pulling candidates (on our platform and out in the world at large) than their lesser-known counterparts. Moreover, many of the companies we work with do have a pretty strong brand, so our pool isn’t representative of the entire branding landscape. However, all is not lost — in addition to working with very recognizable brands, we work with a number of small, up-and-coming startups, so if you, the reader, are coming from a company that’s doing cool stuff but hasn’t yet become a household name, our findings likely still apply to you. And, as you’ll see, getting candidates in the door isn’t the same as keeping them.

To try to quantify brand strength, we used three different measures: the company’s Klout Score (yes, that still exists), its Mattermark Mindshare Score, and its score on Glassdoor (under general reviews).2

When we looked at interview outcomes relative to brand strength, its impact was not statistically significant. In other words, we found that brand strength didn’t matter at all when it came to either whether the candidate wanted to move forward or how excited the candidate was to work at the company.

This was a bit surprising, so I decided to dig deeper. Maybe brand strength doesn’t matter overall but matters when the interviewer or the questions they asked aren’t highly rated? In other words, can brand buttress less-than-stellar interviewers? Not so, according to our data. Brand didn’t matter even when you corrected for interviewer quality. In fact, of the top 10 best-rated companies on our platform, half have no brand to speak of, 3 are mid-sized YC companies that command respect in Bay Area circles but are definitely not universally recognizable, and only 2 have anything approaching household name status.

So, what’s the takeaway here? Maybe the most realistic thing we can say is that while brand likely matters a lot for getting candidates in the door, once they’re in, no matter how well-branded you are, they’re yours to lose.

Choosing the question

If brand doesn’t matter once you’ve actually gotten a candidate in the door, then what does? Turns out, the questions you ask matter a TON. As you recall, feedback on interviewing.io is symmetric, which means that in addition to the interviewer rating the candidate, the candidate also rates the interviewer, and one of the things we ask candidates is how good the question(s) they got asked were.

Question quality was extremely significant (p < 0.002 with an effect size of 1.25) when it came to whether the candidate wanted to move forward with the company. This held both when candidates did well and when they did poorly.
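As an illustration of how an effect like that can be estimated, here's a minimal sketch of a logistic regression relating question quality to the candidate's decision to move forward, controlling for how well they did; the modeling choice, file, and column names are assumptions for illustration, not necessarily the analysis behind these numbers.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-interview records from candidate and interviewer feedback.
df = pd.read_csv("company_interviews.csv")  # hypothetical file

# Outcome: did the candidate want to move forward (0/1)? Predictor: the
# candidate's rating of the question, controlling for their own performance.
model = smf.logit("wants_to_move_forward ~ question_rating + candidate_score", data=df)
result = model.fit()
print(result.summary())
```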

While we obviously can’t share the best questions (these are company interviews, after all), we can look at what candidates had to say about the best and worst-rated questions on the platform.

The good

I liked the fact that questions were building on top of each other so that previous work was not wasted and finding ways to improve on the given solution.

Always nice to get questions that are more than just plain algorithms.

Really good asking of a classic question, opened my mind up to edge cases and considerations that I never contemplated the couple of times I’ve been exposed to the internals of this data structure.

This was the longest interviewing.io interview I have ever done, and it is also the most enjoyable one! I really like how we started with a simple data structure and implemented algorithms on top of it. It felt like working on a simple small-scale project and was fun.

He chose an interesting and challenging interview problem that made me feel like I was learning while I was solving it. I can’t think of any improvements. He would be great to work with.

I liked the question — it takes a relatively simple algorithms problem (build and traverse a tree) and adds some depth. I also liked that the interviewer connected the problem to a real product at [Redacted] which made it feel like less like a toy problem and more like a pared-down version of a real problem.

This is my favorite question that I’ve encountered on this site. it was one of the only ones that seem like it had actual real-life applicability and was drawn from a real (or potentially real) business challenge. And it also nicely wove in challenges like complexity, efficiency, and blocking.

The bad

Question wasn’t straightforward and it required a lot of thinking/understanding since functions/data structures weren’t defined until a lot later. [Redacted] is definitely a cool company to work for, but some form of structure in interviews would have been a lot more helpful. Spent a long time figuring out what the question is even asking, and interviewer was not language-agnostic.

I was expecting a more technical/design question that showcases the ability to think about a problem. Having a domain-specific question (regex) limits the ability to show one’s problem-solving skills. I am sure with enough research one could come up with a beautiful regex expression but unless this is something one does often, I don’t think it [makes for] a very good assessment.

This is not a good general interview question. A good interview question should have more than one solution with simplified constraints.

Anatomy of a good interview question

  1. Layer complexity (including asking a warmup)
  2. No trivia
  3. Real-world components/relevance to the work the company is doing are preferable to textbook algorithmic problems
  4. If you’re asking a classic algorithmic question, that’s ok, but you ought to bring some nuance and depth to the table, and if you can teach the interviewee something interesting in the process, even better!

Asking the question

One of the other things we ask candidates after their interviews is how helpful their interviewer was in guiding them to the solution. Providing your candidate with well-timed hints that get them out of the weeds without giving away too much is a delicate art that takes a lot of practice (and a lot of repetition), but how much does it matter?

As it turns out, being able to do this well matters a ton. Being good at providing hints was extremely significant (p < 0.00001 with an effect size of 2.95) when it came to whether the candidate wanted to move forward with the company (as before, we corrected for whether the interview went well).

You can see for yourself what candidates thought of their interviewers when it came to their helpfulness and engagement below. Though this attribute is a bit harder to quantify, it seems that hint quality is actually a specific instance of something bigger, namely the notion of turning something inherently adversarial into a collaborative exercise that leaves both people in a better place than where they started.3

And if you can’t do that every time, then at the very least, be present and engaged during the interview. And no matter what the devil on your shoulder tells you, no good will ever come of opening Reddit in another tab.4

One of the most memorable, pithy conversations I ever had about interviewing was with a seasoned engineer who had spent years as a very senior software architect at a huge tech company before going back to what he’d always liked in the first place, writing code. He’d conducted a lot of interviews over a career spanning several decades, and after trying out a number of different interview styles, what he settled on was elegant, simple, and satisfying. According to him, the purpose of any interview is to “see if we can be smart together.” I like that so much, and it’s advice I repeat whenever anyone will listen.

The good

I liked that you laid out the structure of the interview at the outset and mentioned that the first question did not have any tricks. That helped set the pace of the interview so I didn’t spend an inordinate amount of time on the first one.

The interview wasn’t easy, but it was really fun. It felt more like making a design discussion with a colleague than an interview. I think the question was designed/prepared to fill the 45 minute slot perfectly.

I’m impressed by how quickly he identified the issue (typo) in my hash computation code and how gently he led me to locating it myself with two very high-level hints (“what other tests cases would you try?” and “would your code always work if you look for the the pattern that’s just there at the beginning of the string?”). Great job!

He never corrected me, instead asked questions and for me to elaborate in areas where I was incorrect – I very much appreciate this.

The question seemed very overwhelming at first but the interviewer was good at helping to break it down into smaller problems and suggest we focus on one of those first.

The bad

[It] was a little nerve-wracking hearing you yawn while I was coding.

What I found much more difficult about this interview was the lack of back and forth as I went along, even if it was simple affirmation that “yes, that code you just wrote looks good”. There were times when it seemed like I was the only one who had talked in the past five minutes (I’m sure that’s an exaggeration). This made it feel much more like a performance than like a collaboration, and my heart was racing at the end as a result.

While the question was very straightforward, and [he] was likely looking for me to blow through it with no prompting whatsoever in order to consider moving forward in an interview process, it would have been helpful to get a discussion or even mild hinting from him when I was obviously stuck thinking about an approach to solve the problem. While I did get to the answer in the end, having a conversation about it would have made it feel more like a journey and learning experience. That would have also been a strong demonstration of the collaborative culture that exists while working with teams of people at a tech company, and would have sold me more vis-a-vis my excitement level.

If an interview is set to 45 minutes, the questions should fit this time frame, because people plan accordingly. I think that if you plan to have a longer interview you should notify the interviewee beforehand, so he can be ready for it.

One issue I had with the question, though, is what exactly he was trying to evaluate with it. At points we were talking about very nitty-gritty details of Python linked list or array iteration, but it was unclear at any point whether that was what he was judging me on. I think in the future he could outline at the beginning what exactly he was looking for with the problem in order to keep the conversation focused and ensure he is well calibrated when judging candidates.

Try to be more familiar with all the possible solutions to the problem you choose to pose to the candidate. Try to work on communicating more clearly with the candidate.

Anatomy of a good interview

  1. Set expectations, and control timing/pacing
  2. Be engaged!
  3. Be familiar with the problem and its associated rabbit holes/garden paths
  4. Strike a good balance between giving hints and letting the candidate think
  5. Turn the interview into a collaborative exercise where both people are free to be smart together

The art of storytelling… and the importance of being human

Beyond choosing and crafting good questions and being engaged (but not overbearing) during the interview, what else do top-rated interviewers have in common?

The pervasive common thread I noticed among the best interviewers on our platform is, as above, a bit hard to quantify but dovetails well with the notion of being engaged and creating a collaborative experience. It’s taking a dehumanizing process and elevating it to an organic experience between two capable, thinking humans. Many times, that translates into revealing something real about yourself and telling a story. It can be sharing a bit about the company you work at and why, out of all the places you could have landed, you ended up there. Or some aspect of the company’s mission that resonated with you specifically. Or how the projects you’ve worked on tie into your own, personal goals.

The good

I like the interview format, in particular how it was primarily a discussion about cool tech, as well as an honest description of the company… the discussion section was valuable, and may be a better gauge of fit anyway. It’s nice to see a company which places value on that 🙂

The interviewer was helpful throughout the interview. He didn’t mind any questions about his company’s internal technology decisions or how it’s structured. I liked that the interviewer gave me good insight into how the company functions.

Extremely kind and very generous with explaining everything they do at [redacted]. Really interested in the technical challenges they’re working on. Great!

Interesting questions but the most valuable and interesting thing were the insights he gave me about [redacted]. He sounded very passionate about engineering in general, particularly about the challenges they are facing at [redacted]. Would love to work with him.

The bad

[A] little bit of friendly banter (even if it’s just “how are you doing”?) at the very beginning of the interview would probably help a bit with keeping the candidate calm and comfortable.

I thought the interview was very impersonal, [and] I could not get a good read on the goal or mission of the company.

And, as we wrote about in a previous post, one of the most genuine, human things of all is giving people immediate, actionable feedback. As you recall, during the feedback step that happens after each interview, we ask interviewees if they’d want to work with their interviewer. As it turns out, there’s a very statistically significant relationship (p < 0.00005)5 between whether people think they did well and whether they’d want to work with the interviewer. This means that when people think they did poorly, they may be a lot less likely to want to work with you. And by extension, it means that in every interview cycle, some portion of interviewees are losing interest in joining your company just because they didn’t think they did well, despite the fact that they actually did.
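
As a rough illustration (again, a sketch rather than our actual analysis, with hypothetical column names), a simple contingency-table test of that relationship might look like this:

```python
# Minimal sketch: check whether perceived performance is independent of a
# candidate's willingness to work with the interviewer. Column names are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency

feedback = pd.read_csv("interview_feedback.csv")  # hypothetical data export

# Both columns are 0/1 flags collected during post-interview feedback
table = pd.crosstab(
    feedback["thought_they_did_well"],
    feedback["would_work_with_interviewer"],
)

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p_value:.6f}")
```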

How can one mitigate these losses? Give positive, actionable feedback immediately (or as soon as possible)! This way people don’t have time to go through the self-flagellation gauntlet that happens after a perceived poor performance, followed by the inevitable rationalization that they totally didn’t want to work there anyway.

How to be human

  1. Talk about what your company does… and what specifically about it appealed to you and made you want to join
  2. Talk about what you’re currently working on and how that fits in with what you’re passionate about
  3. When you like a candidate, give positive feedback as quickly as you can to save them from the self-flagellation that they’ll likely go through otherwise… and which might make them rationalize away wanting to work with you
  4. And, you know, be friendly. A little bit of warmth can go a long way.

Becoming a better interviewer

Interviewing people is hard. It’s hard to come up with good questions, it’s hard to give a good interview, and it’s especially hard to be human in the face of conducting a never-ending parade of interviews. But being a good interviewer is massively important. As we saw, while your company’s brand will get people in the door, once they’ve reached the technical interview, the playing field is effectively level, and you can no longer use your brand as a crutch to mask poor questions or a lack of engagement. And in this market, where the best candidates have a ton of options, a well-run interview that elevates a potentially cold, transactional interaction into something real and genuine can become the selling point that gets great engineers to work for you, whether you’re a household name or a startup that just got its first users.

Given how important it is to do interviews well, what are some things you can do to get better right away? One thing I found incredibly useful for coming up with good, original questions is to start a shared doc with your team where, every time someone solves a problem they think is interesting, no matter how small, they jot down a quick note. These notes don’t have to be fleshed out at all, but they can be the seeds for unique interview questions that give candidates insight into the day-to-day at your company. Turning these disjointed seeds into interview questions takes thought and effort — you have to prune out a lot of the details and distill the essence of the problem into something that doesn’t take the candidate a lot of work/setup to grok, and you’ll likely have to iterate on the question a few times before you get it right — but the payoff can be huge.

Another thing you can do to get actionable feedback like the kind you saw in this post (and then immediately level up) is to get on interviewing.io as an interviewer. If you interview people in our double-blind practice pool, no one will know who you are or which company you represent, which means you get a truly unbiased take on your interviewing ability: your question quality, how excited people would be to work with you, and how good you are at helping people along without giving away too much. It’s also a great way to try out new questions on a very engaged, high-quality user base, rather than on your own team (which can be pretty awkward). You’ll also get to keep replays of your interviews so you can revisit crucial moments and figure out exactly what you need to do to get better next time.

Become a better interviewer with honest, actionable feedback from candidates

Want to hone your skills as an interviewer? Want to help new interviewers at your company warm up before they officially get added to your interview loops? You can sign up to our platform as an interviewer, or (especially for groups) ping us at interviewers@interviewing.io.

1“People will forget what you said, people will forget what you did, but people will never forget how you made them feel.” -Maya Angelou

2It’s important to call out that brand and engineering brand are two separate things that can diverge pretty wildly. For instance, Target has a strong brand overall but probably not the best engineering brand (sorry). Heap, on the other hand, is one of the better-respected places to work among engineers (both on interviewing.io and off), but it doesn’t have a huge overall brand. Both the Klout and Mattermark Mindshare scores aren’t terrible for quantifying brand strength, but they’re not amazing at engineering brand strength (they’re high for Target and low for Heap). The Glassdoor score is a bit better because reviewers tend to skew engineering-heavy, but it’s still not that great of a measure. So, if anyone has a better way to quantify this stuff, let me know. If I were doing it, I’d probably look at GitHub repos of the company and its employees, who their investors are, and so on and so forth. But that’s a project that’s out of scope for this post.

3If you’re familiar with Dan Savage’s campsite rule for relationships, I think there should be a similar rule for interviewing… leave your candidates in better shape than when you found them.

4Let us save you the time: Trump is bad, dogs are cute, someone ate something.

5This time with even more significance!


If you care about diversity, don’t just hire from the same five schools

Posted on October 24th, 2017.

EDIT: Our university hiring platform is now on Product Hunt!

If you’re a software engineer, you probably believe that, despite some glitches here and there, folks who have the technical chops can get hired as software engineers. We regularly hear stories about college dropouts, who, through hard work and sheer determination, bootstrapped themselves into millionaires. These stories appeal to our sense of wonder and our desire for fairness in the world, but the reality is very different. For many students looking for their first job, the odds of breaking into a top company are slim because they will likely never even have the chance to show their skills in an interview. For these students (typically ones without a top school on their resume), their first job is often a watershed moment where success or failure can determine which opportunities will be open to them from that point forward and ultimately define the course of their entire career. In other words, having the right skills as a student is nowhere near enough to get you a job at a top-tier tech company.

To make this point concrete, consider three (fictitious, yet indicative) student personas, similar in smarts and skills but attending vastly different colleges. All are seeking jobs as software engineers at top companies upon graduation.

Mason goes to Harvard. He has a mediocre GPA but knows that doesn’t matter to tech companies, where some of his friends already work. Come September, recent graduates and alums fly back to campus on their company’s dime in order to recruit him. While enjoying a nice free meal in Harvard Square, he has the opportunity to ask these successful engineers questions about their current work. If he likes the company, all he has to do is accept the company’s standing invitation to interview on campus the next morning.

Emily is a computer science student at a mid-sized school ranked in the top 30 for computer science. She has solid coursework in algorithms under her belt, a good GPA, and experience as an engineering intern at a local bank. On the day of her campus’s career fair, she works up the courage to approach companies – this will be her only chance to interact with the companies where she dreams of working. Despite the tech industry being casual, the attire at this career fair is business formal with a tinge of sweaty. So after awkwardly putting together an outfit she would never wear again1, she locates an ancient printer on the far side of campus and prints 50 copies of her resume. After pushing through the crowds to reach the booths, she hands her resume to every single tech company at the fair over the course of several hours. She won’t find out for two more weeks whether she got any interviews.

Anthony goes to a state school near the town where he grew up. He is top of his class, as well as a self-taught programmer, having gone above and beyond his coursework to hack together some apps. His school’s career fair has a bunch of local, non-tech employers. He has no means of connecting with tech companies face-to-face and doesn’t know anyone who works in tech. So, he applies to nearly a hundred tech companies indiscriminately through their websites, uploading his resume and a carefully crafted cover letter. He will probably never hear from them.

Career fair mania

The status quo in university recruiting revolves around career fairs and in-person campus recruiting, which have serious limitations. For one, they are extremely expensive, especially at elite schools. Prime real estate at the MIT career fair will run you a steep $18,000 for entry alone. That’s not counting the cost of swag (which gets more exorbitant each year), travel, and, most importantly, the opportunity cost of the attending engineers’ time. While college students command the lowest salaries, it’s not uncommon for tech companies to spend 50% more on recruiting a student than on a senior engineer.

At elite schools, the lengths to which companies go to differentiate themselves grow more extreme with each passing year. In fact, students at elite colleges suffer from company overload because every major tech company, big and small, is trying to recruit them. All of this while students at non-elite colleges are scrambling to get their foot in the door without any recruiters, let alone VPs of high-profile companies, visiting their campus.

Of course, due to this cost, companies are limited in their ability to visit colleges in person, and even large companies can visit around 15 or 20 colleges at most. This strategy overlooks top students at solid CS programs that are out of physical reach.

In an effort to reach students at other colleges, companies have turned to sponsoring conferences and hackathons. The sponsorship tiers for the Grace Hopper Conference, the premier gathering for women in tech, top out at $100,000, with the tier that includes a single interview booth starting at $30,000. Additionally, larger companies send representatives (usually engineers) to large hackathons in an effort to recruit students in the midst of a 48-hour all-nighter. However, the nature of in-person career fairs and events is that not all students will be present. Grace Hopper is famously expensive to attend as a student, especially when factoring in airfare and hotel.

This cost is inefficient at best and prohibitive at worst, especially for small startups with limited budgets and little brand recognition. Career fairs serve a tiny portion of companies and a tiny portion of students, and the rest are caught in the pecuniary crossfire. Demand for talented engineers out of college who bring a different lived experience to tech has never been higher, yet companies are passing on precisely these students via traditional methods. Compounding the issue even further is the fundamental question of whether having attended a top school has much bearing on candidate quality in the first place (more on that in the section on technical screening below).

Homogeneity of hires

The focus of companies on elite schools has notable, negative implications for the diversity of their applicants. In particular, many of the schools that companies traditionally visit are notably lacking in diversity, especially when it comes to race and socioeconomic status. According to a survey of computer science students at Stanford, there were just fifteen Hispanic female and fifteen black female computer science majors in the entire 2015 graduating class. In that analysis, the Stanford CS class of 2015 was 9% Hispanic and 6% black. According to a 2015 analysis, the Harvard CS major was just 3% black and 5% Hispanic. Companies that are diversity-forward and constrained to recruiting at the same few schools end up competing over this small pool of diverse students. Meanwhile, there is an entire ocean of qualified, racially diverse students from less traditional backgrounds whom companies are overlooking.

The focus on elite schools also has meaningful implications on socioeconomic diversity. According to a detailed New York Times infographic, “four in 10 students from the top 0.1 percent attend an Ivy League or elite university, roughly equivalent to the share of students from poor families who attend any two- or four-year college.” The infographic highlights the rigid segmentation of students by class background in college matriculation.

Source: New York Times

The article finds that the few lower-income students who end up at elite colleges do about as well as their more affluent classmates but that attending an elite versus non-elite college makes a huge difference in future income.

The focus of tech companies on elite schools reinforces this dynamic, codifying the rigidity with which students at elite colleges are catapulted into the 1 percent while others are left behind. Career-wise, it’s that first job or internship you get while you’re still in school that can determine what opportunities you have access to in the future. And yet, students at non-elite colleges have trouble accessing these very internships and jobs, or even getting a meager first-round interview, contributing to the lack of social mobility in our society not for lack of skills but for lack of connections. This sucks. A lot.

The technical screen

Let’s return to our three students. Let’s say that Emily, the student who attended her college’s career fair, gets called back by one or two companies for a first-round interview if her resume meets the criteria those companies are looking for. Not already having an internship at a top tech company — quite the catch-22 — puts her at a disadvantage. Anthony has little to no chance of hearing back from employers via his online applications, but let’s say that by some miracle he lands a phone screen with one of the tech giants (his best shot, as there are more recruiters to look through the resume dump on the other end).

What are their experiences when it comes to prepping for upcoming technical interviews?

Mason, the Harvard student, attends an event on campus where Facebook engineers teach him how to pass the technical interview. He also accepts a few interviews at companies he’s less excited about, for practice and just in case. While he of course needs to be sharp and prepare in order to get good at these sorts of algorithmic problems, he has all of the resources he could ask for and more at his disposal. Unsurprisingly, his Facebook interview goes well.

Emily’s school has an informal undergraduate computer science club whose members are collectively reading technical interviewing guides and trying to figure out what tech companies want from them. She has a couple of interviews lined up, all of which are for jobs she’s desperate to get. Club members trade tips after interviews but ultimately have a shaky understanding of what they did right and wrong in the absence of post-interview feedback from companies. Only a couple of alumni from their school have made it to top tech companies in the past, so they lack the kind of information that Mason has on what companies are looking for. (E.g. don’t be afraid to take hints, make sure to explain your thought process, what the heck is this CoderPad thing anyway…)

Anthony doesn’t know anyone who has a tech job like the one he’s interviewing for, and only one of his friends is also interviewing. He doesn’t know where to start when it comes to getting ready for his upcoming interview at GoogFaceSoft. He only has one shot at it with no practice interviews lined up. He prepares by googling “tech interview questions” and stumbles upon a bunch of unrealistic interview questions, many of them behavioral or outdated. He might be offered the interview and be fit for the job, but he sure doesn’t know how to pass the interview.

For students who may be unfamiliar with the art of the technical interview, algorithmic interviews can be mystifying, leading to an imbalance of information on how to succeed. Given that technical interviewing is a game, it is important that everyone knows the rules, spoken and unspoken. There are many practice resources available, but no amount of reading and re-reading Cracking the Coding Interview can prepare you for that moment when you are suddenly in a live, technical phone screen with another human.

We built a better way to hire

Ultimately, as long as university hiring relies on a campus-by-campus approach, the status quo will continue to be fundamentally inefficient and unmeritocratic. No company, not even the tech giants, can cover every school or every resume submitted online. And, in the absence of any meaningful information on a student’s resume, companies default to their university as the only proxy. This approach is inefficient at best and, at worst, it’s the first in a series of watershed moments that derail the promise of social mobility for the non-elite, taking with them any hope of promoting diversity among computer science students.

Because this level of inequity, placed for maximum damage right at the start of people’s careers, really pissed us off, we decided to do something about it. interviewing.io’s answer to the unfortunate status quo is a university-specific hiring platform. If you’re already familiar with how core interviewing.io works, you’ll see that the premise is exactly the same. We give out free practice to students, and use their performance in practice to identify top performers, completely independently of their pedigree. Those top performers then get to interview with companies like Lyft and Quora on our platform. In other words, we’re excited to provide students with pathways into tech that don’t involve going to an elite school or knowing someone on the inside. So far, we’ve been very pleased with the results. You can see our student demographics and where they’re coming from below. Students from all walks of life, whether they’re from MIT or a school you’d never visit, are flocking to the platform, and we couldn’t be prouder.

school tier distribution

interviewing.io evaluates students based on their coding skills, not their resume. We are open to students regardless of their university affiliation, college major, and pretty much anything else (we ask for your class year to make sure you’re available when companies want you and that’s about it). Unlike traditional campus recruiting, we attract students organically (getting free practice with engineers from top companies is a pretty big draw) from schools big and small from across the country.

student heatmap

We’re also proud that almost 40 percent of our university candidates come from backgrounds that are underrepresented in tech.


Because of our completely blind, skills-first approach, we’ve seen an interesting phenomenon happen time and time again: when a student unmasks at the end of a successful interview, the company in question realizes that the student who just aced their technical phone screen was one whose resume was sitting at the bottom of the pile all along.

In addition to identifying top students who bring a different lived experience to tech, we’re excited about the economics of our model. With interviewing.io, a mid-sized startup can staff their entire intern class for the same cost as attending 1-2 career fairs at top schools… with a good chunk of those interns coming from underrepresented backgrounds. Want to hire interns and new grads in the most efficient, fair way possible? Sign up to be an employer on our university platform!

Meena runs interviewing.io’s university hiring platform. We help companies hire college students from all over the US, with a focus on diversity. Prior to joining interviewing.io, Meena was a software engineer at Clever, and before that, Meena was in college on the other side of the engineer interviewing equation.

1At least her school didn’t send out this.