Recently, someone asked us how you know you’re ready to succeed in a Facebook/Amazon/Apple/Netflix/Google (FAANG) interview.
It’s an interesting question, and one I’m sure many of you job seekers out there are wondering about.
Internally, we have our own beliefs, but we wanted to see if we could answer this question more objectively. So we set off on a journey to acquire data to try answering it.
interviewing.io helps prospective job candidates practice mock interviews with actual interviewers from the major tech companies like the aforementioned FAANG companies, as well as others like Dropbox, Uber, LinkedIn, and Slack.
After each mock interview, candidates are measured on a 1-4 scale against three criteria (technical ability, problem solving, and communication) and are also given an overall hire/no hire rating.
In our analysis, we used these scores to estimate a candidate’s overall skill level, which we would expect to be positively associated with “readiness”.
Additionally, we needed to know whether a job candidate’s skill level and preparation ultimately resulted in the candidate getting the job. Since most real world interviews happen outside of interviewing.io, we surveyed our users to learn how far through the hiring funnel they progressed at three popular tech companies: Google, Facebook, and Amazon. For each of these companies, at least 150 respondents reported participating in the company’s hiring processes. These were the hiring stages we asked for in the survey:
Technical phone interview
We also needed to consider variables that could affect a person’s chance of progressing through the hiring funnel, but weren’t necessarily related to “readiness”. In addition to the mock interview feedback ratings mentioned above, we grabbed the following data about users that we collect at interviewing.io:
How many mock interviews completed on interviewing.io
Self-reported career experience level (e.g. junior, intermediate, experienced)
Finally, we asked survey respondents to share data that our platform doesn’t collect but could be associated with a higher chance of landing a job. Here were those other attributes:
Whether they had a computer science degree
How many real tech interviews they had attended as an interviewee (excluding mock interviews)
How they learned to code
How long ago they last went through a job search
Relationship of respondents’ technical scores to interview success
Intuitively, you would think that better technical ability would be associated with a higher chance of succeeding at interviews. To see if this intuition holds true, let’s look at the frequency of people passing a phone interview, bucketed by a person’s average technical rating on interviewing.io, and see if a positive correlation exists.
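This kind of bucketing can be sketched in a few lines of pandas. The data and column names below are invented for illustration, not our actual dataset:

```python
import pandas as pd

# Hypothetical survey data: each row is a respondent, with their average
# technical rating (1-4) and whether they passed the phone interview.
df = pd.DataFrame({
    "avg_technical": [1.5, 2.0, 2.5, 2.75, 3.0, 3.25, 3.5, 4.0],
    "passed_phone":  [0,   0,   1,   0,    1,   1,    1,   1],
})

# Bucket the ratings into bins and compute the pass rate per bucket.
df["bucket"] = pd.cut(df["avg_technical"], bins=[1, 2, 3, 4])
pass_rates = df.groupby("bucket", observed=True)["passed_phone"].mean()
print(pass_rates)
```

If a positive correlation exists, the pass rate should climb as the rating bucket increases, as it does in this toy data.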
Based on the 158 respondents who reported having progressed at least as far as Google’s phone interview phase, we observed a positive-looking relationship, though not as obvious a relationship as one might expect.
Looking at respondents who went through the Facebook and Amazon hiring processes, the relationship seemed even less obvious:
While it would be simple to conclude that no relationship exists between technical ability and passing a phone interview, it seems more likely that we were experiencing selection bias. After all, 60-80% of people progressing past Google/Amazon/Facebook phone screens seems a bit high relative to what one might expect.
Compared to the population of all interviewing.io users, survey respondents with high average technical scores (between 3 and 4) were over-represented relative to those who scored between 1.5 and 2.75. Perhaps people with lower scores who failed the Google/Facebook/Amazon phone interviews were simply less likely to respond to our survey.
So that’s pretty limiting. But let’s keep in mind that all models are wrong, yet some are useful, so perhaps it’s still possible to learn other stuff from the data. If the survey respondents’ technical abilities weren’t obviously associated with interview success, what about other factors?
More experience with real technical interviews is associated with future interview success
When digging further, the factor that stood out most was how many real technical interviews the candidate had done in the past. Across all three companies, people who had completed 5 or more real technical interviews tended to have higher rates of passing a phone interview than those who had less real interview experience.
It’s possible this large effect was confounded by other variables, so I iterated on the model to account for them and see whether the effect of prior interview experience on interview success would continue to shine through.
Here were the factors considered:
Average technical rating
Average problem solving rating
Average communication rating
Number of mock interviews completed on interviewing.io
Did the person have a computer science degree?
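Since the outcome is a binary pass/fail, a model along these lines can be sketched as a logistic regression. The features, data, and fitted probabilities below are invented for illustration; only the overall shape (binary outcome, prior-experience indicator plus ratings as covariates) mirrors the analysis:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features: [had 5+ prior real interviews (0/1),
# average technical rating (1-4)]. Outcome: passed the phone screen.
X = np.array([
    [0, 2.5], [0, 3.0], [0, 2.0], [0, 3.5],
    [1, 2.5], [1, 3.0], [1, 2.0], [1, 3.5],
])
y = np.array([0, 0, 0, 1, 1, 1, 0, 1])

model = LogisticRegression().fit(X, y)

# Predicted pass probability for a "typical" candidate (rating 3.0),
# with and without 5+ prior real interviews.
p_experienced = model.predict_proba([[1, 3.0]])[0, 1]
p_newer = model.predict_proba([[0, 3.0]])[0, 1]
print(p_experienced, p_newer)
```

Holding the other covariates fixed and varying only the prior-experience indicator is how the conditional probabilities reported below can be produced.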
For the 171 respondents who went through the Amazon hiring process, a statistically significant association continued to exist between prior technical interview experience and succeeding in a phone interview. Below were the predicted probabilities of passing a phone interview for a typical respondent in this survey (male, with a computer science degree, 4+ years of experience, and average technical, communication, and problem solving ratings), conditional on their prior technical interview experience.
Predicted Probability of Passing Amazon Phone Screen
1-4 technical interviews
5+ technical interviews
For Facebook we observed a similar effect:
Predicted Probability of Passing Facebook Phone Screen
1-4 technical interviews
5+ technical interviews
Finally, for Google applicants we observed that the number of interviewing.io mock interviews completed had the clearest association with interview success, not prior technical interview experience. While mock interviews aren’t exactly the same as real-world technical interviews, they are conducted by the same kinds of people asking similar questions and using similar assessment criteria. Because of this, it seems possible that the effects of mock interviews on interview success could be similar to the effect of real technical interview experience.
Below was the estimated chance of passing the Google phone interview for the typical survey respondent, conditional on the number of interviewing.io practice interviews attended.
Mock interviews on interviewing.io
Predicted Probability of Passing Google Phone Screen
Looking at the other variables we included in the model, we observed no other statistically significant relationships with success in phone interviews. It’s possible those relationships actually exist, but based on this sample, we did not observe enough evidence to reject the null hypothesis that no relationship exists.
So even after accounting for other factors, more technical interview experience was still associated with greater success in phone interviews.
We also wondered whether similar effects might exist further down the hiring funnel. For example, could prior technical interview experience also improve your chances of receiving an offer after an onsite? We performed a similar analysis to predict the chance of receiving an offer conditional upon attending an onsite interview, but we found no noticeable relationships between the predictors listed above and the probability of receiving an offer. Sample sizes are naturally smaller this far down the funnel, and perhaps the differences between candidates who reach onsites are smaller too, making any effect harder to detect.
Based on an analysis of this particular group of people, it seems we have a pretty clear answer to the original question of “What metric should be used to know you’re prepared to succeed in a FAANG interview?”
The answer is to experience five or more real technical interviews. Simple, right?
Why more experience with technical interviews helps you succeed in future technical interviews
I don’t know about you, but I find that answer unsatisfying. It seems questionable, even self-serving, coming from a company that offers mock technical interviews as a product.
Quite frankly, this result wasn’t what we expected either. The metric of technical interviewing experience is probably just a proxy for some other phenomenon that correlates with interview success in some explainable way. After all, it’s unlikely that merely showing up to five interviews will magically bestow new talents upon you.
So what is it about having technical interview experience that might be associated with success in future technical interviews? To dig even deeper, we followed up with survey respondents who successfully received an offer with Google, Facebook, or Amazon, and asked what factors they felt contributed to their success.
As you’d expect, nearly all respondents said practicing technical problems was the foundation for their success. So yes, technical competency matters, and no, past interview experience doesn’t appear to be a substitute for it.
Beyond technical ability, respondents shared anecdotes that hint at why such a relationship might exist, as well as some possible underlying factors that could explain what technical interview experience might proxy for.
Direct feedback from other people helps you improve quicker
It’s one thing to get a question wrong in a practice environment, and it’s another thing to get a question wrong when you have an interviewer looking over your shoulder. A few respondents shared instances when they performed poorly in an interview, and explained how those instances influenced their future behavior. These negative experiences clearly highlighted areas in their skillset or presentation that companies tended to rate less favorably.
I bombed multiple phone interviews with both Google and Facebook where the questions were about graphs or trees, and the questions were actually trivial. I didn’t have a formal CS background, and I knew that I was weak in those areas.
Once those weaknesses were identified, respondents addressed them through study and repetition, which helped them allocate their preparation time more effectively.
For one job hunter, a past Google interview failure not only helped shore up a specific technical weakness, but also helped the person learn how to maintain a more focused mindset in general, which proved valuable in a future interview with Facebook.
Because my main weak spot in my Google onsite was being rusty in data structures and algorithms, I studied key data structures in CLRS, such as heaps and red-black trees. The Facebook interviews did not actually ask me to implement any data structures, but studying data structures helped keep my mind “on the game”.
Obviously, there are many other ways to receive feedback other than getting it directly from another person. For example, respondents also made extensive use of LeetCode and Cracking the Coding Interview-style exercises, and I’m sure many of you out there do too.
However, receiving direct feedback from another person seems different for some reason. Whether it’s about social acceptance or proving oneself or something else, feedback from another person seems to be internalized more, which can act as an efficient catalyst for personal growth as long as you keep your mind open to suggestions.
Respondents found direct personal feedback to be very useful, not only from feedback received in real interviews, but also from direct feedback in non-interview settings.
Whether it’s interviewing.io or friends asking each other questions or any generic peer interview platform, practice [with another person] is different than practicing yourself. I found interviewers to be a resource rather than just someone evaluating you, which is something you don’t get when practicing yourself.
While real interviews give you unambiguous feedback about your overall performance (i.e. you did or didn’t get the job), you don’t always receive specific feedback about problems you answered well or skills that you exhibited effectively. Simulated interview environments can be uniquely beneficial because they allow job seekers to engage in a two-way dialogue with the interviewer, which can yield more information than a simple “hire” or “no hire” decision.
For practice interviews, I worked with a friend who was also interviewing for Google and other FAANG companies. I heavily leaned on interviewing.io interviews from FAANG interviewers during the last stage of my preparation, to make sure I was ready for Google, Amazon and FAANG interviews in particular.
Knowing I received solid (and specific) feedback from FAANG interviewers was a huge help and signal to me that I was ready for Google and Amazon interviews.
So maybe the reason why prior technical interview experience correlates with interview success is because interviews happen to be the most common avenue for receiving direct, honest feedback from other people about how you and your skills are perceived.
You learn how to communicate in an interview setting, which is different than how you communicate in everyday work
Technical interviews can also require different communication patterns than what you might normally use in a typical workplace or academic environment.
For example, some respondents believe it’s not enough to solve a problem correctly; you’re also expected to narrate your thought process as you solve it.
Another thing study materials remind people is the need to constantly communicate your thought processes during the coding interview. Having done many interviews (and many more pair programming sessions), this is second nature to me, but I know it’s not second nature to everyone and bears repeating.
This implicit expectation can catch some off-guard, particularly if their preparation focused solely on programming exercises.
Solving Leetcode on your own is quite different from having to explain your thinking process to someone else.
These perspectives echo the advice given by our interviewer Ian Douglas in his guest blog post. All five of Ian’s tips help you improve how you communicate with your interviewer while you’re in the middle of the stressful interview environment. At the end of the day, a debugger won’t be making the final assessment on you, a bunch of human beings will, and the things interviewers look for encompass a lot more than the correctness of the code you write.
It is okay to get some guidance from the interviewer. You actually can feel that the interviewers are evaluating more than just your problem solving skills (your communication, how you work as a team, will you be a good person to collaborate with, etc).
By doing more technical interviews, you gain a better understanding of the unique interpersonal dynamics that exist during interviews. Those dynamics impact how interviewers assess you, and failing to adapt to them could obscure your true abilities. But once you’ve gained that understanding, you can hone specialized interviewing skills like the ones Ian suggests and apply them in future interviews.
One respondent took this concept to an extreme, relearning a programming language from earlier in their career primarily for use in interviews.
[Python] is much more succinct and expressive than C++, Java, or C#, which I had used earlier in my career. By not having to write all the braces and semicolons, I free my hands and mind to dig deeper into the problem and better engage with the interviewer. I haven’t made my way up Paul Graham’s succinctness = power hierarchy, but in an interview situation, communication is the most important thing, and Python allows me to communicate with the interviewer better than C++ does.
This tactic probably won’t work for everyone. But there probably does exist a tactic that works for you. The more technical interviews you experience, the more chances you’ll have to discover those tactics.
You learn a lot about how to interview effectively when you communicate directly with other people
Going back to the original question, we said that the metric to track is how many technical interviews you’ve experienced, because that is what the analysis of this particular group of people suggested.
But that shouldn’t be your main takeaway. The real lesson is to seek out another person’s opinion of your interview performance, because you’ll learn many different things from their feedback.
We can help you accomplish that here at interviewing.io, with options to receive direct feedback from currently-employed Google or Facebook interviewers, and even help assess specific skills like systems design or front-end development. But as our survey respondents mentioned, there are other ways of receiving that direct feedback.
As I’ve written before, interviewing isn’t all that objective because people aren’t always objective. Given that interviews are still going to be conducted by other people for the foreseeable future, gaining direct feedback from others appears to be an effective tool for succeeding within the existing system, so you might as well take advantage of it.
As an incentive for responding to the survey, we gave a reward of either a $30 credit toward a professional interview or a free peer interview on our platform. If there were any bias, we’d have thought people who scored lower would have been comparatively more likely to respond, rather than less.↩
The idea that Python might be a better programming language for interviews isn’t totally crazy. Empirically, interviewing.io mock interviews conducted in Python have the second-highest success rate among the most popular programming languages. The highest? C++. So if you’re blindly going by the numbers, the respondent should have stuck to C++. However, chances are that this data also suffers from selection bias: people who know C++ might be different in many ways than people who know Python. Key point: use whatever works for you. For this person, Python happened to work better than C++.↩
Hey, Aline (founder of interviewing.io) here. This is the third post in our Guest Author series.
In this post, our latest Guest Author looks at interviews from the company’s perspective. So much engineering time goes into interviewing… we know this firsthand, but what can be done about it? Some companies solve this problem by introducing homework. In this post, our Author digs into some historical data to unearth a really clever, elegant way to save eng time that’s also better for candidate experience!
If you have strong opinions about interviewing or hiring that you’ve been itching to write about, we’d love to hear from you. Please email me at firstname.lastname@example.org to get started.
Alexey Komissarouk is a growth engineering leader. He’s currently working on growth at MasterClass. Before that, he spent 2016-2020 at Opendoor, first as an early engineer, then as an Engineering Manager. Between 2013 and 2016, he built out a product engineering consulting company, helping clients such as Dropbox, Pebble, Boomerang, and Binti grow and expand lines of business through a combination of product management and engineering. In his other lives, Alexey co-founded a boutique work+travel company, Hacker Paradise. Since the company’s inception in 2015, they’ve run trips to over a dozen locations and been joined by more than 800 alumni.
“The new VP wants us to double engineering’s headcount in the next six months. If we have a chance in hell to hit the hiring target, you seriously need to reconsider how fussy you’ve become.”
It’s never good to have a recruiter ask engineers to lower their hiring bar, but he had a point. It can take upwards of 100 engineering hours to hire a single candidate, and we had over 50 engineers to hire. Even with the majority of the team chipping in, engineers would often spend multiple hours a week in interviews. Folks began to complain about interview burnout. Also, fewer people were actually getting offers; the onsite pass rate had fallen by almost a third, from ~40% to under 30%. This meant we needed even more interviews for every hire.
Visnu and I were the early engineers most bothered by the state of our hiring process. We dug in. Within a few months, the onsite pass rate went back up, and interviewer burnout receded. We didn’t lower the hiring bar, though. There was a better way.
Introducing: the Phone Screen Team
We took the company’s best technical interviewers and organized them into a dedicated Phone Screen Team. No longer would engineers be shuffled between onsite interviews and preliminary phone screens at recruiting coordinators’ whims. The Phone Screen Team specialized in phone screens; everybody else did onsites.
Why did you think this would be a good idea?
Honestly, all I wanted at the start was to see whether I was a higher-signal interviewer than my buddy Joe. So I graphed each person’s phone screen pass rate against how their candidates went on to perform in onsites.
Joe turned out to be the better interviewer. More importantly, I stumbled onto the fact that a number of engineers doing phone screens performed consistently better across the board: more of their candidates passed the phone screen, and those candidates went on to receive offers at a higher rate.
These numbers were consistent, quarter over quarter. As we compared the top quartile of phone screeners to everybody else, the difference was stark. Each group included a mix of strict and lenient phone screeners; on average, both groups had a phone screen pass rate of 40%.
The similarities ended there: the top quartile’s invitees were twice as likely to get an offer after the onsite (50% vs 25%). These results also were consistent across quarters.
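The comparison can be sketched like this. The interviewer names and outcomes below are made up, not Opendoor data; the point is that two screeners with identical phone screen pass rates can send very different candidates onsite:

```python
import pandas as pd

# One row per phone screen conducted.
screens = pd.DataFrame({
    "interviewer":   ["joe", "joe", "joe", "joe", "sam", "sam", "sam", "sam"],
    "passed_screen": [1, 1, 0, 0, 1, 1, 0, 0],
    "got_offer":     [1, 1, 0, 0, 1, 0, 0, 0],  # offers only follow passed screens
})

# Phone screen pass rate per interviewer.
per_interviewer = screens.groupby("interviewer").agg(
    screen_pass_rate=("passed_screen", "mean"),
)

# Among the candidates each interviewer passed, how many got offers?
passed = screens[screens["passed_screen"] == 1]
per_interviewer["offer_rate_of_passed"] = (
    passed.groupby("interviewer")["got_offer"].mean()
)
print(per_interviewer)
```

In this toy data both screeners pass 50% of candidates, but one screener’s invitees get offers twice as often, which is exactly the pattern described above.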
Armed with newfound knowledge of phone screen superforecasters, the obvious move was to have them do all the interviews. In retrospect, it made a ton of sense that some interviewers were “just better” than others.
A quarter after implementing the new process, the “phone screen to onsite” rate stayed constant, but the “onsite pass rate” climbed from ~30% to ~40%, shaving more than 10 hours-per-hire. Opendoor was still running this process when I left several years later.
You should too.
Starting your own Phone Screen Team

1. Identifying interviewers
Get your Lever or Greenhouse (or ATS of choice) data somewhere analyzable, then quantify how well each interviewer performs. There are lots of ways to measure performance; here’s a simple approach that favors folks who generated lots of offers from as few onsites and phone screens as possible.
Adjust the constants so that a median interviewer scores zero; a score above zero, then, is good. The resulting scores will look something like this:
(45 – 20 – 20) / 20 = 0.25
(60 – 36 – 20) / 20 = 0.2
(30 – 16 – 20) / 20 = -0.3
(45 – 40 – 20) / 20 = -0.75
No Good Nick
(30 – 48 – 20) / 20 = -1.9
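For illustration, here is a minimal sketch of the scoring shape the examples above imply. The post doesn’t spell out how offers and onsites are weighted into points, so the inputs here are assumed to be already-weighted counts (an assumption, not the real formula), and `baseline` is the constant that centers a median interviewer near zero:

```python
def interviewer_score(offer_points, onsite_points, screens, baseline=20):
    """Higher is better: lots of offer points from few onsites and screens.

    offer_points / onsite_points are assumed pre-weighted counts; baseline
    is the constant you tune so a median interviewer lands at zero.
    """
    return (offer_points - onsite_points - baseline) / screens

print(interviewer_score(45, 20, 20))  # 0.25, matching the top example above
```

Tune the weights and baseline against your own funnel data until a median interviewer scores zero, as described above.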
Ideally, hires would also be included in the funnel, since a great phone screen experience makes a candidate more likely to join. I tried including them; unfortunately, the numbers get too small and we run out of statistical power.
2. Logistics & Scheduling
Phone Screen interviewers no longer do onsite interviews (except as emergency backfills). The questions they ask are now retired from the onsite interview pool to avoid collisions.
Ask the engineers to identify and block off four hour-long slots each week to make available to recruiting (recruiting coordinators will love you). Use a tool like youcanbook.me or calendly to create a unified availability calendar. Aim for no more than ~2.5 interviews per interviewer per week. To minimize burnout, one thing we tried was taking 2 weeks off interviewing every 6 weeks.
To avoid conflict, ensure that interviewers’ managers are bought in to the time commitment and incorporate their participation during performance reviews.
3. Onboarding Interviewers
When new engineers join the company and start interviewing, they will initially conduct on-site interviews only. If they perform well, consider inviting them into the phone screen team as slots open up. Encourage new members to keep the same question they were already calibrated on, but adapt it to the phone format as needed. In general, it helps to make the question easier and shorter than if you were conducting the interview in person.
When onboarding a new engineer onto the team, have them shadow a current member twice, then be reverse-shadowed by that member twice. Discuss and offer feedback after each shadowing.
4. Continuous Improvement
Interviewing can get repetitive and lonely. Fight this head-on by having recruiting coordinators add a second interviewer (not necessarily from the team) to 10% or so of interviews, with a discussion afterwards.
Hold a monthly retrospective with the team and recruiting, with three items on the agenda:
discuss potential process improvements to the interviewing process
review borderline interviews together as a group, if your interviewing tool supports recording and playback
have interviewers read through feedback their candidates got from onsite interviewers and look for consistent patterns
Eventually, interviewers may get burnt out and say things like “I’m interviewing way more people than others on my actual team – why? I could just go do onsite interviews.” This probably means it’s time to rotate them out. Six months feels about right for a typical “phone screen team” tour of duty, to give people a rest. Some folks may not mind and stay on the team for longer.
Buy exclusive swag for team members. Swag is cheap, and these people are doing incredibly valuable work. Leaderboards (“Sarah interviewed 10 of the new hires this year”) help raise awareness. Appreciation goes a long way.
Also, people want to be on teams with cool names. Come up with a cooler name than “Phone Screen Team.” My best idea so far is “Ambassadors.”
There’s something very Dunder Mifflin about companies that create Growth Engineering organizations to micro-optimize conversion, only to have those very growth engineers struggle to focus due to interview thrash from an inefficient hiring process. These companies invest millions into hiring, coaching, and retaining the very best salespeople. Then they leave recruiting – selling the idea of working at the company – in the hands of an engineer who hasn’t gotten a lick of feedback on their interviewing since joining two years ago, and who has a tight project deadline at the back of their mind.
If you accept the simple truth that not all interviewers are created equal, believe that the same rigorous quantitative process you use to improve the business should also be used to improve your internal operations, and are trying to hire quickly, you should consider creating a Technical Phone Screen Team.
FAQs, Caveats, and Preemptive Defensiveness
Was this statistically significant, or are you conducting pseudoscience? Definitely pseudoscience. Folks in the sample were conducting about 10 interviews a month, ~25 per quarter. Perhaps not yet ready to publish in Nature but meaningful enough to infer from, especially considering the relatively low cost of being wrong.
Why didn’t the on-site pass rate double, as predicted? First, not all of the top folks ended up joining the team. Second, the best performers did well because of a combination of skill (great interviewers, friendly, high signal) and luck (got better candidates). Luck is fleeting, resulting in a regression to the mean.
What size does this start to make sense at? Early on, you should just identify who you believe your best interviewers are and have them (or yourself) do all the phone screens. Then, once you start hiring rapidly enough that you are doing about 5-10 phone screens a week, run the numbers and invite your best 2-3 onsite interviewers to join and create the team.
What did you do for specialized engineering roles? They had their own dedicated processes. Data Science ran a take home, Front-End engineers had their own Phone Screen sub-team, and Data and ML Engineers went through the general full-stack engineer phone screen.
Didn’t shrinking your Phone Screener pool hurt your diversity? In fact, the opposite happened. First, the phone screener pool had a higher percentage of women than the engineering organization at the time; second, a common interviewing anti-pattern is “hazing” – asking difficult questions and then rejecting somebody for “not even remembering about Kahn’s algorithm, lolz.” The best phone screeners don’t haze, bringing a more diverse group onsite.
Hey, Aline (founder of interviewing.io) here. This is the second post in our Guest Author series. The first post talked about red flags you might encounter while interviewing with companies. Complementarily, this post, authored by one of our prolific, long-time interviewers, explores common missteps that interviewees make.
One of the things I’m most excited about with the Guest Author series is the diversity of opinions it’s bringing to our blog. Technical interviewing and hiring is fraught with controversy, and not everything these posts contain will be in line with my opinions or the official opinions of interviewing.io. But that’s what’s great about it. After over a decade in this business, I still don’t think there’s a right way to conduct interviews, and I think hiring is always going to be a bit of a mess because it’s a fundamentally human process. Even if we don’t always agree, I do promise that the content we put forth will be curated, high quality, and written by smart people who are passionate about this space.
If you have strong opinions about interviewing or hiring that you’ve been itching to write about, we’d love to hear from you. Please email me at email@example.com to get started.
William Ian Douglas goes by “Ian”, and uses he/him pronouns. He lives in the Denver, Colorado region and graduated from a Computer Engineering program in 1996. His career spans back-end systems, API architecture, DevOps/DBA duties, and security; he has been a team lead managing small teams and a Director of Engineering. Ian branched out into professional technical interview coaching in 2014, and in 2017 pivoted his entire career to teaching software development for the Turing School of Software & Design in the Denver area. He joined interviewing.io as a contract interviewer in the summer of 2017 and is a big fan of the data analytics blog posts that IIO produces to help expose and eliminate bias in our tech industry interviews. Ian writes technical coaching information at https://techinterview.guide and you can reach him on Twitter, LinkedIn and GitHub.
I recently conducted my 600th interview on interviewing.io (IIO). I’d like to share lessons learned, why I approach interviews the way that I do, and shed some light on common problem areas I see happen in technical interviews. Every interviewer on the platform is different, and so your results may vary. We have some excellent folks helping out on the platform, and have a wonderful community working to better ourselves.
The interviewing.io Mock Interview
During our interviews on IIO, we rate people on three 4-point scales. A score of 1 means they did extremely poorly in that category, and a 4 means they did extremely well. I typically start everyone at 3 out of 4 points on each scale; candidates then earn or lose points as the interview goes on.
Every interviewer on the platform will have some aspect that they favor over others. My own bias as an interviewer tends to be around communication and problem solving, which I’ll point out below.
Technical Ability

In this category, I grade candidates on how proficient they seem in their language of choice, whether they had significant problems coding an algorithm of a particular style, and whether I needed to give a lot of hints during coding.
Problem Solving

Here, I grade candidates on how well they break the problem into smaller pieces, come up with a strategy for solving those smaller problems, and debug issues along the way. The ability to think through problems while debugging is just as important as writing the code in the first place. Are they stumped when a problem happens, or are they able to find the root cause on their own?
Communication
Interviewers really want to hear your decision-making process. This is also very important when debugging code. I tended to hire folks who would fit in well on smaller teams or clusters of developers. With that in mind, collaboration and easy communication are a good way to win me over.
Common Problem Areas I See in Interviews
Here are the top problem areas I see in interviews, not just on IIO, but in general. I hope you find this advice helpful.
Common Problem Area 1: Jumping into code too soon
I see this in developers of all types and levels, but mostly in the “intermediate” level of 2-5 years of experience. They hear a problem, talk about a high-level design for 30 seconds or less, and are eager to get coding. They feel like they’re on a timer, racing to get things finished, as if the first one across the finish line is the winner.
Please, slow down. Plan your work. And share your thought process along the way.
People who take time to think out a mid-level design, whether that’s pseudocode or just writing out notes of their approach, tend to spend less time debugging their code later. Folks who jump right into coding fall into what I call “design-as-you-go” problems, where you spend lots of time refactoring your code because you need to change a passed parameter or a return value, or wait, that loop is in the wrong place, and so on. This is very easy to spot as an interviewer.
Spending some time on mid-level design doesn’t guarantee your success, but thinking through your plan a little more deeply often saves you time in the long run, and that extra time can be used to fix problems later.
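To make this concrete, here’s a tiny illustration of what I mean (the problem and the names here are invented for this example): write your plan as a docstring or comments first, then fill in the code beneath it. If a step in the plan turns out to be wrong, you find out while it’s still a comment, not after it’s wired into fifty lines of code.

```python
import string


def most_frequent_word(text):
    """Plan, written before any code:

    1. Normalize the text (lowercase, strip punctuation).
    2. Split it into words.
    3. Count occurrences of each word.
    4. Return the word with the highest count.
    """
    # Step 1: normalize
    normalized = text.lower().translate(str.maketrans("", "", string.punctuation))
    # Steps 2 and 3: split and count
    counts = {}
    for word in normalized.split():
        counts[word] = counts.get(word, 0) + 1
    # Step 4: word with the highest count
    return max(counts, key=counts.get)
```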
Also, as an interviewer, I want to see you succeed. Especially if you’re “on-site” (in person, or remote nowadays) because you’re costing our company a lot more money in an on-site interview process. While I need to be fair to all candidates in the amount of help I can give, if I can see your design ahead of time, and spot a flaw in that design, I can ask leading questions to guide you to the problem and correct your approach earlier.
If you jump straight into code, I have no idea if your implementation is even going to work, and that’s not a great place to put your interviewer. It’s much harder for me to correct a design when you have 100 lines of Java code written before I really understand what’s going on in your code.
I saw this lack of planning backfire in a horrible way in a real interview in 2012. The candidate was brought to my interview room by someone in HR, asked if they would like a bottle of water, and promised to return. We introduced ourselves and got down to the technical challenge. The candidate shared no details, no design, barely talked about a high-level approach, wrote nothing down, and started writing code on a whiteboard. (This was the second-to-last whiteboard interview I ever conducted; I hate whiteboard interviews!) HR showed up a few minutes later, knocking loudly on the door, offering the bottle of water and leaving. The candidate, grateful for a drink, uncapped the bottle and started to take a sip when this awful, draining look came over their face. The distraction of delivering a bottle of water had made them completely lose their train of thought, and I couldn’t help them recover because they hadn’t shared any details with me about their approach. They spent several minutes re-thinking the problem and starting over.
On the “other side” of this coin, however, you can spend “too long” on the design stage and run out of time to implement your genius plan. I’ve seen candidates talk through a mid-level design, then write notes, then manually walk through an example with those notes to really make sure their plan is a good one, and now they only have a few minutes left to actually implement the work. Extra points on communication, maybe, but we need to see some working code, too.
So what’s the best approach here?
I typically recommend practicing until you spend about 5 minutes thinking through high-level design choices, 5 minutes to plan and prove the mid-level design, and then get to work on code. The good news here is that “practice makes better” — the more you practice this design break-down and problem solving, the better you’ll get. More on this later.
Common Problem Area 2: Communicating “Half-thoughts”
This is a term I’ve coined over the years, where you start to say a thought out loud, finish the thought in your head, and then change something about your code. It usually sounds something like this:
“Hmm, I wonder if I could … … … no, never mind, I’ll just do this instead.”
Back to my bias for communication.
Interviewers want to know what’s going on in your thought process. It’s important that they know how you’re making decisions. How are you qualifying or disqualifying ideas? Why are you choosing to implement something in a particular way? Did you spot a potential problem in your code? What was it?
The information you finish silently in your head is hidden treasure for your interviewer. It takes mere seconds to change your communication to something more like this:
“I wonder if … hmm … well, I was thinking about implementing this as a depth-first-search, but given a constraint around ___ I think a better approach might be ___, what do you think?”
That took maybe 2 or 3 extra seconds, and you’ve asked for my opinion or buy-in, we can consider possibilities together, and now we’re collaborating on the process. You already feel like my future coworker!
Common Problem Area 3: Not asking clarifying questions
An interview challenge I often ask as a warm-up question goes something like this:
You have a grouping of integer numbers. Write a method that finds two numbers that add up to a given target value, stops immediately, and reports those numbers. Return two ‘null’ values if nothing is found.
This is a great question that shows me how you think about algorithms and the kinds of assumptions you make when you hear a problem.
I’ve been coding for a pretty long time. Since 1982, actually. There’s no data structure called “a grouping” in any language I’ve ever used. So what assumptions are you going to make about the problem?
Most candidates immediately assume the “grouping” of numbers is in an array. You can successfully solve this problem by using an array to store your numbers. Your algorithm will likely be an O(n^2) (n-squared) algorithm because you’ll be iterating over the data in a quadratic way: for each value, iterate through the rest of the values. There’s a more efficient way to solve this in O(n) time by choosing a different data structure.
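For illustration, here’s one way that O(n) approach might look in Python, using a set to remember the values seen so far (the function name and the None-for-‘null’ convention are my own):

```python
def find_pair(numbers, target):
    """Find two numbers in `numbers` that add up to `target`.

    Returns the pair as soon as it is found, or (None, None) if no
    such pair exists. Runs in O(n) time: one pass, with O(1)
    average-case membership checks against the `seen` set.
    """
    seen = set()
    for value in numbers:
        complement = target - value
        if complement in seen:
            return complement, value  # stop immediately and report
        seen.add(value)
    return None, None
```

Compared with the nested-loop array version, the only change in thinking is asking “have I already seen the number that would complete this pair?” instead of re-scanning the rest of the data for every value.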
Go ahead and ask your interviewer questions about the problem. If they tell you to make your own assumptions, that’s fine, but state your assumptions and ask if they’re good ones. Ask if there are alternate data sets that you’ll be using as test cases which could impact your algorithm.
Common Problem Area 4: Assuming your interviewer sets all the rules
Yeah, you read that right.
Yes, you’re there for the interview, but you’re there to show them how you’ll work on the team, and teams work best when there is clear, open communication and a sense of collaboration. Spend the first few minutes of the interview setting expectations, especially around communication and work process.
There’s nothing wrong with having this kind of chat with your interviewer: “My typical work process in a technical challenge like this is to spend a minute or two thinking quietly about the problem and writing down notes, I’ll share those thoughts with you in a moment to get your input. Then, while I code, I tend to work quietly as well, but I’ll be sure to pause now and then to share my thought process as I go, and then walk you through the code more thoroughly before we run it the first time. Would that be okay with you, or do you have different expectations of how you’d like me to communicate or work through the problem?”
I promise you’ll blow their mind. Most interviewers won’t be ready for you to take their expectations into consideration like this. It shows that you’ll work well on a team. You’re setting the environment where you’re advocating for yourself, but also being considerate of others. You’re stating your intentions up front, and giving them the opportunity to collaborate on the process.
Common Problem Area 5: Not asking for help sooner
As your interviewer, I have a small amount of help that I’m likely able to provide during a technical challenge. I can’t coach you through everything, obviously, but I’d rather give you a hint, deduct a point on a rubric, and see you ultimately succeed at the problem, than to struggle silently and spin in circles and make us both feel like the interview is a waste of time.
As a professional interviewer and an instructor at a software school, I’ve become pretty good at asking leading questions to guide you to a realization or answer without me giving you the solution.
It’s okay to admit when you’re stuck. It doesn’t make you a failure, it makes you human. Let your interviewer know what you’re thinking and where you’re having problems. Listen very carefully to their response, they might be offering a clue to the problem, or might give you more thorough advice on how to proceed.
My Favorite Resources to Share
When our interviews at IIO are over, I like to give candidates a lot of feedback on their process and where I think they could use extra practice to improve. Generally, I spend 10 to 20 minutes, sometimes going way beyond my one-hour time slot, answering questions and going into more detail on things. I LOVE to help people on IIO.
Here are a few common areas of advice I offer to folks.
There’s nothing worse than listening to your own recorded voice. But all IIO interviews are recorded, and I often tell folks in my feedback and in the review notes I type up afterward to listen to the last few minutes of the interview recording to review the feedback I gave them. You can also pause those recordings and grab a copy of your code at any time. (These recordings are, of course, private to you and your interviewer.)
During the playback, listen to your own thought process and how you communicate your ideas. As you work through other challenges, find a way to record yourself talking through the problem out loud if possible, and play that back for yourself. You’ll get better at articulating full and complete thoughts.
Problem Solving and Mid-Level Design
The more common practice sites like HackerRank, CodeWars, LeetCode, etc., are great for writing a coded algorithm, but they don’t give you any way to exercise your design process.
I send my students to Project Euler. Euler was a mathematician, so the problems on the website will generally be pretty math-heavy, but you can change the problems to be whatever you’re comfortable building. If you don’t know how to calculate a prime number, that’s fine; swap that check out for whether a number is evenly divisible by 17, or something similar.
I like Project Euler because the challenges there are just word problems. You have to think of everything: the algorithm, which data structure(s) to use, and especially how to break the problem into smaller pieces.
One of my favorite problems is #19 in their archive: counting how many months between January 1901 and December 2000 began on a Sunday. They give you the number of days in each calendar month, tell you that January 1st, 1900 was a Monday, and explain how to calculate a leap year. The rest is up to you.
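As a sketch of how that breakdown might turn into code (this is just one possible approach, and the function name is my own), you can walk month by month from 1900, tracking which weekday the 1st falls on:

```python
def count_sunday_month_starts():
    """Count months from Jan 1901 through Dec 2000 that begin on a Sunday.

    Weekday convention: 0 = Monday ... 6 = Sunday.
    Given: January 1st, 1900 was a Monday.
    """
    days_in_month = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
    day_of_week = 0  # Jan 1, 1900 is a Monday
    sundays = 0
    for year in range(1900, 2001):
        for month in range(12):
            # Only months from 1901 onward count toward the answer.
            if year >= 1901 and day_of_week == 6:
                sundays += 1
            days = days_in_month[month]
            # Leap-year February: divisible by 4, but centuries
            # must also be divisible by 400.
            if month == 1 and year % 4 == 0 and (year % 100 != 0 or year % 400 == 0):
                days += 1
            # Advance to the 1st of the next month.
            day_of_week = (day_of_week + days) % 7
    return sundays
```

The design decisions here, such as starting the walk in 1900 so the known Monday anchors the weekday counter, and folding the leap-year rule into the month length, are exactly the kind of thing you have to figure out yourself, because the problem statement is just a word problem.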
The more you expose yourself to different types of problems, the better you’ll get at spotting patterns.
Practice, Practice, Practice
One piece of advice we give our students is to practice each technical challenge several times. Our executive director, Jeff Casimir, tells students to practice something 10 times. That feels like a big effort. I aim more for 3 to 4 times, and here’s my reasoning:
The first time you solve a problem, all you’ve done is solve the problem. You might have struggled through certain parts, but your only real achievement here is finishing.
If you erase your work and start it a second time, you might think of a different approach to solving the problem, maybe a more efficient solution. Maybe not, but at least you’re getting practice with this kind of problem.
Now erase your work and do it a third time. Then a fourth time. These are the times when you start to actively build a memory of the strategy it takes to solve this particular problem. This “muscle memory” will help you when you see other technical challenges, where you’ll start to spot similarities. “Oh, this looks like the knapsack problem,” and because you’ve solved that several times, the time you spend on high-level and mid-level design just got a lot shorter.
One of my favorite technical challenges can be solved using a handful of different algorithms (DFS, BFS, DP, etc). If you think you can solve a problem in a similar fashion, solve it 3 or 4 times with each of those algorithms as well. You’ll get REALLY good at spotting similarities, and have a great collection of strategies to approach other technical problems.
I’ve been writing up notes for aspiring new developers at https://techinterview.guide. It’s not complete, but I have a lot of my own thoughts on preparing for technical interviews, networking and outreach, resumes and cover letters, and so on. I still have a few chapters to write about negotiation tactics and graceful resignations, but I’m happy to take feedback from others on the content.
I also have a daily email series covering several kinds of interview questions, but not from the perspective of how to answer the questions perfectly; there are plenty of resources out there for that. Instead, I examine questions from an interviewer’s perspective: what am I really asking, what do I hope you’ll tell me, what do I hope you won’t say, and so on. A small preview, for example: when you’re asked “Tell me about yourself”, they’re not really asking for your life story. They’re really asking, “Tell me a summary of things about you that will make you a valuable employee here.”
Hey, Aline (founder of interviewing.io) here. We’re trying something new. Up till now, all posts on this blog have been written by interviewing.io employees or contractors. Why? Frankly, it’s hard to find great content in the recruiting space. There’s so much fluff and bad advice out there, and we didn’t want any part of that.
The other day though, I was reading Hacker News and saw an article by Uduak Obong-Eren about how he did over 60 technical interviews in 30 days and what he learned from that gauntlet of an experience. I thought it was honest, vulnerable, well-written, and brimming with actionable advice. So, I reached out to him to see if he’d want to write something else. Fortunately, he did, and the article below is the inaugural post in what I hope will become our Guest Author series. You can read more about Uduak in the bio below.
A quick note because this is the first time we’re doing this. One of the things I’m most excited about with this new Guest Author series is the diversity of opinions it will bring to our blog. Technical interviewing and hiring is fraught with controversy, and not everything these posts contain will be in line with my opinions or the official opinions of interviewing.io. But that’s what’s great about it. After over a decade in this business, I *still* don’t think there’s a right way to conduct interviews, and I think hiring is always going to be a bit of a mess because it’s a fundamentally human process. Even if we don’t always agree, I do promise that the content we put forth will be curated, high quality, and written by smart people who are passionate about this space.
Now, off we go!
Uduak Obong-Eren is a Software Engineer based in the San Francisco Bay Area who is passionate about architecting and building scalable software systems. He has about five years of industry experience and holds a Masters in Software Engineering from Carnegie Mellon University. He is also an open source enthusiast and writes technical articles – you can view some of his writings at https://meekg33k.dev. He especially enjoys conducting free mock technical interviews to help folks get better at technical interviewing. You can follow him on Twitter @meekg33k.
What is the one thing you would look out for if you had to join a company?
Sometime between January and February 2020, I wanted to change jobs and was looking to join a new company. This, among other reasons, led me to embark on a marathon of technical interviews – 60+ technical interviews in 30 days.
Doing that many interviews in such a short time meant I had an interesting mix of experiences from the various companies I interviewed with, each with a unique culture and values that often showed, intentionally or not, in the way their interviews were conducted.
In this article, I will be sharing some of the red flags I observed while I was on this marathon of technical interviews. I will not be mentioning names of any companies because that’s not the intent behind this article.
The goal of this article is also not to make you paranoid or put you on the hunt for red flags in your next interview, far from it. Rather, the goal is to equip you to recognize the same or similar red flags in your next interview, and hopefully, recognizing them will set you up to handle them better.
Even though the stories I’ll be sharing come from my marathon of technical interviews, these red flags do not apply only to technical interviews. They apply to all kinds of interviews and so there’s a lot to learn here for everyone.
The Red Flags
Your interviewer is only open to solving the problem ONE way
In the world of computing and in life generally, for any given problem, there is typically more than one way to solve that problem. For example, given a sorting problem, you could solve it using a merge-sort algorithm or a heap sort algorithm.
Having this rich set of techniques to solve a problem makes things even more interesting, and the general expectation in technical interviews is that you have the flexibility to solve a problem using your preferred technique.
I had an interview where the interviewer asked me to solve an algorithmic problem. I had started solving the problem using a specific technique when the interviewer stopped me in my tracks and asked that I use another technique.
When I probed a bit further to find out why, it appeared that the reason he asked me to use the second technique wasn’t to test my knowledge of it; it was because he was more ‘comfortable’ with that approach.
It is different if the interviewer wants to test your knowledge of something very specific. For example, given a problem that can be solved using iteration and recursion, the interviewer may want to test your knowledge of recursion and can ask you to solve the problem recursively. That wasn’t the case here.
I ended up using both techniques and discussed the trade-offs, but frankly that experience left a bad taste in my mouth, especially because that interview was with the hiring manager — my would-be manager, someone who could significantly influence my career growth and trajectory.
Undue pressure to accept an offer letter
It’s very exciting and fulfilling when you go through all preliminary stages of a technical interview, through to the onsite interview (remote or in-person) and then you receive that “Congratulations <insert name here>, we are pleased to offer you…” email.
However, that excitement often becomes short-lived when there is some form of pressure from your soon-to-be employer to accept the offer. It’s a bit more manageable when the pressure comes from someone in HR or the recruiter, but when it’s from the hiring manager, that can be harder to manage.
That was the case for me when I interviewed with a startup based in Palo Alto. They were a small company in terms of headcount. My onsite interview with them had gone quite well. I had a good conversation with the hiring manager and an even better conversation with the VP of Engineering, so much so that I could tell I was going to be extended an offer. I asked how long I would have to accept the offer letter and was told seventy-two hours.
The offer letter arrived later that evening, and it looked great — a six-figure offer definitely didn’t seem like a bad start. I was also at the final stage of the interview process with other companies, and thankfully, I had enough time to negotiate and accept the offer, or so I thought.
Then the pressure started: incessant calls from the hiring manager and the VP of Engineering, back-to-back emails, all within the allotted time. The pressure got to the point where I wasn’t sure I wanted to negotiate the offer anymore. I turned down the offer.
I turned down the offer because the experience got me thinking about the company’s work culture. Were the methods employed by the company to get me to accept the offer indicative of their work culture? If they needed to get something done, how far would they go?
Now don’t get me wrong: yes, the company wants to employ you, and yes, the recruiting team wants to ‘close the deal’. However, it’s very important to pay attention to how the company does this. Do they remain professional about it?
A company’s values go beyond what it says; they show in what it does and how it does it.
Not enough clarity about your role
Among the many reasons why you would join a company is your desire to be involved in valuable work. I had the opportunity to join a US company based in Boulder, Colorado. They had contracted a recruiting agency to help them find someone to fill a Software Engineer position in their firm.
The hiring process started with an exploratory interview with the recruiting agency, closely followed by a second interview with a recruiter from the company. In both interviews, I couldn’t get a clear sense of what my specific role would be — what team I would be on, what kinds of projects I’d be working on, what the career growth pathway was, etc.
I understand that companies sometimes go through restructuring, but that didn’t seem to be the case here. It seemed more like the company was focused on filling its headcount. Even though there’s nothing wrong with filling a headcount, I think there is everything wrong with not having a clear purpose for a role, for a couple of reasons:
It means the role may not be critical to the company’s core business.
If the role isn’t that important, your position may be impacted when a layoff comes.
On a more personal note, I don’t want to be just a number. I want to work at a place where I have the opportunity to contribute in an impactful way, and I like to believe you would too. So it’s important to get clarity about your role, for where you are today and for future career growth.
Consistent lack of interest or low morale from interviewers
When looking to join a company, one of the things you simply must care about is the team you will end up working on. At least 25% of your waking hours will be spent interacting with that team whether in-person or virtually.
Interviews offer you an opportunity to experience firsthand what it will look like to work with your prospective teammates, especially since, unless you’re interviewing at a huge company, your interviewers are likely to become your teammates.
If through all the different stages of the interview process you experience a consistent lack of interest or low morale from your interviewers, you might want to pay attention.
When I experienced that during one of my interviews, I couldn’t exactly tell what the cause was, but I knew something just wasn’t right. After some internal tussle, I decided to trust my gut feelings and ended the interview process with the company.
Fast forward two months: two of my interviewers (would-be teammates) had left the company and joined another one (no, I wasn’t stalking; I just checked LinkedIn).
Now I’m not saying that during the interview process there won’t be one or two people who, because of their busy schedules, would have preferred to be doing something else rather than interviewing. Yet, when none of the interviewers seem to want to be there, you certainly want to pay attention to that.
A lack of interest or low morale could be pointers to a combination of any of the following:
Your prospective team-mates may be experiencing burnout.
Some internal dissatisfaction with the company: culture, policies, something, anything.
The team isn’t that interested in you (hard pill to swallow?), maybe they don’t see you as a long-term hire.
Or it could be for reasons that I have not included here, but I implore you to not ignore this red flag if you see it in your next interview.
Your interviewers aren’t prepared for the interview
Have you been in an interview where the interviewer doesn’t seem to have any questions to ask you? Trust me, it can get really awkward.
That was my experience during a technical phone-screen interview with an educational technology company based in California. The interviewer wasn’t prepared for our interview and didn’t have any questions at hand. He wasn’t even sure of who he was interviewing and what role I was interviewing for. It wasn’t a pleasant experience.
I understand that there are a myriad of reasons why interviewers may not be prepared for an interview, some of which include:
Lack of proper planning by the HR/recruiting team.
Last-minute changes on the interviewee.
Busy schedules for the interviewer.
The interviewer just wasn’t prepared.
I typically won’t act on this red flag in isolation; I’ll look for other red flags in a bid to form a cluster of patterns before making any decision.
Lack of a clear direction on where the company is headed
It’s fulfilling to be a part of a company that does meaningful work that creates value for its users. Joining such a company means you want to contribute to helping the organization meet its goals, which invariably means the organization must have some goals, right?
I was contacted by a startup based in San Francisco via AngelList. I had a first introductory call with a recruiter from the company, closely followed by a phone screen technical interview.
In both interviews, even though the interviewers shared some details about the company, there was a lot of vagueness about the company’s direction and where it was headed.
I particularly remember one question I asked at the time about how the company would deal with its growing competition. Sadly, the answers I got didn’t seem convincing, and the company was later acquired by that competition.
When you are interviewing to join a company, you are selling more than just your skills; you are also selling yourself and your unique experience. While it’s important to do that, I think it’s equally important that the company be able to sell you on its vision and what it hopes to achieve.
When I think of joining a company, I picture myself in that company for the next 2–5 years. If my vision for where I want to be in my career doesn’t align with the company’s vision, that is a mismatch that shouldn’t be ignored.
We sometimes focus more on securing the job and even though that is very important, even more important than getting the job is staying fulfilled on the job. For me, fulfillment meant joining a company that had a clear vision of where they were headed, working in a role that was critical to the company’s business while being equipped with a lot of growth opportunities.
Hopefully, these red flags I have shared will equip you to make better decisions on what companies you choose to grow your career with. I would generally not advise making a decision based on one or two red flags, but if you see a cluster of red flags, you shouldn’t ignore them. I wish you the best in your career journey.
If you ever need someone to do a mock interview with you, feel free to schedule one here or you can reach out directly to me on Twitter @meekg33k.
And if you’d like a list of things to ask companies while you’re interviewing that may help you identify these red flags (and others!) sooner, take a look at this one.
If you have something to say about your adventures in interviewing or hiring, write a guest post on our blog! Please email me at firstname.lastname@example.org to get started.
I started interviewing.io because I was frustrated with how inefficient and unfair hiring was and how much emphasis employers placed on resumes.
But the problem is bigger than resumes. We’ve come to learn that interview practice matters just as much. The resume gets you in the door, and your interview performance is what gets you the offer. But even though technical interviews are hard and scary for everyone — many of our users are senior engineers from FAANG who are terrified of getting back out there and coding up the kinds of problems they don’t usually see at work while someone breathes down their neck — interview prep isn’t equitably distributed.
As you may have read, if you follow interviewing.io news, COVID-19 turned our world upside down. In its wake, the pandemic left a deluge of hiring slowdowns and freezes. For a recruiting marketplace, this was an existential worst nightmare — in a matter of weeks, we found ourselves down from 7-figure revenue to literally nothing. Companies didn’t really want or need to pay for hiring anymore, and we were screwed.
Then, we pivoted and started charging our users, who had previously been able to practice on our platform completely for free (albeit with some strings, more on that in a moment). While this pivot was the right thing to do — without it, we would have had to shut down the company, unable to provide any practice at all — charging people, especially those from underrepresented backgrounds, didn’t sit right with us, and in our last post announcing our model, we made the following promises:
We’d ALWAYS have a free tier
We’d immediately start working on a fellowship for engineers from underrepresented backgrounds or in a visa crisis/experiencing financial hardship ← That’s what this post is about!
We’d find a way to let people defer their payments
We launched with a free tier, and it’s still there and going strong. We’re still working on deferred payments and are in the thick of user research and price modeling.
But the rest of this post is about the second promise. To wit, I’m so proud to tell you that we’ve officially launched the first (pilot) cohort of the interviewing.io Technical Interview Practice Fellowship. This cohort is focused on engineers from backgrounds that are underrepresented in tech. We are acutely aware, of course, that our first cohort couldn’t capture everyone who’s underrepresented, that gender and race aren’t enough, and that we need to do more for our users who can’t afford our price tags, regardless of who they are or where they come from.
Our hope is to expand this Fellowship to anyone who needs it.
We’re also working on the much harder problem of how to navigate the visa situation we’re in right now (different than when we wrote the first post, sadly… but especially important to me, given that I’m an immigrant myself).
What is the Fellowship, and why does it exist?
Before we tell you a little bit about the Fellows in our inaugural cohort and what the Fellowship entails, a quick word about why this matters.
In order to get a job as a software engineer, it’s not enough to have a degree in the field from a top school. However you learned your coding skills, you also have to pass a series of rigorous technical interviews, focusing on analytical problem solving, algorithms, and data structures.
To account for these realities, the best strategy to maximize your chances of success is to practice a lot, so you can 1) get better and 2) accept that the results of a single interview are not the be-all and end-all of your future aptitude as a software engineer, and that it’s OK to keep trying.
The main problem created by modern interview techniques is that, despite interview practice being such a critical prerequisite to success in this field, access to practice isn’t equitably distributed. We want to fix this, and we’re well equipped to do so. Based on our data, engineers are twice as likely to pass a real interview after they’ve done 3-5 practice sessions on our platform.
Our Fellows will get these practice sessions completely for free. These will be 1:1 hour-long sessions with senior engineers from a top company who have graciously volunteered their time and expertise. Huge thank you and a big shout-out to them all.
After each session, Fellows will get actionable feedback that will help them in their upcoming job search, and we will be helping Fellows connect with top companies as well.
Note: We’d like to be able to offer even more support – and are actively seeking more partners to do so. Please see the What can I do to help? section below if you or your organization would like to get involved!
Now more than ever, the world seems ready to have a conversation about racial, gender, socioeconomic, and other kinds of equity in hiring. This is our small part of that conversation.
Who are the Fellows?
After opening up our application process, we received close to 1,000 submissions in a week, and (though it was really, really hard) we culled those down to 56 Fellows.
Our first cohort is:
82% Black, Latinx, and/or Indigenous
55% senior (4+ years of experience) & 45% junior (0-3 years of experience)
Here are some of their (anonymized) stories. There were a lot of stories like these.
My goal is to keep pressing as well as to share and give to underrepresented communities because the journey in tech can be isolating. Often I am the only one. It is critical that there are more people that look like me that are engineers *and* ascend the leadership ladder.
My parents immigrated from [redacted] to The Bronx without a formal education. I’m the first individual in my household to graduate from college and I’m the only Software Engineer in my family. I grew up in a poor neighborhood where many individuals had limited economic and educational opportunity. I aim to make the path to become a Software Engineer easier for those who were in my situation.
My journey to becoming a software engineer almost never happened. Throughout my undergraduate studies I was faced with having to drop out multiple times, due to the immigration status of my parents…. I was tasked with assisting in my family’s living situation and paying for school. I worked full time and started my own construction company in order to take care of my family and studies. It was always tough having to work 8-10 hours a day and then going to class or doing homework… Becoming a software engineer was always a goal of mine, and realizing that goal was well worth the struggle, given the struggle my parents went through to bring us here in the first place.
I spent 5 years in public education working directly with marginalized communities in the struggle for equity. My journey through software engineering is a continuation of this spirit of advocacy and changemaking. Software engineering is a tool to be put at the service of advocacy.
What can I do to help?
There are a number of ways you can help and get involved!
Help sponsor future Fellowship cohorts & create scholarships for underrepresented engineers!
Every Fellow in this first cohort represents at least 100 more engineers who aren’t in it. We have the tech to scale the hell out of this program; all we need is backing and resources from people or organizations who recognize there’s a need (donations are tax-deductible). Please email email@example.com if you’d like to get involved or want more information.
Hire through us!
Despite mounting evidence that resumes are poor predictors of aptitude, companies remain obsessed with where people went to school and worked previously. On interviewing.io, software engineers, no matter where they come from or where they’re starting, can book anonymous mock interviews with senior interviewers from top companies. We use data from these interviews to identify top performers much more reliably than a resume, and fast-track them to real job interviews with employers on our platform through the same anonymous, fair process. Because we use data, not resumes, our candidates end up getting hired consistently by companies like Facebook, Uber, Twitch, Lyft, Dropbox, and many others, and 40% of the hires we’ve made to date have been candidates from non-traditional backgrounds. Many of our candidates have literally been rejected based on their resumes by the same employer who later hired them when they came through our anonymous platform (one notable candidate was rejected 3 times from a top-tier public company based on his resume before he got hired at that same company through our anonymous interview format).
Buy an individual practice session for someone who can’t afford it
If you know engineers who need interview practice but can’t afford it, use our handy interview gifting feature. Interviews are $100 each. They’re not cheap, but we have to price them that way to pay for interviewer time (our interviewers are senior FAANG engineers) and cover our costs, which sadly means practice interviews aren’t affordable for everyone. Even if you can’t get involved in funding interviews at scale, you can still buy someone an anonymous mock interview or two individually. It’s the best gift you can give an engineer who’s starting their job search.