Hey guys, today we have Ammon Bartram, co-founder of Socialcam and Triplebyte, and he's here to talk to us about hiring. Could you give us a quick intro about what you've worked on? Sure. So I joined Justin.tv fresh out of school in 2009, when it was just 25 folks, and went through the rollercoaster of the early days of Justin.tv. There I worked mostly on the video system, and that was where I got my first taste of hiring. At the end of that period we were hiring pretty aggressively, and that was when I first realized how much noise there was in the hiring process. Then I was part of the spin-off of Socialcam, which was a video-sharing app. I did that for about three and a half years, and we were acquired by Autodesk in 2012. I worked there until 2014, then took a bit of time off and started Triplebyte. Cool. And Triplebyte, just for context for people, can you explain? Sure. We're a recruiting startup: we help startups hire engineers. Engineers apply to us, we do a full interview with them, pass the engineers who are good, and match them with companies where they have a high probability of doing well. Cool. So people ask us a million questions about hiring and recruiting, all of it. In general, let's assume you're an early-stage startup. What should companies be looking for in an engineer? There's not a crisp answer to that. The pithiest answer I can give is that you have to decide what it is you want to look for, and then you have to effectively look for that. Actually, I think there's an important truth here. Most companies think that they are trying to hire good engineers. That's what they say to themselves. What they don't realize is that company A's definition of a good engineer is significantly different from company B's.
And so what you have is a situation where everyone has this definition in mind, and they're all different. This is the big source of noise. For example, say one company thinks that engineers need to be very fast and productive and be able to show that in an interview, and a different company thinks that engineers need to be very intellectual, able to deeply understand computer science topics and talk crisply about them. What happens is that all the awesomely productive, very practical engineers who aren't necessarily strong academically fail when they apply to the second company. And all the very academic engineers, who could totally solve all your hard problems but maybe aren't quite as flashy in a web environment, fail when they apply to the first company. If a company is at a bit of a larger stage, I think the obvious answer is that you want to hire both of those people. And so it's about building a process that can identify more broadly different types of skill.
Engineering Job Interview Tips And Insights
The most important thing a company can do when vetting engineers is to have a process that can identify different types of skill (03:11)
For smaller companies, you're more in a situation where you may well actually only want one of those folks. You may well want the person who's going to come in, be productive, and bang out code. And if that's the case, you need to realize that it's not important that everyone you hire be strong in computer science. If instead you're a company facing security issues, where really clean, really precise code and solving hard problems are important to you, then even at an early stage it probably makes sense to have a process that skews more in that direction. And so do you have general advice for people who come to you at Triplebyte, or to you as a friend or advisor, for diagnosing what kind of engineer is a good engineer for their company? What do you tell people? We don't, actually. It's funny: we're rarely in that situation. Most people have strong preconceptions, so we're more often in the situation of broadening people's vision of what a skilled engineer can be. But let me circle back to the question. The mistake that's easy to fall into is that when people are interviewing an engineer, they tend to ask about the things that they themselves are best at. There's this overlap between the things you're best at and the things you think are most important. Every engineer thinks the things they know are the core of the discipline. So you ask everyone you interview those questions, you bias yourself toward hiring engineers who have those skills, they join your team, and they ask the same type of questions in their interviews. And so the whole organization can grow in a direction that might not make sense.
It's complicated, because there are plenty of examples of companies with strongly defined engineering cultures that have worked out very well. Google, for example, has, intentionally or unintentionally, grown very much in a computer science direction, and that's obviously worked out very well for them. There are other companies, some in YC that I don't want to name, that take the complete opposite, very human, productivity-friendly approach, and are also extremely successful, excellent companies. So there are success cases on both sides. But when you're hiring your first employees, I think you need to basically decide what's holding you back. Yeah. And so say I'm trying to vet a pool of engineers and they all fit the rubric I've created, but maybe one of them did a bootcamp and has some projects, and another went to a great school and has a CS degree. How should I think about credentials and experience? Bootcamp versus CS degree, I don't think they're all that different. Now, okay, that's obviously a forceful statement. I think experience matters more than where you got your education.
Having experience matters more than where you got your education. (06:27)
So someone fresh out of a CS program who doesn't have internships is still essentially a junior engineer. They may have a more academic slant to what they've studied than someone out of a bootcamp, but both of those people lack real-world experience. I'd categorize them differently from someone who has worked in the industry for five years and can own projects. The skill that you can most easily measure in an interview is the ability to think through small problems quickly. That's really what interviews can evaluate. The skill that you need in an employee is quite different: the ability to solve large, sprawling problems well over a long period of time. There's obviously a correlation there. We use interviews as a proxy for evaluating actual skill because that correlation exists, but the correlation is not perfect. An interesting observation is that people who are fresh out of university or bootcamps are in many cases, because they've been practicing, better at the kind of problem that gets asked in interviews than the very senior engineer with ten years of experience at a large company. What the senior engineer typically has is experience making architectural decisions, owning projects, gathering requirements, and carrying that whole process through. So how do we evaluate for that? It's super hard. What ends up happening, and this is honestly unfair, is that experience gets used as a proxy for it. This is something that we're focusing on a lot at Triplebyte, but the strange reality is that if you have five years of experience, it's just flat out easier to pass an interview.
You will get a job offer after a worse performance. People might think that senior engineers, being senior, should have to perform better in the interview, and that's actually not generally true. The bar to getting a job offer goes down when you have an impressive-looking resume. And this is not irrational on the part of the companies; it's just the reality. Sure. Okay. And so when you're pre-screening these people for Triplebyte, for example, what are you looking at? What are you having them do? The approach that we take is to evaluate as much as we can in isolation, and to be aware of what we're evaluating.
Gary's Challenge - Debugging (08:59)
So we explicitly evaluate programming, just programming productivity, given a relatively specced-out problem. For example, we describe an algorithm for solving a problem, not in mathematical notation, just as the set of steps you have to do. Can the candidate take that and render it into working, well-structured code? And interestingly, junior folks often do better than senior folks at that sort of problem. We then separately do an evaluation of academic computer science skills: is the engineer knowledgeable about computer science and about that approach to problem solving? Then, one thing we took from Stripe, actually, is a debugging section. We give the candidate a large codebase that has some bugs, and we ask them to take some time, dig into the codebase, and try to find and fix those bugs. I think this does a great job of solving some of those problems, because this is a skill that comes from experience and is often missed by more traditional interviews. And finally, we do a system design section: here are some requirements, design a production web system to satisfy them. Okay, now we're going to change the requirements; how do you adapt your design? How do you talk about trade-offs? And all of that's done remotely, because the person's at home, right? Yes, we do this all over Google Hangouts. Okay. And so what's a reasonable amount of time for someone to go through one of these exercises? Do they vary widely? Our interviews are about two hours in length, so we spend about 30 minutes on each section. Okay, cool. And you find that's a very strong data set in terms of correlating how successful they are? Yeah.
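To illustrate the spec-to-code idea, here is a minimal sketch of that kind of task. The spec and function name are hypothetical, not Triplebyte's actual question: the algorithm is given as plain steps, and the exercise is just to render it into clean, working code.

```python
def run_length_encode(text):
    """Encode a string as (character, count) pairs.

    The spec, as it might be described verbally: walk the string left
    to right; while the current character repeats, count it; when it
    changes, emit the character and its count, then continue.
    """
    result = []
    i = 0
    while i < len(text):
        j = i
        # Advance j past the run of characters equal to text[i].
        while j < len(text) and text[j] == text[i]:
            j += 1
        result.append((text[i], j - i))
        i = j
    return result
```

For example, `run_length_encode("aaabbc")` yields `[('a', 3), ('b', 2), ('c', 1)]`. The question is deliberately mundane; it measures whether the candidate can translate steps into well-structured code, not whether they can find a clever insight.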
Well, we've done about 2,000 of these interviews over the last year and a half, and so we've been able to drill in on the parts that are most predictive and cut time off the interview.
Trying too hard (11:01)
I think if you're starting from scratch, you would probably need about twice that amount of time to get through all that stuff. Okay. And so, having gone through all these interviews at this point, was there anything that you thought was really important in the beginning, or something that many people in the Valley think is important, that isn't actually important?
Noise in the job-candidate selection process (11:19)
Too much reliance on single questions. The classic interview format in tech companies is a number of 45-minute to one-hour sessions, and often the engineers pick the questions themselves. They're usually little nuggets of difficulty; they're sometimes pejoratively called brainteasers, though almost no one actually asks brainteasers. They're more like legitimate programming problems, but they are nuggets of difficulty. Like: given a string of parentheses, how do you know if they're well matched? Okay, given multiple types of brackets, how do you tell if they match? That's a classic interview question. And it turns out there's just a huge amount of noise. If you take a bunch of candidates and, in a controlled setting, have them all answer three or four of these questions, you'll see there's some correlation, but way less correlation than you would think. And companies, honestly, I believed this myself previously: you have this question, you ask people the question, you see the variation, and you assume that the people who answer it well must be smart, good programmers, and the people who get totally tripped up must not be. Then when you actually inspect that, you see there's this huge amount of noise. We have a pretty incredible lens on this, because we evaluate engineers pretty rigorously, then send them out to multiple companies, see what happens, and get detailed feedback. And is that feedback from the actual interview process, or do you also know how they're doing once they're placed? Both. Okay. But I'm mostly talking about the interview process here. Okay.
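The bracket-matching question mentioned above has a standard stack-based solution; a minimal sketch in Python (the function name is mine):

```python
def is_matched(s):
    """Return True if every bracket in s is properly matched and nested."""
    pairs = {')': '(', ']': '[', '}': '{'}
    stack = []
    for ch in s:
        if ch in '([{':
            stack.append(ch)
        elif ch in pairs:
            # A closer must match the most recently opened bracket.
            if not stack or stack.pop() != pairs[ch]:
                return False
    return not stack  # leftover openers mean an unmatched bracket
```

So `is_matched("([]{})")` is True, while `is_matched("([)]")` is False. The point in context is not that the problem is hard; it's that performance on one such nugget is a noisy signal.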
So we take engineers, send them to companies, they do the interview there, and we get feedback. And it's pretty incredible how much disagreement there is. A candidate who does really well at one company, where the feedback is "this is one of the best people we've seen in months, this is a rock star," goes on to fail an interview somewhere else. A pretty interesting stat we dug up: I compared the rate of agreement between interviewers at companies with a data set of users reviewing movies online, and the inter-rater agreement was basically equivalent. So knowing that an engineer did well at one company gives you about as much information about whether that engineer is skilled as knowing whether the New York Times film critic rated 12 Years a Slave as excellent or terrible. Okay. Maybe you don't have an answer to this, but say I'm really good at brainteasers. Where should I interview? Probably larger companies. And this makes a certain amount of sense, though this is all very complicated. That style of question always introduces noise, but we've found that bigger companies rely more on that type of interview, and they do that partly for rational reasons. Bigger companies care more about measuring your innate ability and less about measuring whether you can jump into their particular codebase and be productive on day one. It's way more likely that a smaller company is going to say: we're using Ruby on Rails, we need a very productive Rails developer, come take this interview, we're going to ask how well you can do this, maybe how well you can work on our actual codebase.
Okay. Whereas the big companies, Facebook, Dropbox, Apple, Google, are more likely to say: we care about smart people. Within the confines of the noise of the interview process, they're trying to identify intelligence rather than specific experience.
The ideal type of problem to be asked when being interviewed (15:00)
I mean, they also have the capacity and time to train people. Yeah, precisely. Whereas a small company doesn't. Okay. And then, one of my earlier questions: what about skills that people don't think are correlated with being a successful engineer, but that are strongly correlated? Relatively easy problem-solving. We have found that asking pretty easy interview questions is often more generally predictive than asking harder interview questions. To break this down, there are two sources of signal from asking a question. You get signal on whether the candidate comes up with the right answer, and you get signal on whether they struggle, how easy or hard it is for them to solve the problem. We score both of these things for a bunch of questions, and we've done this for thousands of candidates. If you look at how much the individual score on one question correlates with how the candidate does on the job, you find that, as you would expect, getting a question right is correlated with being a good engineer, and being able to answer a question easily is correlated with being a good engineer. But there are also, of course, false negatives: there are great engineers who fail individual questions or who struggle with them. And if you look at the actual predictive ability, rather than just the correlation of getting the answer right, the sweet spot is actually far lower on the difficulty scale than most people intuitively think. And so, can you give a couple examples of what those easier questions might be?
Preparing for coding interviews (16:54)
Yeah, sure. Something like: we want you to create a checkers game. No logic, nothing complicated, just a class that has a board, a grid with pieces on it, and pieces that move around. It's a really mundane, straightforward task. How well candidates do on that ends up being a more stable predictor of engineering skill than something like: here's a sentence that consists of a list of words all joined together, and given a dictionary, you find the optimal way to break it apart into words. That second problem ends up being a graph search problem that can be optimized with memoization or dynamic programming. Getting the second problem right carries more information than getting the first problem right, but with a really high false negative rate. So the first problem ends up actually being a better general predictor of engineering skill. And so, if I'm getting ready to interview at another company, do you recommend people train in any particular way? Or, because you're going for that sweet spot of easy questions, do you just have to be smart enough to do it? What do you tell people, in general? If I'm going to prep to do some interviews, what should I do? I guess there are two questions there: one about companies I think are doing a good job interviewing, and one about the status quo. In general, it depends where you're coming from. I have very different advice for new grads and for experienced folks. Okay, so let's break them apart. Yeah. New grads. Okay.
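The word-break problem described above can be sketched with the memoization idea it mentions. The dictionary and function names here are illustrative: memoizing on the start index of the remaining text collapses the exponential search into roughly quadratic work.

```python
from functools import lru_cache

def word_break(sentence, dictionary):
    """Split a run-together sentence into dictionary words, or return None."""
    words = set(dictionary)  # hash set for O(1) membership checks

    @lru_cache(maxsize=None)  # memoize on the start index
    def split_from(i):
        if i == len(sentence):
            return []
        for j in range(i + 1, len(sentence) + 1):
            if sentence[i:j] in words:
                rest = split_from(j)
                if rest is not None:
                    return [sentence[i:j]] + rest
        return None  # no valid split from this index

    return split_from(0)
```

For example, `word_break("catsanddog", ["cat", "cats", "and", "sand", "dog"])` returns `["cat", "sand", "dog"]`. Solving it requires the leap of seeing it as a search problem, which is exactly why it has a high false negative rate as an interview question.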
For new grads, I would say: make sure you're solid with the classic stuff, breadth-first search, hash tables, heaps. Classic core computer science. A surprising percentage of interview questions end up being slightly obscured applications of those, especially hash tables and breadth-first search. Those two things by themselves represent probably 40% of the questions asked at most companies, so you need to know them. But many new grads are already pretty solid on that, because they've been drilled on it throughout school. The second thing is practicing writing code under stress. Working out a big problem over time is very different from having 10 or 30 minutes with a marker at a whiteboard, or even a laptop to program on. The skills correlate, but you can improve your performance by practicing. So totally put in 30 minutes a day finding questions online, giving yourself a time limit, and trying to solve them in that stressful situation. And are there good resources people can look to, anything in particular? Yeah, the classic ones. Cracking the Coding Interview has a pretty good list of questions. The other advice in that book I don't think really applies to startups very much, but the questions are good. And there are a bunch of sites online that have lists; Interview Cake is one that I've seen that I think is high quality. An interesting aside, though, is that most companies actually want you to do these things. Companies would prefer that all their candidates prep; we totally would. Now, we try to design our interview in such a way that prep has no impact, right?
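Those two staples, breadth-first search and hash tables, can be shown together in one toy sketch (the graph shape, an adjacency-list dict, is my own choice of example): a queue drives the level-order search while a hash set tracks visited nodes.

```python
from collections import deque

def shortest_path_length(graph, start, goal):
    """Fewest edges from start to goal in an adjacency-list graph, or -1."""
    if start == goal:
        return 0
    visited = {start}            # hash set of nodes already enqueued
    queue = deque([(start, 0)])  # (node, distance from start)
    while queue:
        node, dist = queue.popleft()
        for neighbor in graph.get(node, []):
            if neighbor == goal:
                return dist + 1
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append((neighbor, dist + 1))
    return -1  # goal unreachable from start
```

With `graph = {'a': ['b', 'c'], 'b': ['d'], 'c': ['d'], 'd': []}`, `shortest_path_length(graph, 'a', 'd')` is 2. Many real interview questions are this pattern wearing a costume, such as word ladders or grid mazes.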
We don't really want to be measuring whether you've been cramming on algorithms. But what many companies want to measure is max skill, max potential. So they would actually much rather see you in a state where you're really prepared on the material than in a state where you have the potential to understand it but have forgotten it. Yeah. And an interesting trend happening in the industry is companies being more upfront about what they're asking. Facebook, for example, has started providing an interview prep class to everyone who applies, going over the material. Part of me finds that encouraging, because it's moving things in a better direction. But part of me finds it discouraging, because you have to take a class to figure it out. Right. I just wonder if it's filtering for the types of people who want someone to hold their hand the whole way through. Um, yeah. Okay. So say I am going to interview at a bigger company.
Cracking the Coding Interview (21:22)
Is there a way to prep to do well on the brainteaser stuff? Yeah: practice. Again, there are some words that get thrown around describing interview questions, but true brainteasers are pretty rare. Some companies probably still ask things like how many golf balls fit in a 747, but that's really very rare. Much more common is the application of a computer science idea to a practical problem. And there still is a leap of insight required in many cases; I think those are bad interview questions. Companies should try very hard to avoid questions where there's one thing that has to be grasped, and until it's grasped the problem feels impossible. But hard, practical application of a computer science topic represents the significant majority of questions at big companies. So when I was in college, I interviewed at one of those big management consulting firms, and they did ask all those questions. I spent like two months prepping and didn't get the job, and I did okay on the stupid ping-pong-ball questions. You don't have to feel bad. One interesting number we have is that the engineers who do the best at companies go on to pass about 80% of their interviews, but not 100%. Almost no one passes more than 80% of their interviews. So one big piece of advice to everyone is: don't feel bad if you fail. It is really not a referendum on your skill. I'm very happy to have not gone in that direction. So what about the role of projects, someone's portfolio of side projects?
Side projects & job applications (23:07)
Are there certain types of side projects that are attractive to companies across the board? Or is it more specific: say I'm applying for a job at Stripe and I did some payments-related project, and that would be more attractive to them? Let me answer that question, and then we'll talk about what I think the right thing for companies to do is. Companies don't actually pay very much attention to side projects, except at the screening stage. At the resume screen, when someone applies, companies decide whether they're going to interview the person at all. There's some adverse selection bias in who applies, and there's this big stream of candidates coming in, so they have to decide somehow. They do that based on resume screens, and it comes down to pretty dumb credential stuff: whether you've worked at a top company, whether you've gone to a top school, or, in some cases, whether you have a project they can tell at a glance is impressive. So a side project can help a lot there. But side projects are very rarely weighed in the actual interview, and I think that's probably the right decision. People who have side projects sometimes feel bad about this, but the reason it's the right decision is that most engineers don't have side projects. Most engineers have been working at a company where it's all proprietary code, and there's very little they can show; maybe eight out of ten engineers are in that situation. And having a consistent process matters; consistency is the first goal of an interview process. The big problem is that the process is usually not consistent. If you make it consistent, then you can optimize it.
And having this sort of other track, where you look at projects, introduces noise. It's also just really hard to do. You can't tell if someone spent a weekend on the project or has been working on it for the last ten years; we literally see both pretty regularly when people talk about their projects. "Something I did over a weekend" versus "this has been my abiding passion for the last ten years." Let alone who actually contributed. Yeah. And then things like code quality: it's startlingly hard to look at a big body of code and decide whether you think the programmer who wrote it is skilled. Again, there's so much context. You can't tell which bugs they spent hours agonizing over; no one is good enough to look at code and immediately find the bugs. And so, for all those reasons, side projects are useful mainly at the screen. If you're an engineer applying for jobs and you're being screened out a lot at the resume stage, doing projects probably helps. Doing projects is also a great way to increase your skills, and that will be reflected in better performance in interviews. But I don't think projects have a very big role in the actual interview. So what other things should I think about if I am being screened out? Say I'm only getting a callback from one out of ten applications. What should I do?
Does anything help you stand out in applications (other than resume projects, publications, internships, etc.)? (26:14)
Apply to Triplebyte. That's the answer. Otherwise, yeah, side projects help. It just sucks. It's not malice on the part of the companies; they're overrun with applicants, and so they use these crude filters. The big thing that we're focused on is trying to figure out how to directly measure skill, so that we don't have to rely on filters like where someone has worked or what school they went to. And what about things like location? For example, say I live in Salt Lake City and I'm interested in getting a job, possibly at Facebook. Should I put San Francisco on my resume and just fly out for an interview? Do you have general advice in that area? Big companies don't care at all where you're based; they fly people in by the hundreds every week. Okay. Smaller companies do show a slight preference for local candidates. So if your goal is to work at a small, say sub-20-person startup, you're probably at a 10 to 20% advantage if you're based in the Bay Area already. Okay, cool. So from the company side, there are a million different interview methods people use. Say candidates go through Triplebyte, get screened, and then they're going to do an interview: whiteboarding, pair programming, all that stuff.
It can vary... (27:34)
How do you feel about those? All the methods can work. Let me give a bit of an overview here. As I mentioned earlier, the core problem is the tension between the skill that can be measured in an interview, solving small problems quickly, and the skill that matters in a programmer, solving big projects over a long period of time. So the first approach you can take to interviewing is to say: okay, we're not going to do it; we're going to do trial employment, something like that. And that totally works. If you work with someone for a week, you have a far better read on their skill than anyone can get during a three-to-four-hour interview. The problem is that there's a pretty strong bias in who's willing to do trial employment. I mean, that's the adverse bias.
Expedited questions and bias in resume screens (28:25)
Many of the best programmers have lots of options, and if your company requires that everyone do a trial employment period, a lot of them are just going to say no. And obviously anyone who currently has a job can't leave for a week. Can't do it. Yeah. And of course, you're also committing a week of your own time, so you need some filter before the trial employment anyway. So I think in the end we're left with something like the famous line: democracy is the worst form of government except for all the others. Interviews are the worst way to evaluate engineers except for all the other options. You have to do it. It's fundamentally inaccurate, but you still have to do it, and the goal is to make it as accurate as possible. Once you're on that page, we see two sources of noise. We see noise that comes from companies being inconsistent; I talked about that a bit earlier. It is still too often the case that engineers are responsible for coming up with their own questions. If you're asking every candidate different questions and coming to a gut call, there's a far larger source of noise than anyone really realizes. If you picked any company that has that process and somehow had them re-interview their own colleagues in a blind fashion, they would likely have a 50 to 60% pass rate.
Critiques Of Big Tech Interviews & Discussion On Employee Retention
Why big tech interviews are flawed (29:41)
So 40% of their colleagues would be screened out. Yeah. And the solution there is just to be really consistent: make sure you're asking everyone the same questions and evaluating them in the same way. I think that's more important than what you're actually asking. The first step is to be consistent; the second step is to tweak the questions over time based on the results you see. Okay. Once you're doing that, the other source of noise we see is companies looking for different things. You have companies looking for super academic engineers, companies that want very practical engineers, companies that think all skilled engineers know how operating systems work, companies that only want to talk to people with experience in compiled languages, companies that hate compiled languages and think they're old and stodgy, companies that want people who use enterprise languages. It's a mess. Yeah. So the important thing is to untangle which of those are conscious decisions you're making about who you want to hire. If you're a banking company and you want a big focus on QA process and safe code, it probably makes sense to reject someone for being too much of a cowboy. If you're a social media company, your goal is to move really fast.
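Ammon's point about inconsistent, gut-call interviews can be illustrated with a toy simulation: model each interview as true skill plus fresh random noise, hire everyone who clears a bar, then blindly re-interview the "colleagues". The skill and noise spreads below are invented for illustration, not TripleBite data; with noise larger than the skill spread, the re-pass rate lands roughly in the range he describes.

```python
import random

random.seed(0)

NUM_CANDIDATES = 10_000
SKILL_SD = 1.0   # spread of true ability across engineers (assumed)
NOISE_SD = 2.0   # per-interview noise from ad-hoc questions and gut calls (assumed)
BAR = 0.0        # pass threshold on the observed score

def interview(skill: float) -> bool:
    """One noisy interview: pass if true skill plus fresh noise clears the bar."""
    return skill + random.gauss(0.0, NOISE_SD) > BAR

# Engineers who passed their original interview, i.e. current colleagues.
colleagues = [s for s in (random.gauss(0.0, SKILL_SD) for _ in range(NUM_CANDIDATES))
              if interview(s)]

# Blindly re-interview the same people with fresh noise.
rate = sum(interview(s) for s in colleagues) / len(colleagues)
print(f"blind re-pass rate among current colleagues: {rate:.0%}")
```

Shrinking `NOISE_SD`, which is what standardized questions and rubrics do, pushes the re-pass rate toward 100%; that is the whole argument for consistency.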
What Does A TripleBite Interview Look Like? (30:59)
You know, maybe you decide to have a culture where you want to move fast and break things, and you want to hire cowboys. Those are all logical decisions. But very often, companies make these kinds of decisions almost by accident. So it takes some introspection: deciding, okay, these are the people we want to hire, and then designing the process to look for them. And in your examples, whiteboard coding tends to skew toward the academic. It gives preference to people who are really good at breaking their thoughts down in a structured, academic way and expressing them in a small amount of code. So you often have people who are actually really productive, excellent programmers who look bad in a whiteboard interview. If you're not looking for academic skills, it probably makes more sense to put people on an actual computer and see how they actually work in their own environment. Okay. And so, I guess the underlying question for me is: could you engineer a perfect interview? I wonder, what does the interview for a job at TripleBite look like? I mean, I imagine you made it, right? Yep.
How People Engineer A Perfect Interview (32:09)
Okay. Well, first of all, all our candidates go through our regular process. We hire people out of our regular stream, and then we compete head to head with the other companies they've applied to through us, which is kind of fun. So they first go through the regular process, and we already have a pretty strong sense of how they are in those areas. But my advice generally is to decide which skills you preference. We preference a couple of things. Data and data analysis is pretty key to our business, so we preference people being comfortable and familiar talking and thinking about data. That skews a bit more academic than what many companies hire for. And then, because we're in the business of evaluating knowledge really broadly, we preference breadth of knowledge, I think to a greater degree than most companies need to. And so what does that mean in practice? Like, what questions would I be looking at? Yeah. So again, everyone first goes through our standard process. That gives us a pretty good read on practical programming output, general knowledge of computer science, and general system design. Then we do an additional follow-up on site with the candidates, and that goes much more into depth on data. Or if we're hiring for a different role (we sometimes hire folks who don't work directly on data), say a front-end developer, then it goes into depth on front-end development. So: here's a spec for a front-end project.
You have two hours, build it. Or if they're a backend specialist, here's a backend spec. Got you. Okay. And so as an engineer, should I be paying attention to every new thing that's coming out? Is that going to matter when I'm doing an interview, or should I pay just a medium amount of attention? Well, there's an interesting long-term answer. Interestingly, one class of people we see are those who thought about that same question 10 years ago and made the decision not to keep up. Now the industry has changed, and these folks are maybe still using, let's say, CGI, don't understand the modern web stack, and are indeed in a weak position in interviews. So I think the answer to your question is: on day one, it's not so important.
Employee Retention Lessons Learned (34:32)
Very few companies, and generally only smaller ones, directly evaluate flashy new tech. However, if you make that decision too forcefully today, give up on keeping up, and end up totally behind 10 years from now, then you probably are going to pay a price. Yeah. I mean, especially if you're actually interested in starting your own thing at some point, being on the edge really matters. Okay. Maybe this is kind of difficult to answer, but I wonder about employee retention, engineer retention. Can you vet for it? I think the average is something like 18 months for someone to stay around. Are there qualities that correlate with longer-term employment? I haven't looked at data on this recently, so this is going to be a little off the cuff. Yeah. I mean, just the obvious things: candidates who are excited about the mission and the actual company have a higher probability of staying than candidates who are chasing the highest paycheck. Of course, there are counterexamples; sometimes awesome engineers are looking for a place to really commit but also want to be paid fairly. So this is probably complicated. But the number one thing I would say is to look for engineers who are excited about the company and the job. Okay. Cool. So, kind of wrapping up: are there any books or resources that, if I'm an engineering manager about to run a bunch of interviews, I should really dig into and could get a lot out of? I have not actually found any books that I think are very useful.
This is going to sound arrogant, maybe, but we're in a field where it's so easy to say things that sound profound but are not true. I truly believe that something like 80% of what's written out there about interviewing just doesn't actually hold up. So for example, an idea that sounds really good, and that a lot of engineers love, is the claim that interviews don't make any sense at all and you should just look at the work someone has done in the past. Well, we tested this a bunch. We tried scoring engineers as they talked about past projects, even spending a full hour going into depth on a project, talking technical details, and scoring it. Sheer talking skill, the ability to spin a tale, ended up dominating actual engineering rigor, and this was far less predictive of job performance than giving them a relatively simple programming assignment. And that kind of sucks. I don't really like that that's the case. You can find so many articles out there saying it's stupid to ask engineers to do these interviews, why don't we just have them talk about their past experience. But if you test it, it doesn't hold up. And just for the sake of keeping things standard, what do you tell people to do when they're conducting an interview? Well, yeah, standardize. And interestingly, we tell people to be really careful about helping candidates. Certain candidates are a lot better at eliciting help without you necessarily realizing that you're helping them. It's something we've had to battle with a bunch, actually.
It helps that we're doing thousands of interviews, so it's easier for us to do this, but we maintain a decision tree of all the different directions an interview can go, and what help we're allowed to give and what help we're not allowed to give.
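As a minimal sketch, a standardized help policy like the one described can be as simple as a lookup table mapping observed situations to scripted responses, so "help given" stops being a per-interviewer source of noise. The question topic, triggers, and hint wording below are all invented for illustration; this is not TripleBite's actual tree.

```python
# Hypothetical standardized help policy for one interview question.
# Every interviewer gives the same scripted response to the same situation.
HELP_POLICY = {
    "stuck on data structure choice":
        "Ask: 'Which operations do you need to be fast?'",
    "asks if their approach is correct":
        "Say only: 'Keep going and we'll find out.' Never confirm or deny.",
    "15 minutes left, no working code":
        "Give the intended approach outright and record that a hint was used.",
}

DEFAULT = "No help allowed; note the request on the scorecard."

def allowed_help(situation: str) -> str:
    """Look up the scripted response for a situation; refuse anything unscripted."""
    return HELP_POLICY.get(situation, DEFAULT)
```

For example, `allowed_help("asks if their approach is correct")` returns the scripted non-answer, which blunts exactly the confident-statement trick described next.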
Steering Candidates Through Help Without Realizing It (37:37)
It's a big source of noise. Outside of doing a thousand interviews and standardizing, I'm not sure I have a really good fix for it, but be aware that some really charming candidates will do this. Okay, so what's a common way I might get help without you even realizing you're giving it? One is just being brave enough to ask. Being really friendly, yeah, and then saying something with confidence that's only sort of right. There's a natural instinct to add on to it and correct the error, and as the interviewer it's really easy to do that without realizing you're steering the person through the problem. So if you're going out for interviews yourself, you should do exactly that. I have a blog post on how to prepare for interviews, and I do recommend trying to do that, actually. Really? Oh, that's awesome. Yeah. I want to add a flip side to that, though, the negative side, which is that interviewing can turn into hazing. Interviewing is not just evaluation; it's also a shared rite of entry into a company, and some companies develop a culture around the interview being hard and unpleasant. As the interviewer, it's really easy to forget how much harder it is on the other side, answering the questions. It's so much easier to feel smart when you're asking the questions. And sometimes candidates get really flustered and miss an answer, and as the interviewer it's really frustrating when the thing is obvious, right in front of them, and they're just missing it and wasting your time, and you can get a little bit angry inside.
And it's just really important to stay away from hazing, from taking that anger out on them. You know, I'm generally against cutting an interview short, actually. Except when the candidate is in real pain, I think it's not worth doing. You save some time, but you damage your reputation; candidates really dislike it, and it's embarrassing. Okay. But definitely stay away from the hazing. So does that mean no crazy brain teasers? Does that mean not cutting them off in conversation? What does it mean? Yeah, it means all those things. It means no crazy brain teasers. It means not being mean, not getting slightly angry and aggressive in how you respond to their questions because you're frustrated by how poorly they're doing. Okay. And a trick we use that I think helps is, in the case where a candidate is totally failing the interview, to flip a switch in your brain and go from evaluation mode into teaching mode. Your full goal now is just to explain, as friendly as possible. Because generally, at that point, the other person has failed, right?
Ethical Considerations In Interviewing
Ethical Concerns When Interviewing Candidates (40:09)
Because this happens when the person has already essentially failed, at least the problem if not the interview. Sure. You've already decided in your head, okay, this person is not passing. Yeah. So I'm going to spend the remaining 15 minutes being friendly and explaining the answer to this question, sure, rather than continuing to try to elicit responses from them. And what about the dynamics? Do you advise one-on-one interviews, or how many interviewers per interview? Yeah. An interview panel definitely increases the stress. Yeah. So we max out at two on one. Training is important: if you're trying to keep interviews consistent, you need continual cross-calibration, and people need to watch each other's interviews. Two, meaning one interviewer and one shadower, is enough to do that. Going beyond that increases the stress, and I don't think it really helps. Cool. So if people want to follow up with you and ask questions, how can they reach you? Sure, my email is firstname.lastname@example.org. That's AMMON. Cool. Thanks, man. Thank you. Okay.