Pair Programming with Ben Adida
So I'm Ben. I'm head of engineering at a company called Clever, and I want to give you an opinionated point of view on how to do a particular kind of coding interview: the pairing interview. This talk is based on my experience at two companies where pairing interviews were already set up before I arrived. I'm not the one who put them together; I got to observe and see how they worked. So this is, based on those experiences, how I would design it myself: a combination of the things I saw and the things I would do. It's opinionated, so it may not align with the other talks you've heard, but that's OK. And if you have any questions, I'm happy to take them as we go. There are mics here, so if you're not in the front row, please use them to ask Ben your questions. Happy to do that.

Awesome. All right, so what is the goal of engineering interviews? And I mean that broadly, not just the coding interview. The goal for me is simple: is the candidate going to be an effective engineer in your organization? That question has many, many sub-questions. Some of them have to do with whether they're aligned with the mission of your company and have nothing to do with coding. But on the coding part, it's: are they going to be an effective engineer on your team? To figure that out, I like to think of the interview in a way that I compare to a medical test. If you know me (and I don't think I know anybody here, but if you did), you'd know I make this comparison to medicine quite a bit, in part because I spent some time in the medical system, but also because I think medicine has an analysis of probabilities, risks, and error bars that is worth bringing over to how we think about software engineering. Specifically, when you give somebody a medical test, when you're trying to evaluate whether they have a condition or not, doctors talk about two concepts: sensitivity and specificity. I always get them confused; I always have to think it through. Sensitivity is: if this person has the condition, does your test detect it? Simple question, right? You'd hope so; that's the point of the test. But how often does it fail to detect it? The other one is specificity: if your test detects something, how often is it the thing you were testing for, versus something unrelated that also gives you that outcome? Another way to think about it is false positives and false negatives. And by the way, those are flipped around.
If you're trying to match false positives with sensitivity, it's actually the other way around: false positives go with specificity. A false positive is you hire somebody and they don't work out. A false negative is you don't hire somebody who's awesome, and you never get to work with them. We always think about that first one; we don't always think about the second one. And in a world where it's a pretty competitive market to hire great engineers, that second one, missing out on a great engineer, is a real bummer. Hopefully, as you think about your interview process, you can think about that particular outcome. So again: missing out on great candidates is a problem with your interview process. Hiring engineers who end up failing is a problem too. Well, maybe we should just design an interview process where neither happens, right? You never hire an engineer who fails, and you never miss an engineer who's amazing. Great; if you can do that, then you should come up and give this talk, because it's pretty hard to do both of those things. Usually there's a trade-off. If you take more risk on somebody, you have less chance of missing the good people, but more chance that the person doesn't work out with your team, and the other way around: if you're super, super strict, then great, maybe you won't have any problems with the people you hire, but you might miss out on a whole bunch of people who didn't quite match what you thought you were looking for, or whose fit with your team your interview process just did not highlight. So I'm telling you all this to tell you how I think about it as you analyze how you want to hire: I prefer to take risks on people rather than to miss great people.
Because ultimately, if you're building an organization that has accountability and performance reviews, you have to be able to let somebody go if it doesn't work out; that already has to be something you can do in an organization. So why not take some risks? Why not see if you can get that great engineer who doesn't quite present in the way you expect, but turns out to be amazingly effective afterwards? As you think about that trade-off, I prefer to take more risk. That doesn't mean it's the right solution; that's just how I go about it. And the reason I'm telling you all of this generic stuff about interviews is that I think the key place where you miss out on great people is the coding interview. The coding interview tends to be sufficiently artificial, sufficiently different from how people code in an actual job, that there are many ways in which how you evaluate somebody in a coding interview can be completely unrelated to how they will actually do in the job. So if your goal is to determine how well a candidate codes in a realistic setting, then you're getting closer to what you need. You want that realistic setting. I want that realistic setting. I told you this would be opinionated, right? That's what I want: as realistic a setting as possible, so that I can try to get the engineers who will actually be effective. So why is this important? Some more controversial, opinionated points here. Most software roles don't require deep algorithmic, database, or other specialized thinking. I'm not saying don't test for those. I'm just saying keep in mind that the deepest algorithmic question the engineer will probably face with you is the interview question. After that, when they work with you, chances are they're not going to do big-O analysis or a whole bunch of those things most of the time. Occasionally they will, but not that often.
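To make the earlier medical-test framing concrete, here's a toy sketch of how sensitivity and specificity map onto hiring outcomes. All the numbers are made up for illustration; "condition" here means "would actually be a great engineer," and "test" means your interview process.

```python
# Toy illustration of the medical-test framing for interviews.
# "Condition" = would actually be a great engineer; "test" = your interview.

def sensitivity(true_pos, false_neg):
    # Of the genuinely great candidates, what fraction does the interview pass?
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    # Of the candidates who would not work out, what fraction does it reject?
    return true_neg / (true_neg + false_pos)

# Hypothetical numbers: out of 100 great engineers, the process passes 60;
# out of 100 poor fits, it rejects 90.
print(sensitivity(60, 40))   # 0.6: the process misses 40% of great engineers
print(specificity(90, 10))   # 0.9: 10% of poor fits still get through
```

The trade-off in the talk is exactly the tension between these two numbers: tightening the bar raises specificity (fewer bad hires) while usually lowering sensitivity (more great engineers missed).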
Also, the second point: how well someone navigates code and their dev environment, how well they debug, how well they test, and generally how at ease they are in the coding process is a critical component. And if you don't see that in a natural, realistic environment, you're not going to get a good gauge on it. And finally, to repeat that point again: so often, what you're doing in an interview is very different from what you're doing when you're actually coding. So the goal is to test somebody in a realistic environment. The non-goal is to determine how somebody codes in a contrived setting under duress. Having somebody sweating at the whiteboard, unable to quite tell what the race condition is: not a goal. I just don't think it's a useful thing. In other words, you're not trying to determine how well somebody interviews; you're trying to determine how well somebody codes, and that is a really critical difference. Sometimes people are really good at interviewing and not nearly as good at coding, or, worse in my mind, they're actually really good at coding, they just can't do it under stress. Those are the great people you miss out on. So how do we do that? Great, I just told you: do this thing that's super realistic, done, lesson over. I think you do it with a pairing interview, and if you're doing coding interviews, my take is: stick to pairing interviews in person. There's a small wrinkle on that that I'll tell you at the end. But if you're bringing people in to interview them: a pairing interview in a realistic setting, on an actual computer set up with a real dev environment, working with somebody else, like many companies do, with enough material and milestones that they can make some early progress and have some early successes.
And then, if they're really good, they keep making progress, so that time pressure is not the predominant thing on their mind. It's not "Can you build binary search in five minutes? Go!", because that never actually happens in your job. Never. How quickly you can code binary search is not a feature; it's a skill. Kudos if you can do it, but that's not what I'm hiring for. And tests: this is so important. You want something where tests are part of it, because when you're coding at the whiteboard there are no tests, and that's just not how people code today. So you want a completely realistic environment where all of those components are brought to bear. Again, I'm opinionated; I'm sure I'm contradicting the previous speaker, who also has wonderful advice. This is just one take. So why pair, specifically? The reason I think you want to pair is that the output is not what counts as much as the entire process. You want somebody watching what's going on. They hit a snag: how do they react to that? They're looking up some documentation: how do they do that? Are they looking at Stack Overflow for every single thing they need, or just for a couple of things they can't remember? The previous speaker talked about just telling the person the syntax. I agree with that; that's wonderful. That's not what you're testing for. In a normal setting, when you're working with somebody, if you forget something, they're there to remind you; if they're not there, Google is. But usually there's somebody there to remind you. So how well do they navigate code? Give them some code that they're going to have to augment. How well do they navigate the existing code? Watch them work through that, ideally in their preferred editor. How comfortable are they with that? Make them feel at home.
How well do they debug and test? If you provide, ideally, a problem that has some boilerplate code written and some tests, hopefully the first thing they do is run the tests. Let's run the tests. OK, they pass. Great. Break something and see whether the tests still pass. You can tell so much about how somebody codes just by putting them in front of a code base and telling them how to run the tests, how to run the program, and where the files go. What's their first instinct? I think you'll uncover so much more in those first two minutes than you would in any other kind of coding test. How do they work through common issues, if you prepare a problem that tends to have off-by-one errors? Is it zero-indexed or one-indexed? Those are the things you hit all the time as an engineer. How do they deal with that? Are they accustomed to it? How they deal with it is going to tell you: are they doing this on a regular basis? Is this a skill they really have in their bones from experience, or are they still at the beginning stages? And by the way, depending on whether you're hiring junior folks or more senior folks, there's no one right behavior or answer. But you can tell from these things how they write code. You learn something about their style. Are they test-driven-development engineers who say, "Wait, I'm going to write some tests before I do anything else"? I'm not saying they have to do that; that's a style. But you learn something about them, and that's important. Do they write a bunch of code, then realize there's duplication and refactor? Or do they think about it very thoroughly and decide what all the functions are going to be before they actually write? All these things are important, and again, there's no right answer. But you know how your team codes. You know what's working for your product, and that way you can get an idea of whether this person is going to fit in well with your team, right?
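As a concrete illustration of the kind of scaffold described above (boilerplate with tests that invite off-by-one thinking, where "run the tests first, then break something" tells you a lot), here's a hypothetical example. The function and its tests are invented for this sketch, not an actual Clever interview problem.

```python
# Hypothetical interview scaffold: a function that lives squarely in
# off-by-one territory, plus tests the candidate is shown how to run first.

def first_index_at_least(xs, target):
    """Return the index of the first element >= target in sorted xs,
    or len(xs) if there is none.

    Classic off-by-one territory: is the range inclusive or exclusive?
    Zero-indexed or one-indexed? Changing `<` to `<=` below breaks it,
    and the boundary tests catch that immediately.
    """
    lo, hi = 0, len(xs)          # half-open search range [lo, hi)
    while lo < hi:
        mid = (lo + hi) // 2
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return lo

def test_first_index_at_least():
    assert first_index_at_least([1, 3, 5, 7], 4) == 2   # 5 is the first >= 4
    assert first_index_at_least([1, 3, 5, 7], 1) == 0   # boundary: first element
    assert first_index_at_least([1, 3, 5, 7], 8) == 4   # boundary: past the end
    assert first_index_at_least([], 1) == 0             # boundary: empty input

test_first_index_at_least()
```

The point isn't the algorithm itself; it's watching whether the candidate's first instinct is to run the tests, and how they reason about the boundaries when asked to extend the code.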
Or maybe you want somebody who's doing things differently from your team. Maybe your team is not that good at testing, and you see somebody who's really good at testing and think: great, I want that person. You're going to be able to see that by watching them code. And then: how well do they break down a problem and make rapid progress? You're giving them a maybe slightly fuzzy problem and asking them to talk through it, because you're there, and hopefully you see them break it down. Again, if you've got little milestones along the way where they can get some accomplishment in 10, 15, 20 minutes, time pressure doesn't become as much of an issue, and you can see them work naturally. So let me pause for a second. Any questions here?

Q: Personalizing the development environment: how much time do you spend on that?

Great question about personalizing the development environment; I'm going to answer that on my next slide, which is perfect. You're totally in flow with what I was presenting. Any other questions?

Q: How much time do you recommend for a pairing interview?

It's a pretty significant investment of time. It's at least an hour; sometimes it's an hour and a half if you have a mode, the way we do it at Clever, where part of it you're pairing with somebody and part of it you're on your own: you pair, you do some stuff on your own, then you pair again. That tends to be more of an hour-fifteen, hour-twenty process. Forty-five minutes is a little tight if you really want them not to be under time pressure, because you have to describe the problem and set them up with a computer. So at least an hour.

Q: Do you worry at all that you're testing for experience with TDD and pairing?

Yeah. So I'm not looking for TDD.
I'm saying that if it's something you care about on your team, then you can see it. In that case, you just get a sense of their style. I don't tell them how to do it; I don't give directions. I just say: this is the code, this is how you build, this is how you test, and go. I don't do TDD myself. I have some engineers on my team who do. I don't think it's something you have to do or not do; it's just a style preference. So I'm not trying to check those things off as yeses; I'm just trying to understand who they are as an engineer. Yes?

Q: Do you typically have multiple languages or problems available, depending on the candidate?

Great: multiple problems, multiple languages. The questions are exactly what I want to talk about next, which is the nitty-gritty. What does this pairing interview actually look like? The big takeaway, if there's only one takeaway from everything I just said, is that doing this right is a huge investment of time. You're not going to be able to wing a good pairing interview. And if you want a pairing interview that's fair across multiple candidates over time, that's extra work. So the first thing is the environment. I told you this would be opinionated; this is how I would do it if I set it up from scratch. Set up one or more (if you're interviewing a lot of people) separate machines that are dedicated just to this. This is not a developer's machine that you're going to temporarily use for a coding interview; this is a machine you're dedicating just to pairing interviews. You're going to set it up in a standard environment. You probably have to pick Mac, Windows, or Linux; you're going to have to pick. But you can install as many development environment options as you can muster: Emacs, vi, Sublime, Visual Studio Code (the new one from Microsoft that everybody loves), whatever you can put on that machine. Great.
Because the one thing you do want to do to make an engineer feel at home is give them at least the text editor they're used to. And if they want to use an IDE, great. We tend not to do that at Clever, but that's OK too. If you can have the text editor, if you can have the IDE that somebody is used to: wonderful. This sounds so simple, but it's so incredibly important: the reset script that puts the computer in the right state, because you know you're going to mess it up between interviews. You want the computer to be in a predictable state, and you're going to want to test that reset script a few times, so that when an engineer is about to be interviewed, you can take the laptop (or, if it's a desktop, just sit down), run the reset script, boom, and you know you're good to go. You're not going to have stray files or weird, partially compiled code from a previous interview; that would totally mess you up. So you need that reset script. Pretty simple, but important. Also: a clear process for bundling and submitting the code. You probably want a copy of the person's code when they're done, and you'll probably put it in your applicant tracking system. So you need to decide how that's going to happen. Not "oh, I guess you can zip it and email it... wait, there's no email client... OK, it's a mess." You have to know exactly how that code is going to get submitted; you can't make it up on the spot. And then you want to give the candidate a heads-up before they come in. Hey, by the way, there's going to be a pairing interview, and it's going to be on a Mac. "Oh no, I only use Windows." I know, I'm sorry, it's going to be on a Mac. If you've got the resources to do Mac, Windows, and Linux: awesome. But that's the part where you're combinatorially exploding the number of things you have to prepare, and I haven't found that to be very doable.
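A minimal sketch of what such a reset script and submission bundling could look like. All paths and names here are hypothetical, invented for illustration, not what Clever actually uses; the idea is just a pristine template directory that gets restored between candidates, and a standard way to archive the candidate's work.

```python
# Hypothetical reset script for a dedicated interview machine.
# TEMPLATE holds the pristine, vetted problem (boilerplate + tests);
# WORKSPACE is where the candidate actually codes. Both paths are invented.
import shutil
from pathlib import Path

TEMPLATE = Path("/opt/interview/template")
WORKSPACE = Path("/home/interview/workspace")

def reset_workspace(template=TEMPLATE, workspace=WORKSPACE):
    """Wipe the workspace and restore it from the pristine template,
    so no stray files or half-compiled code survive between interviews."""
    if workspace.exists():
        shutil.rmtree(workspace)            # drop everything from the last candidate
    shutil.copytree(template, workspace)    # fresh copy of the vetted problem

def archive_submission(workspace=WORKSPACE,
                       dest_dir=Path("/opt/interview/submissions"),
                       name="candidate"):
    """Bundle the candidate's code into a zip so it can go into the
    applicant tracking system; returns the path of the archive created."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    return shutil.make_archive(str(dest_dir / name), "zip", workspace)
```

Run `reset_workspace()` before each interview and `archive_submission()` after; the point is that both steps are scripted and tested ahead of time, not improvised while a candidate waits.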
So I've used a Mac with as many editors as possible, and you go from there. But you want to give the candidate a heads-up so they know what they're dealing with, and ideally more than a day ahead of time, so they can be prepared and get a sense of what it's going to be like: you're going to come in, we're going to describe a problem to you for about 10 to 15 minutes, you'll have time to ask questions, then you're going to code with one of our engineers. Just having them in the right mindset, not surprised that they're about to be handed a laptop they have to code on, is pretty critical to doing this right. Yes?

Q: For their performance, when they're in an environment they're familiar with: would you go as far as having them work on their own machine, and just downloading something to start with?

The problem with that is variability. If they need a particular library for the problem you have, do they have it on their machine, or do they have to go install it? Do they have the right version of the compiler? There are all sorts of little variants where it could work really well, but it could also go completely sideways. So generally we've pushed back when somebody says they want to bring in their own machine. We say let's not do that, because 90% of the time it will go great, but 10% of the time it will be a disaster, and then how do we evaluate you? Any other questions? Yes?

Q: About half a year ago I switched back to Windows from Linux, and it took me about twice as much time to code on a Mac for the first several weeks. How do you handle that for people?

That's the imperfection of this approach; it's the one wrinkle I haven't quite figured out yet. Usually what we do is tell the interviewer ahead of time: hey, by the way, this person was informed of the test and told us they're not really familiar with Macs, so take your time with them.
Show them some aspects of it. But it's imperfect; it's the one aspect where I haven't quite figured out how to make it perfectly fair. Awesome. All right, a little bit more: the problem itself. This has to be an extremely well-vetted problem. You have to understand exactly what a very successful interview is going to look like and what the milestones are going to be. You have to test it internally on engineers who haven't seen it before, and you have to write boilerplate and tests in multiple languages. These are the five we use at Clever, and usually we find at least one that's a solid intersection between the person's skill set and ours. We don't use all five of these languages in production at Clever, but we don't care; we're not interviewing for knowledge of a particular language. If you have more resources and can build the tests and boilerplate in other languages, by all means do more. This has been a good enough set for us. Actually, we added Ruby recently: we didn't have a need until we had a few people who preferred Ruby, so we went ahead and invested a few days before the interview to make that work. Try something that's not super-standard textbook. If it's "write code to figure out whether there's a loop in this linked list," people have seen it before, and if you have high variability between people who've seen it and people who haven't, you're throwing your test off. Just like the medical test, it's no longer as sensitive as it was. So this takes a long time. You want to think about a problem that maybe is related to a standard problem but has a different, unusual twist on it. I'm not going to tell you Clever's problem, but it has that property: it's related to something people might have seen before.
But it's sufficiently different that basically nobody who's come through has seen it before. I think maybe one person in the last year had seen a similar problem. That's the sweet spot. I have seen, at one previous job, a place where there were lots of problems: every engineer would come up with their own problem, and they got really good at interviewing with that problem. So depending on which interviewer you got, you got a different problem. That works in terms of being prepared, well tested, and well explained, because each interviewer is very accustomed to giving that interview. But it doesn't work so well in terms of consistency across candidates: different candidates are getting different tests, and now your evaluation metric is different for different candidates. So how do you compare? Are you hiring the best for your team? My preference is to have a standard problem that you use with everybody and that you train multiple interviewers to give. Ideally, you have one person shadow another, so you can continually standardize how you're presenting the problem and try to present it consistently, in exactly the same way, so everybody's in the same boat. Keep in mind that even this is contrived. Some candidates will fail not because they can't code, but because it's contrived: because it's an interview situation, because they're wearing nicer clothes than they usually wear when they code, whatever the reason. There are lots of reasons why people might fail at this. So if you see somebody fail, and you get the sense that they didn't fail because they don't know how to code, but because it's a contrived situation, as uncontrived as you tried to make it: maybe earlier in the interview process you gave them a take-home problem that was not timed, where they coded without that constraint.
And you can go back to that and see if there's a giant difference: they did really well on the take-home problem and didn't do as well in person. Maybe that's an additional signal. Or maybe they've contributed to open-source projects, and you can compare what they've done on GitHub, and that gives you an additional signal. I think there's something really valuable in a successful in-person interview, but again, in the spirit of making sure you don't miss the great engineers: if something's not right, if you feel this person has better capability than they showed, then you've got these fallbacks, if you can set them up. And sometimes even a good person will just totally bomb one of the interviews during the day. They have a whole bunch of interviews, and who knows: they didn't drink enough coffee, or it was right after lunch and it was a bunch of pasta, so they fell asleep. It's very common for somebody to fail just one interview during the day. So if you can have two of these pairing interviews, which requires double the preparation work, it's pretty awesome. If they fail both, OK, that's a problem. If they do great on both, that's an amazing signal. And if they only do well on one, you have a comparison, and you can use one to correct the impression you got from the other. My ideal situation is two very well-defined, very well-rehearsed problems that are non-standard, non-textbook (but maybe related to a standard textbook problem), with really well-trained interviewers who can give them in a very standard way, with as much of a real dev environment as possible, one the candidate knows. And remember: the goal is to determine if a candidate can code, not if a candidate can interview.
And so this entire process is meant to get an impression of how well the candidate does as an engineer. I like to think about it, the same way as medical tests, like this: you're shining a light on an engineer and you see what kind of shadow they cast. This is just a metaphor, to be clear. Depending on the angle of the light, you get a lot of signal or very little. If you catch my profile from the side, that's a pretty recognizable profile; but if you shine a light from above, it's just a circle, it's not really helpful, you can't tell it's me. How do you shine a light, metaphorically speaking, on a candidate to try to get the outline of a good engineer? How do you do that? And again, please don't misunderstand me: I don't mean how they look; I mean, metaphorically speaking, how do you shine that light? And that means the process of interviewing requires constant iteration. You have to constantly ask yourself: did we get into a weird rut of presenting this question in a way that's not landing well with a bunch of engineers? You have to debrief on a regular basis and adjust how you're doing it. And because your interview, no matter how hard you try, is imperfect, you're probably hiring some people who are not as good as you hoped, and you're almost certainly missing people who could be great, because we're always worried about being that one person who said yes to an engineer who doesn't work out. We feel like: oh no, that reflects negatively on me. That's why I am always worried about the other part of the problem. How do you not miss that great engineer who doesn't present like a typical programmer in a hoodie and jeans? How do you find those great people? Hopefully, as realistic
an environment as you can create is the way to do that, because you want somebody who can code, not somebody who can interview. And that's it. Questions? Yes?

Q: You mentioned earlier that there are four on-site interviews, two of which would be a coding interview. Where in the day do you recommend they go? First thing, or spread across the whole day?

I don't have a strong opinion on that; I don't think I personally have enough data to tell. We have not been consistent at Clever in terms of a schedule where it's always at the same time, which I actually kind of wish we were, because I think it would put everybody in the same boat. So, no strong opinion, but I do think two out of four is a good ratio.

Q: And the person doing the pairing, what do they do afterwards? Do they fill out a written evaluation?

Totally. The way I recommend it, any interview, whether it's an architecture interview (which might happen at a whiteboard), a coding interview on a computer, or more of a culture interview (how are they as a teammate?), there has to be a write-up, a debrief, and you have to have very clear criteria that you're asking every interviewer to fill out. Not "hey, would you like to have a beer with them on the weekend?" That's not a criterion. I mean, it is, but not for hiring engineers; it's perfectly good criteria for finding friends. You want criteria like: are they challenging assumptions? Is that something you're looking for? Do you want somebody who challenges assumptions? Are they open to new ideas? You want to decide what's important to your team, and the criteria for your team might be different from the criteria for my team, and you want to ask your interviewers to rate according to those very clear criteria, the same way every time, so that you can be objective about this.
And you know, the other aspect is: is the interviewer in a good mood? Did they have their coffee? Because if they didn't, they might have a more negative opinion. So you want to formalize what that feedback looks like. It's got to be written, and it's got to be immediate. We're really, really big on "you have 12 hours to provide feedback," because if you forget, and four days later it's "oh, hold on, let me try to remember," your impression is completely off. You've got to do it really fast. Other questions?

Q: The process we use is phone screen, take-home, and then in-person interview, including some pairing. A question for either of you: there was a mention earlier about style, and I interpreted it more as style of coding: documentation, how well you leave the code for the next developer who will pick it up, how easy it is to compile and run a take-home exercise. Where in this process do you fit that in? Maybe that's a question for both of you.

Feel free to come up and jump in. So, different parts of the interview process are meant to test different things. I don't expect engineers to document their pairing exercise; they don't have to write a README. But when they do a take-home problem, they should have a README; they should have some explanation, some documentation. And the phone screen is meant more as a really quick check to make sure they can jump over the first bar. We're not asking people to spend a whole bunch of time before we know they have minimal coding skills. So the tech phone screen is really just a small obstacle at the top of the funnel. Did you want to add anything? OK. Yes?
So, in my mind, there's kind of a trade-off between consistency and predictability, especially with sort of an Internet's worth of communication. If your company is suddenly the size of Google and you only ask one question, even if it's very well vetted and everyone understands it, everyone understands it. So what can you do to sort of have this flow of interesting questions that don't become so well known that they're not a good signal?

So I think you've got a great separation of speakers here, because I'm assuming you have experience with a very large recruiting process, whereas my experience is more on the smaller-company side. So I don't have that problem; nobody knows what the Clever interview questions are. "Oh yeah, we know about them." But we can do things in more of an artisanal way in terms of preparing the questions and whatnot, whereas in a larger place you've got some different challenges, I'm sure.

Yeah, I mean, I think you're right. I think pair programming, as has been said, is a harder thing to pull off. I think it requires more training for interviewers. Interviewers are less experienced with it, and candidates are also less familiar with it as well. And then you have the problem of the questions getting out. You can scale it up, but it is much, much harder. You need a lot more scripts and things like that to get out the right problems, or you start saying, okay, if you try to do the same question for everybody, you're going to get this great consistency, but everybody will know your problem. So maybe at some point you say, you know what, we're not going to have this perfect an environment, we're not going to have this and that, people are going to bring in their own laptops, and you relax on
one aspect of the perfect pair programming interview, like being able to have interviewers come up with their own questions. And you know, I think you're right as well: there's no perfect process, there's no perfect question; these all have flaws. I will say, though, that we talk so much about the perfect process for pair programming interviews; just imagine that you're hiring in marketing, right? So much harder. I actually think it's much easier to test programming skills, so put it in perspective.

That's true. And I think you're on to something that I didn't highlight and really should have highlighted, which is that it is a very different process for different sizes of company and different notoriety of company. At a large, Google-sized company, you're going to have to think about, like, everybody's researching exactly what it's like to interview at Google, and so you have to deal with a whole bunch of things that, frankly, I don't have to deal with. So you should have a recruiting process that fits your company.

And also think about: if you have a company like Google, how do you hire new grads? A lot of times smaller companies just say, we're just not going to hire new grads, and it's a lot easier to pull off more practical tests when you're not hiring new grads. When you start hiring new grads, and that's a huge part of your hiring process, you have a lot of people who have never used a debugger before, because no one ever taught them that stuff. That doesn't mean they're necessarily bad, but they're just not walking in the door with the right things.

One thing I'll say, around culture, to think about: I think pair programming is a great thing, and I think more companies should consider it.
I think a lot of companies just default immediately to the coding-algorithm interview, like, "I'll just do what Google does," even though they're 20 people, and, you know, it'll be fine, right? Pair programming is something more companies should consider. But the thing to realize is that it's not either-or; you can do both. You can say, you know what, this algorithm thing, we really want to get people who are really, really bright, but we also want to do more practical coding tests. You can do both. You don't have to do just one or just the other.

We actually have a good online question that came in here, just to clarify. This one came from Casey, who says: What is the role of the pairing engineer? Should I actually try to solve the problem with the engineer? Or should I role-play like we're working on the problem together? Do I slowly dole out hints and help them?

That's a great question; I did not cover that. Good call. So, the role of the person you're pairing with: obviously, that person knows the problem really, really well; they're probably really tired of this problem by now. And so I don't think role play works. I think you have to go in assuming, like, yes, of course this person knows exactly how to code this backwards and forwards in 50 different languages. So the role is to provide maybe a little bit of assistance on the syntax. Ideally, you can pick an interviewer based on what programming language the interviewee is going to use, because you've talked to them ahead of time, and hopefully you can pick an interviewer who can help them with that. So if they come upon a syntax thing, the interviewer can help. And then, yes, a few hints along the way, if they're not quite finding a solution, I think is very appropriate.
But I wouldn't role-play like, "Wow, I've never seen this problem before." I don't think that would work very well.