
How Optiver Hires Engineers & Scales Large Teams with David Kent

This is the first episode of HackerRank Radio, our new podcast for engineering leaders interested in solving developers’ toughest problem today: hiring the right developers. Hosted by Vivek Ravisankar (CEO & Cofounder, HackerRank). You can subscribe to us on iTunes and Google Play.


About David Kent, Automated Trading Systems Team Lead at Optiver

How do you get sought-after developers to join your team when people haven’t heard of your company? As a proprietary trading firm, Optiver encounters unique hiring challenges in FinTech, a competitive and high-performance industry. David Kent, team lead of the Automated Trading Systems team, shares his first-hand experience building a team of engineers and how he continues to grow it.

Listen to David as he reflects on his:

  • Biggest hiring mistakes
  • Ability to take on his biggest hurdles
  • Standard to produce an unparalleled candidate experience

Listen to the interview, or read the transcript below.

 


Episode transcript


Full transcript:

Vivek: Welcome to the HackerRank Podcast. I’m Vivek, co-founder and CEO of HackerRank. The goal of this podcast is to help you solve one of the toughest problems you’re all facing today – hiring the right developers. I’m very excited to have David Kent, who’s built a large engineering team at Optiver, a leading proprietary trading firm in the options space. Hey, David. Welcome on board.

David: Thanks, Vivek.

Vivek: David, I would love to know – we ask this of all our podcast guests – what was your biggest bug in production?

David: Oh, my goodness. Hopefully, nobody from the post office listens to this one. So this is something that happened when I was at Amazon. I worked in what’s called reverse logistics. Logistics is how you get products from the manufacturer into the hands of the customer, and reverse logistics is the backward flow of that.

So in this particular project, we were dealing with how to return overstock inventory and damaged inventory to one of our vendors. We were moving to a service-oriented architecture at Amazon, and so I built a service that would allow our inventory management teams to just send a request to us to take inventory and send it back to the vendor. So one of those teams did it where every time they’d have a bit of inventory, they’d send a request to our service, right.

So we had a process that was supposed to take all of the inventory and merge it into one big order. But that process didn’t run because of my mistake. At the same time, we had been working on a project to send returned inventory through the customer outbound pipeline. Previously, all the returned inventory would go through a separate process that was really manual, because it was typically something like 1,000 books loaded onto a pallet and put on a giant truck. But we wanted to optimize it by sending individual items through the customer pipeline.

So, those two things happened at the same time and resulted in us sending about 5,000 packages, maybe – and that might be a low number. We sent a massive amount of inventory through our customer outbound pipeline to one recipient in some little town in Pennsylvania.

Basically, the post office called us and asked like, “Hey, can you tell us when all of these packages are going to stop coming through because we’re having to have people work overtime to deal with like thousands of individual packages coming into this post office that are all going to the same location.” Thankfully, we just made some people angry. I still feel bad about it to this day because essentially we told them like, “Well, we’re Amazon, so I guess you just have to deal with it.”

Vivek: Are you still in touch with the Pennsylvania Post Office folks?

David: No. Thankfully, I did not have to talk with them. My manager handled all of that. But I was mortified when that happened. I still am.

Vivek: Thank you so much, David. Could you tell the audience a little bit about yourself, your background, and your work at Optiver?

David: I majored in computer science at Stanford, and then after that, I started working at Amazon.com. That was kind of my first, I guess, “real job.” I was there for about 4 years, and at the end, I was kind of the lead developer on my team. In my last two to three years at Amazon, I also did a lot of interviewing. So I was reviewing resumes, doing technical phone screens, doing onsite interviews, things like that.

Then, I left Amazon and moved to Chicago, and that was when I joined Optiver. Optiver was my first experience actually leading a team. When I started, I had only two developers on my team, and it grew from there. Through the years, probably every couple of years, I’ve taken on some sort of new challenge at Optiver – typically changing both the team that I’m working on and the size and scope of the problems I was responsible for.
At this point, I’ve had a chance to lead all the different areas of software development we have at Optiver. I currently focus on automated trading systems.

The other thing is that through all of my time at Optiver, I’ve put a lot of effort and focus on recruiting. Which was kind of a very self-serving thing, in a way, in that I knew that if I could hire really great people, then as a manager, I could just sit back and let them do awesome things. So yes, I’ve always put a lot of effort into making sure that we have really great people on our team.


Vivek: That’s really a good segue. Scaling the team is a huge challenge for many companies – not just startups, even large companies – as they think about scaling a team from 2 to 50 or 2 to 100. We’d love to know some of the things that you’ve done in terms of putting together your framework for identifying really good developers. Also, maybe you could talk about the mistakes that you’ve made during the scaling process.

David: Yeah, for sure. When I first got to Optiver, one of the things I felt was I was a little concerned at how easy it had been to get a job at the company. So the first thing I did was just work with the recruiting team to figure out, “Hey, can we add some sort of technical phone interview or something?” Because we had that when I worked at Amazon.

I remember I just had a really pleasant conversation with one of the people in recruiting and then they were like, “All right, great. Well, we want to fly you to Chicago tomorrow.” And I just thought, “Who knows? I could have no skills whatsoever and just be good at talking.” So when I got here, that was the first thing.

I think that what I’ve generally seen when it comes to approaching the interview process is that people tend to view it as a series of increasingly stringent filters. First, filter resumes: just get all the resumes that really aren’t applicable to the job at hand out of the way, and then at each step of the process you filter down the number of people more and more and more.
I think that this worked okay for us for the first few years that I was here, and I think it works really well for large, well-known companies. But for us, we started seeing ways in which it actually didn’t work. A lot of that was related to the size of our company.

So probably a lot of people that are listening to this podcast haven’t heard of Optiver because we’re a proprietary trading firm, which means we don’t have any clients. We trade our own money. We’re also a niche of the financial industry in that we’re what’s called a market maker, which is a middleman in a way.
People trade with us all the time, but they don’t really know that they’re trading with Optiver. They just know, “I want to buy this particular option or this particular future or stock or whatever it might be,” and we’re just the other side of the trade.

So given our small size and lack of brand recognition, we started seeing problems with viewing the interview process as if all we need to worry about is making sure that these filters are letting the right people through and that the people that don’t fit here don’t make it through the filters.


The problem we saw with that was that we would get candidates that would drop out of our process. We would schedule a phone interview with them, and then an hour before the phone interview, they’d call us and say, “Hey, I actually just took a job at Microsoft or Google or wherever, so I don’t need to interview with you anymore.” The reason was that there wasn’t anything unique to Optiver selecting the right people early on. We were just one of a dozen companies that they had sent their resumes to.

The other thing that would happen was that we’d get to the end of our process, and we’d have people with an offer from Google, Microsoft, a startup, and us, and we’d be their last-resort choice. I think that was a similar problem. So our learning from viewing the interview process in that way was that we really needed to look at it as a two-way interview throughout the process.

What I mean by that is, there’s the obvious one-way interview where we are the ones interviewing the candidate and asking, “Is this person a good fit for Optiver? Do they have the technical skills we need? Will they fit in well here? Do they have the technical instincts that meet our needs?”

But there’s a second part of the interview, which is that the candidate is also interviewing us. Throughout the process, the candidate is looking to see, “Is this a company that I would like to work at? Do their development philosophy and business philosophy and all of these things match up with the way that I view the world?” And because we hadn’t focused on that, a lot of candidates just weren’t seeing that Optiver was a perfect place for them, or they would get through the whole process and then we’d kind of realize, “Oh, we could have saved ourselves a lot of time if we’d made that more clear.”

So we started making the biggest strides when we started viewing every part of the interview process as a reflection of who we were as a company. For us what that meant was we wanted to make sure that everything from the first time the candidate looked at our website to the time that they applied to us, to the recruiting interview, to whatever screening process we had, reflected our values and the candidates that had a shared sense of technical culture would appreciate this.

One of the ways that this played out was we used to do these technical phone screens, and that was working pretty well – or at least we thought it was. But then we started doing some testing. One of the things we did was, myself, our CTO, and one of our people in recruiting did this secret shopper experiment. We got somebody our CTO knew very, very well – really smart, had gotten offers from like seven different companies in his most recent job search – and we just said, “Hey, would you go through our interview process and then critique it, take notes as you go through it, and we won’t tell anybody that you’re going through our process?”

So we did that and he failed the phone screen, which was shocking. This guy, he’s got a PhD, he’s one of the smartest people our CTO had ever worked with, he had offers from like Facebook, Google, Microsoft, he could’ve worked anywhere that he wanted to, he’d been at Intel for a long time, and he fails our phone screen. And we were just like, “Oh, my gosh. What is happening?”

Vivek: The other conclusion could be your bar is super, super high.

David: Yeah. It could be that, but we didn’t really think it was. Anyway, through a bunch of iteration we realized that we needed more eyes on every part of the process. We needed to make it reflect our values as a company better. And that was when we switched our technical screening to HackerRank.

For us, we didn’t just go in and pick five questions we liked, set a minimum score, and call it done. We put a lot of time and effort into the way we designed that test. We carefully selected questions, we put test cases on there, and we worked really closely and iterated over the course of… I mean, we’ve been tweaking our HackerRank test for two years at this point to get it ever closer to reflecting the values we have as a company, and I think that’s working pretty well.

So one funny anecdote is we’ve had a few people that open up our test, scoff at the questions that we ask, and then leave nasty comments on Glassdoor. Then we’ll go and read those, and most of the time we’re like, “Oh, okay. They missed the entire point of the question, so I guess it actually worked.” The question itself filters out people that don’t fit our technical perspective, I guess. That’s been a really good improvement to our process.

Vivek: Yeah, that’s definitely great to hear. I’ve been curious about this in general as we’ve built the company, and we’d love to have your advice on this as well. How do you make interviews less stressful? Essentially, what you’re really trying to do is make sure you see the best version of the candidate. That’s what you should really optimize for in most of the interview.

But somehow the current setup, or the way it is done, is similar to your CTO’s friend, who was actually super accomplished and failed a phone screen – I don’t know if it was because of the experience itself or he was just stressed for some reason. Have you thought about it, and what are your thoughts on that?

David: No, we definitely have because one of the takeaways we had from that experience with him was that we were kind of being jerks in certain parts of our process and we weren’t giving the candidate the benefit of the doubt. We were kind of sitting back and saying, “We’re a great company and if someone is really serious about wanting a job here, they should have put in a lot of prep and thought really hard about this.”

So one of the ways that played out was our recruiters would tell candidates before the phone interview, “Make sure that you have a pencil and paper,” because we would have them sketch out their ideas and notes and code on a bit of paper while we were talking with them over the phone.

Often the candidate wouldn’t have pencil and paper, or they’d be in a coffee shop, or there’d be a lot of noise. When that would happen, I think regrettably we held it against them way too much. We would say, “Oh, they didn’t take it seriously.”

Really, what was happening was that people have lives. They have to work. If you’ve got a family and you work nine to five and you need to do a phone interview, where are you going to do that? You don’t have that many options. You can do it at home, but it’s got to be during the workday most of the time. So are you going to do it at work? That’s a pretty bold move, to book a conference room at work and interview to work at another company.

So most people, what could they do? They go out to a park, and there are ambulances going by and they’ve got paper maybe fluttering in the wind. We just weren’t thinking about the candidate. And so we really stepped back and said, “You know what? We’re going to try to give them the benefit of the doubt.”
Now we do things like we give them a full week to respond to our HackerRank requests. We work really hard to make sure that the time pressure that we put on the test is not overbearing and not unrealistic.

Like how often are you writing code and you really have a 30-minute deadline to make sure that your algorithm works? It never happens. Never. I’ve never seen that happen in my entire professional career that you had to get your code perfect in 30 minutes or your company was going to lose X million dollars. If it’s broken, you just roll back to the previous version. So we started doing things like that.

Then also on site, we really started trying to make it so that… the way we think of it is that we’re welcoming the candidate into our home. So we think about it from a hospitality perspective. How do we make them feel comfortable? We try and give them a tour of the office – just showing them that we really respect that they’ve taken time out of their busy schedule to talk to us. That is a significant thing, and it’s not something to be brushed aside.

Then the final thing is with our interviewers themselves. First of all, we only select interviewers that are people we really trust deeply, and we continually re-evaluate how our interviewers are doing in the interview process.

 

We want to make sure that they do a good job of making a candidate feel welcome, that they’re good communicators, and that they’re trying to assess whether the way the candidate thinks fits in with what we do here – not whether the candidate memorized the right piece of trivia, or has experience that they’re just regurgitating, like the way they built a system six months ago or something.

We’re really interested in how do they approach a problem. With that, what we’re trying to see is how are they working? What would it be like to sit with this person at a whiteboard and talk through some difficult design challenge that we have in our environment? So we’re often trying to set it up. The phrase we use is that we want it to be the candidate and our interviewer against the problem. We don’t want it to be the candidate versus the interviewer.

I personally have experienced a number of times where I’ve gone into an interview and it really felt like it was me against the interviewer, and the interviewer was trying to get me to prove that I was worthy of having a conversation with them. It was obvious to me that we were miscommunicating, but to them, it was, “Why did they let this idiot in the door?” That always left a bad taste in my mouth. So we really try not to do that.

Vivek: Yeah. It’s amazing. I mean, there is so much depth in your answers in terms of how you’ve really perfected your interview process. By the way, I used to work at Amazon as well, and I used to do a lot of interviews when I was there. This was my personal problem, which is what led to starting the company.
I remember I had to go through interview training. It’s almost, “Hey, you can’t just go ahead and interview any person. You have to go through this interview training.” I think they’ve continued to tune that since.

I think you touched upon this a little bit. In terms of how you select interviewers, do you go super deep, in terms of actually having a very data-driven approach on, hey, did you give a yes? What happened to this candidate? Did the person actually get an offer, and how are they performing?

Talk to me about the other side of it: how do you select interviewers, how do you prepare them, and how do you also tell somebody, “Hey, you might be the greatest developer, but you’re a pretty bad interviewer”? I mean, that’s at some level a little offensive, but I’ve seen that to be the case with a few people as well. So we would love to know a little more about the other side of the interview process.

David: First of all, I very much appreciate the compliments. I think one thing that’s really important is that we always view our interview process as needing continual improvement. We feel like we’re super far from perfection. It’s definitely our goal but I don’t think we’ll ever get there. We hope to asymptotically approach that line of interviewing perfection.

As far as how we select interviewers, there are a couple of things there. The first step for us is that we start by looking at our strongest technical people. If somebody does not do well in designing a system, it’s going to be really hard for them to recognize a good design when they see it. So you have to have a high degree of technical competence – I would say you need technical excellence in order to do an interview. But it doesn’t stop there.

So the second thing we look for is somebody that’s a good communicator. We want somebody for whom, when they talk with other engineers about unfamiliar problems or when they talk with our traders about what kind of feature they’re building, communication is never the issue. People often really enjoy speaking with that person because they do a good job of stating problems in multiple ways and seeing when it’s a miscommunication as opposed to a lack of understanding, or something to that degree. So we look for those two things.

Then what we do is have that person shadow some of our best interviewers. So they’ll sit with our best interviewers and just watch them do like 10 different interviews. Then once they’ve done that, we flip it a little bit. What we’ll typically try to do is have them give an interview to one of our current developers.

In that, we’ll have one or two people watch and then give them critique and feedback: “Okay, when you asked this question, the way you phrased it actually didn’t make any sense.” Or, “Here, you thought they were going down the wrong path, but you actually needed to let it go a little bit and see where they take it.” Something to that effect.

Then we’ll have them start conducting interviews with one of our interviewers shadowing them and giving them continual feedback. Actually, we keep doing that the whole time someone is an interviewer. So we always pair up our interviewer with a shadower.

So you always see a candidate with two people interviewing them, but really one person is leading the interview and the other person is just supposed to be a shadow. They just sit there and watch and get a different perspective.

We found that’s really helpful because a lot of times, as the interviewer, you’ve got a lot to think about. You need to think about, “Is the candidate on the right path? What’s the right hint to give them? Are we communicating well? How are they doing on this problem?” Whereas the shadow can just sit there and think about, “Is this person approaching the problem in a way that makes sense?” So we do all of that.

Then the final bit we have is that we do look at the data from our interviews to see if our suspicions about our interviewers are on track. But we’re really careful about the way we do that because we’re a smaller company.

During our campus season, we might have 75 onsite interviews or something. If you have 10 interviewers, that’s basically seven data points each, which actually you can’t tell a ton from. So you really have to go into the nitty-gritty details of each one of those and see, “Is the data we’re seeing actually indicative of a trend, or did they just have an unlucky campus interviewing season?”

Vivek: Got it. I know a lot of customers, when I talk to them, struggle with delays in collecting interview feedback. I remember this. When I was an engineer, it was at the bottom of my to-do list, after finishing everything at the end of the day. And then I would probably push it for another day, and then I would get two reminder emails: “Can you go ahead and fill in the interview feedback?” By then, I might have forgotten half the things that I discussed with the candidate.

If you really need to optimize and continue to fine-tune the process, you need really strong feedback about the candidate right after the interview, when it’s fresh in your mind, to go back and reference. What would you say is the state at Optiver in terms of writing feedback, and how have you done that well?

David: We used to be really bad at it, and we kept having situations where we’d say, “Oh, hey, this person is performing really great. Let’s go look at what happened in the interview.” And then we’d find all these gaps in the data, and it was terrible.

Probably four years ago, we started getting really militant about collecting feedback. The key was that the recruiting department got buy-in from the development leads. The development leads basically told recruiting, “Look, if someone doesn’t give you feedback the day of the interview, tell me and I will immediately make sure that they do.” So it was really just a matter of us making it a high priority for everybody.

Then we were very rigorous about it: it’s fine if people miss it once, but if they miss it two or three times, then we’re telling them, “Listen, the next time you don’t send feedback, you’re out of the interview process.” Being very strict about it was very effective. People took it really seriously because it did reflect poorly on them.

Vivek: We were actually thinking of an interview leaderboard in our product. It’s more on, “Hey, how well are you doing, how many interviews have you done, how many have you given a yes, how many conversions have happened to get to an offer, and how good are you at writing feedback immediately?”
It’s fun. We’ve not yet implemented it. We’re still debating all the pros and cons of putting up a leaderboard. But that’s something that we’ve been thinking about as well.

But even after you do all this – and you’re probably familiar with this, David – it’s a continuous art; you just keep getting better and better. There are always mistakes and always things that you need to get better at. What have been some of the learnings that you’ve had over the course of the last three years, in terms of some of the mistakes and how you corrected them?

David: I think one of the mistakes we made was that for a while we didn’t realize one of our interview questions was biased towards people that had industry experience. It should have been somewhat obvious, but we thought we were doing a decent job of getting away from that. Then we had a couple of people come through who had experience in our industry, and they nailed our interview process, and then they came in and did really poorly as engineers at our company.

So what we really had to do was dig in and say, “All right, we need to be aware that when we’ve got someone with industry experience, we have to take a slightly different touch in the interview itself. We need to adjust the interviewer – we can’t put in somebody that’s inexperienced, that’s only got maybe 10 interviews under their belt. We need someone that’s been here for a while, and they need to have a few curveballs up their sleeve.”

Curveballs they can throw that change the problem from being a canonical industry problem to being something the candidate maybe has never seen before. That way we know whether they’re just regurgitating a design they did three or four times at their previous company, or whether they can actually think about the problem. That, I think, is one of the biggest learnings we’ve had through that process.

Then the other learning that we’ve had is that it’s really important for us to have a heavy level of involvement from the development team in the whole interview process. That’s the thing where for a while we tried to limit the involvement of the development team and that really didn’t work very well.

Basically, what we found was that we were very hit or miss when we brought people on site. We’d have some people that did great and some people that did just totally terrible, where within 10 minutes we knew this was a really bad fit. A lot of that was because we had set really simple metrics. We’d say, “As long as this person can do precisely these seven things, then bring them on site.”

So what we did was, even with HackerRank, we have a minimum bar – a minimum score that needs to be met. But once that minimum score is met, the code itself is reviewed by our team leads. There need to be two people that look at the code and say, “This code is of high enough quality for this person to come on site,” before we will invite somebody on site. And that’s made a big improvement in the quality of people that we see on site.


Vivek: Interesting. Have you found fit to be a challenge, in terms of how much candidates love the domain and space Optiver is in? Just looking at a profile: when you graduated in 2005, there were still a lot of startups, but the big names were Amazon, Microsoft, Google. Now you see hundreds of startups, with so many millions of dollars in funding. There are so many opportunities for somebody just graduating out of any college. What would be your advice, and how do you choose a company? There are just so many opportunities available and so many more startups.

David: That’s a good question. There have been a few people I’ve talked to in the last couple of months that were trying to decide on what internship they should take or what job they should take fresh out of college. I think a lot of people are running into this, I’d say, “problem.” What I’ve told all of them is, “Look, this is an amazing problem to have. Most of society does not have this problem. So you’re really quite fortunate to be able to pick from four or five different companies where you want to work.”

I think that the first of the things I advise people to look at is, “What kind of connection did you feel with the people that you talked to during the process?” I advise people to not focus too much on the perceived scope of the problems, because at the end of the day I think that all of these companies are solving the world’s biggest problems.

I hear this all the time from people saying like, “Come here. Our problems are so big. They’re the biggest in the world.” And it’s just like this competition to see like who’s solving the biggest problems. The reality is everyone is solving huge problems. That’s not the thing to judge it on, otherwise, everybody would go to Google. I wouldn’t focus so much on that.

No matter what, you’re going to be solving difficult technical problems, so really focus on the people, and focus on the quality of life in the surrounding area. Hopefully, you’re not just going to be sitting at whatever company you go to for 90 hours a week. You should try and have a life outside of work.

Then finally, try and think about what you sensed from the company about the way that they solve problems. Something we try to focus on a lot here is this idea of technical culture. I think it’s not something you think about a lot, but there actually are very distinctive technical cultures.

For example, our company is relatively conservative when it comes to our technical culture. And what I mean by that is that we don’t rush into a lot of new technologies without a significant bit of analysis and a strong feeling that we have to use this new technology in order to improve our trading system. But there are other companies out there that are all about exploration and trying new things and being at the forefront of the latest and greatest website technology or augmented reality or whatever it is.

So, as a candidate, you want to be at a company where the way they think about solving problems really resonates with you. And I think that you can get a sense of that from the people you talk to and the way they ask questions and the way the interview process goes.

Vivek: Yeah, absolutely. There’s a really funny story. I was talking to one of my friends who’s running a company, a startup he just started. He was telling me, “You know, Vivek, there is a lot of competition to hire the VPs and all of those things, but a close second, or probably even bigger than that, is for interns.”

He was telling me a story where Larry Page from Google called this guy who was deciding between Google and his startup to say, “You should come and work at Google.” He was saying, “If Larry Page is going to call a student, I’m going to lose them to Google.” Of course, the student chose Google and went there. It’s pretty crazy, right?

David: We had almost an identical experience here, where a candidate came in and ended up going to Facebook. We interviewed him, we loved him. We were smitten with this guy. And he was like, “Yeah, I’m deciding between you guys and Facebook.”

We were like, “Maybe we’re going to get him.” He was from the Midwest, so he seemed like he really wanted to come here. And then he calls us and says, “Last night I was having pizza with the other interns at Mark Zuckerberg’s house, and he was telling us the vision of Facebook for the next five years, and I just feel like I can’t turn this opportunity down.” We were like, “Yeah. We get that.”

We were kind of joking with our CEO about it, and he was just saying, “Yeah, I could never do that.” Because he’s just some random Dutch guy that nobody knows. He was about to go to a Beyoncé concert, which was weird because he’s not into pop music that much. But we were like, “Oh, yeah, maybe you could take him to go see Beyoncé with you.” And he’s like, “I still don’t think that would convince him.”

Vivek: As we’re coming to a close, David, I’d love to hear your thoughts on this: you’ve been in the industry for 10, 12 years, and there have been some changes in the way recruiting is done, in terms of phone interviews, in terms of tools like HackerRank. Where do you see this going over the next, let’s say, 5 to 10 years? What do you see as fundamental changes in the way technical interviewing or recruiting will be done?

David: That’s an interesting question, because based on everything I hear and what I see as popular, there are directions I can imagine the recruiting realm going, but I would be very hesitant to follow along in those directions. Everybody’s talking about machine learning, right?

Vivek: Yeah.

David: Machine learning this, machine learning that. So there have got to be a bunch of startups out there saying, “We’re going to crack the recruiting problem with machine learning.” But there are so many subtle biases in the data. If you pick the wrong data set and feed it into a machine learning algorithm, the algorithm is going to be biased, and all of a sudden you’re only hiring white men. It has effectively decided, “You know what? Only white men can develop software.” That’s not true.

It’s just that you picked the wrong data set and fed it into a machine learning algorithm, and off you go. I think anyone working on those problems could very carefully apply machine learning and get some leverage. But gosh, you’ve just got to be so careful with that, and walk into it with trepidation.
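[Editor’s note: a minimal, purely hypothetical sketch of the failure mode David describes. The “historical” data and the naive policy below are invented for illustration; a model trained on skewed past hiring outcomes simply reproduces the skew.]

```python
# Hypothetical historical hiring data: (group, was_hired).
# The outcomes are skewed toward group "A" for reasons unrelated to skill.
historical = [
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", False), ("B", False), ("B", False), ("B", True),
]

def hire_rate(group):
    """Fraction of past candidates in this group who were hired."""
    outcomes = [hired for g, hired in historical if g == group]
    return sum(outcomes) / len(outcomes)

def naive_policy(group):
    """A 'model' that just learns historical base rates: it advances a
    candidate whenever their group was historically hired more often
    than not. It never looks at the candidate's actual ability."""
    return hire_rate(group) > 0.5

print(naive_policy("A"))  # True  -- group A favored by the skewed data
print(naive_policy("B"))  # False -- group B rejected regardless of skill
```

The bug is not in the algorithm but in the data: the policy faithfully encodes the historical skew, which is exactly why careless training data turns into biased hiring decisions.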

The only other thing I can say is that I think the competition is just going to get more and more intense, because software development is becoming a fundamental requirement for every industry moving forward.

So I think the best advice is to hone your interview process, as much as possible, to reflect what it’s like to work at your company and what your company culture is like. That way it becomes less about competing for talent and more about finding the right fit for every bit of talent. Because not everybody is a good fit for every company; not everybody should go work at Google, and not everybody should go work at Optiver.

But we think there are a lot of people out there who are a perfect fit for Optiver, and those are the ones we want to find. We try as much as possible to tune our interview process to do that. If everyone focuses on that, then that is where I hope the recruiting industry goes.

Vivek: Absolutely. That’s super insightful. Thank you so much for your time, David. This has been really great. For the listeners, if you have any specific questions or topics you want to hear about, tweet at @hackerrank and we’ll be sure to reply. Thank you so much, David. It was great talking to you.

David: Yeah, definitely. Nice talking with you. Thanks.

NEXT: Listen to episode 2 of HackerRank Radio here.
