I took the backend test expecting programming and SQL, but it was hardcore DevOps and sysadmin work. It had nothing to do with programming a backend system. I know a little about working with the command line and admin tasks, but using tools to orchestrate 100 servers isn't something I know about.
My impression is that it's designed with a very high failure rate built in, targeting people with a very specific, easy-to-hire background. I'm not really sure someone at that talent level would really need Triplebyte.
AFAIK when most recruiters talk about placement rate, they're not talking about "engineers who we worked with who happened to get jobs," they're talking about "engineers who we placed and received payment for the placement." While Triplebyte hasn't explicitly stated that distinction here, I think a good faith reading of that phrase would suggest that the 40% is actually people who are getting jobs through Triplebyte.
Things may have changed, but when I did it (~2 years ago, I think) the projects and the in-person bit were much harder than the initial quiz, so I'd expect the drop-off to be higher.
Edit: I admit I didn't pass the in-person part, and I have some issues with things I think were held against me when it was really confusion about what they were looking for, but overall I thought it was a great process and have recommended it to people.
What was always most interesting to us about starting a recruiting company was seeing what would happen if you treated hiring as a data problem. Partly we've raised more funding for the same reason any startup does: so we can grow faster to get more customers = more revenue = more success, etc. But we're also driven by the fact that the larger the scale we operate at, the faster we can run experiments to answer questions about the best way to evaluate technical skills. More rigorous, data-focused approaches to hiring benefit everyone.
Interviewing and evaluating engineers is an area a lot of people feel passionately about and have strong opinions on. We're continually looking for ways to improve our process; if you have any thoughts or feedback, please ping me - harj at triplebyte.
Wow. Is it just me or does that sound like a downright horrifying company to deal with as a candidate?
At our company, we try to painstakingly craft our recruiting experience to make sure each candidate we interview has a good experience and ends up with a positive impression of our company regardless of whether or not we end up sending them an offer. At the end of the day, we're all human beings, each with something unique to bring to the table, even if that something might not be what we're looking for for a particular role at the moment.
Maybe past some scale we'll have to start changing our approach, reducing candidates to data points and "running experiments" on them like lab rats, as Google et al. (and these guys here) seem so proud of doing, but I'd sooner quit than stay part of a company that does that.
For giggles (I work on OS kernels and high-performance TCP; there's no way this kind of meatgrinder would ever place me correctly) I did the online quiz and got to the interview piece just to see why this was generating buzz. The quiz was easy, and when I saw the interview prep I was disgusted. It's gamified hiring practice done in an outsourced fashion: the kind where you're supposed to read a book that tells you what questions a hiring party will ask. I guess if you're a lousy engineering manager and want to offload and emulate the hiring process of something like Google, this is up your alley. If you want relatively fungible developers who can work on CRUD applications it may be fine, but I don't think that's hard to hire for in any case.
Engineering managers MUST do recruiting - never let HR take this from you. HR can do clerical work, but they should have little role in search and outreach. Instead: for new talent, recruit at local universities; for mid-level talent, give recruiter-like bonuses for employee referrals and poach from competitors; and for extraordinary talent, look at the commit logs of the open source software you use and hire from that list. Easy, cheap, and effective.
I mean sure, so would I if I had to choose between the two, but that's a false dilemma.
And for me at least, a company that doesn't treat their candidates with kindness and respect in the hiring process likely wouldn't be a long term fit for me anyways.
I haven't been interviewed by you, but the initial process is stellar! Applying and going through the code challenges was very smooth. Plus, the fact that applicants who pass are pretty much guaranteed some form of an interview is great.
Just wanted to say thank you! I used Triplebyte to find the company I'm currently working for, and I've had a wonderful time here so far. The process took a little longer than I had anticipated, but it was well worth it :)
The crux of this problem in my opinion is that long term results aren't tracked.
It's hard enough internally to track someone's performance over the year or two after they get hired; it would be even harder for a recruiting company.
It's especially sensitive because employers are wary of sharing employee performance data with third parties due to the high risk of a lawsuit (there is clear precedent for these lawsuits).
Once that data problem is bridged, it blows the problem right open for data to be explored and figure out what exactly predicts a top performer, in any field.
>Once that data problem is bridged, it blows the problem right open for data to be explored and figure out what exactly predicts a top performer, in any field.
Yes, assuming there is some top-level "data problem" to actually bridge here...
How do we know that the concept of "top performer" isn't just a completely divergent idea that means different things to different people and different companies in different industries and different geographic areas?
I'm inclined to believe that measuring performance is fairly subjective. However, in an attempt to put some numbers behind a performance score, you might have some combination of individual weighted scores that involve things like:
(1) length of time employed at the company
(2) some measure of the performance feedback the engineer receives in their annual review - which may include percentage salary increase (possibly subjective, though)
(3) a score compiled by surveying the employees' peers (again, possibly subjective)
(4) the overall TripleByte turnover rate at said company
It highly depends on how you measure value. If you measure value in terms of what they bring to the overall organization, people from top 10 schools are very likely overvalued. If you measure what value they bring to the team, top 10 schools are probably correctly valued. The difference is that the manager of the team hiring from Harvard can speak to people in more senior roles about how great their team is and how they have the best talent.
At the end of the day, value is driven by how much political leverage a hire can give a manager, not by how much value they add to the org.
Yeah I believe Triplebyte will only accept candidates from countries that have an easy way to get a work visa in the US - Canada, Mexico, Australia and Singapore (I think). (I'm Canadian and am working at my current company through Triplebyte)
That's weird; I'm also Canadian, and my profile says they can only proceed with applicants who hold a US visa and/or work permit. I thought it was a mistake, so I contacted their support team and got a reply saying they can only consider my application if I live in the US.
Did you change your location to some city in the US?
How did you get your application accepted?
On the same note, Hired.com which seems to be a direct competitor, also has restrictions on the locations available for their recruitment process. In Canada, for example, they only accept people living in Toronto. Or people lying about their current location because they are planning to move there.
Hi,
I just went through the interview questions; they were fun (they said I did exceptionally well, but gave me nothing concrete, like a percentile). I'm not looking for a job, just thought I'd try it out.
What I was interested in is whether the questions get harder if I answer well, but they seemed random.
You could use logistic regression to estimate the level of a candidate and adjust the questions to reach the same accuracy in less time (or to improve accuracy with the same number of questions/time).
So you're right that the quiz does try to be harder if you're doing well, but it'll also give you easier questions if an incorrect answer lowers its confidence in your ability estimate. We have a pretty sizeable bank of potential questions to ask a candidate, but the quiz tries to strike an optimal balance between appropriate difficulty and maximum informativeness. For example, we wouldn't want to ask you a particularly difficult question unless we're confident that it's (a) a good fit for your estimated ability level, and (b) going to give us more information about your ability than any other question in the bank.
You're right that tailoring question difficulty to ability level can drastically increase a test's accuracy. But while a logistic regression model works well when you have a fixed quiz or a low number of questions, it isn't flexible enough to work with a fully adaptive system like we have at Triplebyte. Our models are loosely based on the kinds of systems the GMAT or GRE use, but we've implemented significant extensions on top of those approaches to fit our needs.
Thanks for the answer.
When I was implementing a language-learning program (who hasn't? :) ), logistic regression worked quite well to adaptively find a user's vocabulary level within the top 10,000 most frequently used words in about 10 questions (I re-ran a full logistic regression on the user's dataset after each new data point, mapping word frequency rank to the user's estimated level), and the questions just felt right. So I'm not talking about a multiple logistic regression model, just one variable, which works with lots of questions (as long as question hardness is well calibrated).
Although I'm happy that you're trying to predict the most informative question, some questions near the end felt trivial to me, so either my feeling about question hardness was wrong, or the algorithm has lots of room to improve, or the hardness levels weren't calibrated optimally.
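For anyone curious, the one-variable approach described above can be sketched roughly like this (a toy illustration under my own assumptions, not Triplebyte's actual model; `estimate_ability` and `next_question` are hypothetical names). It fits a Rasch-style model, P(correct) = sigmoid(ability - difficulty), by grid-search maximum likelihood after each answer and picks the unasked question closest to the current estimate:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def estimate_ability(responses, lo=-5.0, hi=5.0, steps=2001):
    """Maximum-likelihood ability estimate under a one-parameter logistic
    (Rasch-style) model: P(correct) = sigmoid(ability - difficulty).
    `responses` is a list of (difficulty, correct) pairs."""
    best_theta, best_ll = 0.0, float("-inf")
    for i in range(steps):
        theta = lo + (hi - lo) * i / (steps - 1)
        ll = 0.0
        for difficulty, correct in responses:
            p = sigmoid(theta - difficulty)
            ll += math.log(p if correct else 1.0 - p)
        if ll > best_ll:
            best_theta, best_ll = theta, ll
    return best_theta

def next_question(bank, theta, asked):
    """Pick the unasked difficulty closest to the current ability
    estimate (roughly the most informative question in this model)."""
    candidates = [d for d in bank if d not in asked]
    return min(candidates, key=lambda d: abs(d - theta))
```

With calibrated difficulties, the grid search converges to a stable level within a handful of questions, which matches the "about 10 questions" experience above.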
Anyway, congrats on your startup's success (I just hope you prioritize people who don't have a U.S. visa)!
Yeah, the GRE does that. They just use your answers for a sort of bisection search: they keep giving you harder (or easier) questions until you're answering about 50% of them correctly.
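That kind of staircase can be sketched in a few lines (a toy, deterministic illustration; the `answer` oracle and function name are hypothetical, not how the GRE actually scores):

```python
def staircase(answer, start=0, step=1, rounds=20):
    """Simple up/down staircase: raise the difficulty after a correct
    answer, lower it after a wrong one. `answer(d)` returns True/False
    for a question at difficulty d. The difficulty settles near the
    level the candidate answers correctly ~50% of the time."""
    d = start
    history = []
    for _ in range(rounds):
        correct = answer(d)
        history.append((d, correct))
        d += step if correct else -step
    return d, history
```

With a candidate who can answer everything below difficulty 5, the walk climbs to 5 and then oscillates between 4 and 5, i.e. it brackets the candidate's level.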
Triplebyte doesn't work as a common app to all YC companies like Work at a Startup. Specific things that we do differently are:
- Our application process is a background-blind technical skills evaluation rather than a resume submission.
- We don't introduce you to all the companies we work with, only the ones where our data predicts you have a high chance of receiving an offer.
- Companies expedite their own hiring process for Triplebyte candidates, skipping technical phone screens/coding challenges.
- Each engineer gets a Talent Manager who offers them career advice, interview and negotiation help and general support throughout the job search process.
- We work with non-YC companies too, e.g. Apple, Asana and Coursera.
I'd say Work at a Startup is a good way to get in front of a lot of startups at once and run a breadth-first job search. Triplebyte is a good way to run a targeted job search, optimizing your offer-to-interview ratio.
The number one request since we launched has been expanding to more locations and top of the list has been New York. We're working on expanding to more locations through the rest of the year.
We'll also be opening up for remote engineers and companies. If you're a company hiring remote engineers, I'd love to hear from you - harj at triplebyte.
I didn't see an answer to this in your FAQ for candidates, and I think it's pretty important from a privacy standpoint: do candidates have total control over which companies you will reach out to on their behalf?
The goal of our process is to identify which areas of strength and weakness an engineer has. We do have a section of algorithm and data structure questions to test academic/textbook CS knowledge, but we also have sections testing very different skills, e.g. practical web programming, debugging and low-level systems.
We've helped many engineers who didn't perform well in the algorithm/data structure section still find jobs at great companies, because we work with many companies that don't care about those skills (as evidenced by not evaluating them during their interviews). This is one of the main ways we can help engineers have a better job search: if textbook CS knowledge is not your strength, we won't match you with companies that evaluate it during their interview.
We're certainly not the only way to join a YC company, or any company, nor would we claim to be. The specific reasons you'd consider using Triplebyte are:
(1) Companies are unlikely to reply if you contact them directly because your resume isn't impressive. Our entire screening process is background blind and we've built up enough credibility with companies for them to interview you if you do well on our process, when they otherwise would have rejected you based on your background.
(2) Finding new companies that are doing interesting work but that you haven't heard about. We spend a lot of time finding and partnering with early-stage startups that haven't become well known yet.
(3) Saving time in your job search by identifying which companies are looking for your specific set of skills. We have a lot of data about what the companies we work with are looking for and can reduce the number of interviews you do where it becomes clear early on that you're just not the type of engineer they're looking for (e.g. they ask you complex algorithm questions on a whiteboard when your strengths are building and scaling web applications).
We're focused on doing these things well and our process certainly isn't perfect. There will be times where we match you with a company and you weren't a fit or when we don't match you with a company and you might have been a fit. Achieving 100% accuracy in something as complicated as hiring is likely impossible but we believe our approach is still a lot better than the status quo of recruiters reading resumes and making gut calls on who gets through to an interview.
I remember when we launched Triplebyte expecting the hard part to be convincing "good" engineers to go through the process; so long as we did that, every company would want to hire them. That turned out to be the kind of mistaken assumption every startup has before they launch.
Ammon talks here about how little consensus there is amongst companies on what a "good" engineer is. As he says, "we’re more often in the situation of sort of broadening people’s vision of what a skilled engineer can be".
This concerned us at first: it seemed like an intractable problem to improve the hiring process when no one can agree on what a good candidate even means. It actually turned out to be a huge opportunity, as we realized the value of what we're doing lies in matching candidates to companies rather than in trying to define a universal "good engineer".
I'm sorry you had a negative experience, and we're working hard to make sure it doesn't happen to anyone again.
We started the company to help those people for whom manually submitting their resume to companies isn't an option. We don't use resumes as part of our screening process because we're looking to find skilled engineers without elite resume credentials. For them, submitting their resumes directly results in either silence or rejection.
We've been able to find many of these engineers jobs at companies they'd never imagined they'd be able to work at. We just had a self taught engineer working as a pizza delivery person in Cincinnati, hired by Instacart - his onsite interview was the first time he'd met another engineer in person. We helped a recent high school graduate, who didn't attend college, get hired by Apple - his dream company.
I'm the founder, so I'm obviously incentivized to highlight these success stories, and I don't claim we've built a perfect process for everyone yet. We're working on a hard problem: judging the skill of other human beings in a fair way and vouching for them to companies that have maintained the same hiring process for decades and are resistant to change. Our approach is not perfect, but it has had life-changing outcomes for many people, and we're doing our best to increase the percentage of all Triplebyte applicants for whom that's true.
Slightly off-topic question, Harj, but is it fair to say that only a limited percentage of companies are willing to spend a 25% placement fee on candidates through Triplebyte? Surely the time will come again when startups curtail their spending and thus leave limited options for candidates going through your process?
Is it safe to say candidates who use Triplebyte exclusively are limited to a fraction of the companies they could otherwise get in front of via other means?
Nah, you can't use forum threads as proof. You need to do the analysis scientifically, which, based on the feedback, it sounds like you need to do. You don't bother quantifying the ability of you and your interviewers; you just assume you're great.
You personally may not enjoy skipping technical phone screens, but that doesn't make it suboptimal for most people. If it were, we'd see no demand for applying through Triplebyte versus applying to companies directly. Empirically, that doesn't seem to be the case: most candidates prefer not to repeat phone screens across multiple companies when job searching.