I have ulterior motives.
I’m not writing about code review practices on the Web team at Beezwax because the world needs another blog post about code review. I’m not going to link to any of the articles I read while preparing this. If you can’t find them yourself, I’m not sure how you found this one.
I’m writing this to brag. About Beezwax and our Web development team. About the awesome devs I work with every day. I’m bragging to you because I want you to come work here with us.
I do, ballpark, a hundred recruiting phone screens every year, and candidates consistently respond positively when I tell them that we have a strong culture of code review and testing. On this team, testing is non-optional. Code review is non-optional.
We’re a consultancy working on other people’s dime, so it would be easy to fall into the trap of skipping these steps, thinking we’re saving our clients money. One of the things I love about this team is that we have the experience, both as developers and as an organization, to know that skipping review and testing would actually cost our clients more in the form of defects and poor maintainability.
If you’re a developer who doesn’t need the value of tests and review explained, let me convince you that you want to work here with examples of how we do code review.
I won’t bore you with a laundry list of steps for how we do code review; you can read about best practices anywhere. We follow the standard ones, but nothing about our process is dogmatic.
Every project is organized and completed in whatever way the team feels is best for that project. Small project with intermittent work needs? Weekly team check-in is fine. That project is suddenly really busy for the next few weeks? Daily standup while it’s busy. Same for code review; one size doesn’t fit all.
A “Normal” Example
The project in this first example had decent unit test coverage, but while fixing a UI bug I discovered there were no feature specs (for non-rubyists: automated end-to-end tests that exercise the app through a browser). I made a merge request (MR) to add feature specs, and it was decently sized: +97/-47 in seven files. Add database_cleaner, set up initialization of the base data needed to run the site, add a spec helper for resizing the window for mobile display. And finally, add the spec for the feature I want to test.
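That resize helper might look something like this minimal sketch — the module name, method names, and dimensions are illustrative, not the project’s actual code, and it assumes Capybara with a JavaScript-capable driver:

```ruby
# Hypothetical spec support helper (illustrative names, not the
# project's real code). Assumes Capybara with a JS-capable driver.
module WindowSizeHelpers
  MOBILE  = [375, 667].freeze   # phone-ish portrait viewport
  DESKTOP = [1280, 800].freeze

  # Resize the session's browser window to mobile dimensions.
  # The session is injectable, which keeps the helper easy to test.
  def resize_to_mobile(session = Capybara.current_session)
    session.current_window.resize_to(*MOBILE)
  end

  def resize_to_desktop(session = Capybara.current_session)
    session.current_window.resize_to(*DESKTOP)
  end
end

# Mixed in for feature specs, e.g. in spec/rails_helper.rb:
#   RSpec.configure { |c| c.include WindowSizeHelpers, type: :feature }
```

With that in place a spec can call `resize_to_mobile` before asserting on mobile-only layout behavior.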
After I pushed my branch, the first review came from rubocop, of course. A human reviewer took their first look while I fixed the issues rubocop flagged, and they pointed out places where the specs could be refactored to be tidier and where configuration could be simplified.
After making all the other changes, I committed a couple of different options for the spec refactoring and left a comment with links to the file at those different revisions for the reviewer to comment on. The reviewer gave their feedback, I made the change, merged, and shipped.
What to notice: If you don’t work somewhere where robots do robot jobs and humans do human jobs in an open and collaborative way, come work for us.
A Much Smaller Example
I addressed the UI bug in the first example in a separate MR from the specs, and this one was tiny: 2 HAML changes to element attributes and 6 SASS lines for the animation. When I created the MR, I left a comment about an aspect of the animation I wasn’t happy with, and left it for the reviewer to decide if it was worth spending more time on. The reviewer spent probably two seconds looking at the code, pulled the branch, and decided they liked the thing I didn’t. Merged. Shipped.
What to notice: If you don’t work somewhere where decisions are made as a team, or where people are as thorough (but only as thorough) as they need to be, come work for us.
A Much Larger Example
This example is a major piece of new functionality added to an existing product: +5,456/-1,472 across 45 files in 74 commits over two weeks. That would be impossible for one person to review effectively in a short amount of time. If they spent an hour a day on it, maybe they could do it in a week.
Knowing ahead of time this would be the case, we took a review-as-you-go approach. The dev who started the work made a WIP MR the moment the branch was created, and as we worked, we paired heavily, reviewing each other’s work as we went. When we worked asynchronously, we left reviews for each other on the MR, addressing the review notes before continuing with further development.
When it was finally time to make a merge/don’t-merge call, we pulled up the entire MR and did a pairing review of it, going fast where we could (CSS and code we had written and reviewed together), and going into more detail on code either of us was less familiar with.
Merged and shipped, no major (and hardly any minor) defects reported.
What to notice: If you don’t work somewhere where devs are empowered to make process decisions as the needs of the project or task demand, or where devs are able to pair or not pair as needed, come work for us.
The Last Word
I’ve worked in shops where process was stifling. I’ve worked in shops where process was non-existent and the result was chaos. I’ve worked in shops that somehow managed to achieve both those situations simultaneously.
On the Web team at Beezwax we consistently deliver high-quality products to our clients because we test and review our code in a way that brings the team together. In a way that makes developers excited for code review because we know we’ll always learn something useful from the others we work with. The results speak for themselves: our clients come back with new projects for us, and people like working here — the mean tenure of devs on this team is 3 years.
Check out our open positions and join us.