Den vs. Technologies

Tech Interview: Code Review & Debugging Sessions


Story

I’m currently seeking new opportunities and have applied for several senior/lead positions. To prepare, I’ve solved a few hard/medium LeetCode tasks as a “warm-up” exercise. For instance, I found the “Calculate median in data stream” problem to be particularly fun and interesting. How would you approach solving that?
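For what it’s worth, the textbook approach to that problem is to keep two heaps: a max-heap for the lower half of the stream and a min-heap for the upper half, so the median is always sitting at the heap tops. Here’s a rough Go sketch of the idea (the MedianFinder and intHeap names are mine, and this is illustrative rather than production-ready):

```go
package main

import (
	"container/heap"
	"fmt"
)

// intHeap is a slice-backed heap; the less function decides whether it
// behaves as a min-heap or a max-heap.
type intHeap struct {
	data []int
	less func(a, b int) bool
}

func (h *intHeap) Len() int           { return len(h.data) }
func (h *intHeap) Less(i, j int) bool { return h.less(h.data[i], h.data[j]) }
func (h *intHeap) Swap(i, j int)      { h.data[i], h.data[j] = h.data[j], h.data[i] }
func (h *intHeap) Push(x interface{}) { h.data = append(h.data, x.(int)) }
func (h *intHeap) Pop() interface{} {
	last := len(h.data) - 1
	x := h.data[last]
	h.data = h.data[:last]
	return x
}

// MedianFinder keeps the lower half of the stream in a max-heap and the
// upper half in a min-heap, so the median is always at the heap tops.
type MedianFinder struct {
	lo *intHeap // max-heap: largest of the lower half on top
	hi *intHeap // min-heap: smallest of the upper half on top
}

func NewMedianFinder() *MedianFinder {
	return &MedianFinder{
		lo: &intHeap{less: func(a, b int) bool { return a > b }},
		hi: &intHeap{less: func(a, b int) bool { return a < b }},
	}
}

func (m *MedianFinder) AddNum(n int) {
	// Push through the lower heap, move its max to the upper heap,
	// then rebalance so lo is never smaller than hi.
	heap.Push(m.lo, n)
	heap.Push(m.hi, heap.Pop(m.lo))
	if m.hi.Len() > m.lo.Len() {
		heap.Push(m.lo, heap.Pop(m.hi))
	}
}

// FindMedian assumes at least one number has been added.
func (m *MedianFinder) FindMedian() float64 {
	if m.lo.Len() > m.hi.Len() {
		return float64(m.lo.data[0])
	}
	return (float64(m.lo.data[0]) + float64(m.hi.data[0])) / 2
}

func main() {
	mf := NewMedianFinder()
	for _, n := range []int{5, 15, 1, 3} {
		mf.AddNum(n)
	}
	fmt.Println(mf.FindMedian()) // 4
}
```

Adding a number costs O(log n) and reading the median is O(1), which is exactly why the problem makes a nice warm-up.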

Different companies have varying interview processes. Some conduct one, two, or three rounds of algorithmic puzzles, while others ask basic questions just to ensure you have fundamental knowledge. There are also companies that employ entirely different approaches. I’d like to discuss these various approaches to technical interviews.

First, I’d like to share my opinion on this matter. I believe all approaches have their merits in different situations. However, running three rounds of purely algorithmic puzzles seems peculiar to me, especially for experienced candidates. This approach might make sense for someone without any experience, but for an engineer with over 10 years in the field, I find it rather odd. It’s not necessarily good or bad, just strange. In my view, when dealing with an experienced engineer, it’s more natural to discuss data structures and algorithms in the context of solving real-world problems. These topics can be woven seamlessly into conversations about specific issues and challenges faced in software development.

Code Review

I recently encountered an interesting task when applying for a Staff Backend Engineer position at a major company. One notable aspect was that they weren’t solely focused on Go experience (despite the position being Go-based), but rather on overall software engineering experience.

The task was conducted through HackerRank, which had recently added a Code Review feature. The interface resembled a GitHub pull request review page. Here’s how it worked:

  1. I was presented with a simple task comprising about 6 files, including code, input data, and tests.
  2. My job was to review the code and provide comments.
  3. While there were some basic constraints, much of the review criteria were left to my discretion.

For example, I’m particularly fond of using io.Reader or io.ReadCloser as abstractions for passing data between functions. In my review, I suggested implementing these interfaces, as they seemed appropriate for the code I was evaluating.
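To illustrate what I mean (this is not the code from the task; parseConfig and Config below are made-up names), accepting io.Reader decouples a function from where its bytes come from, so the same code handles files, HTTP bodies, and in-memory test data:

```go
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"os"
	"strings"
)

type Config struct {
	Name string `json:"name"`
	Port int    `json:"port"`
}

// parseConfig accepts io.Reader instead of a filename or a []byte,
// so it works equally well with files, network bodies, or test fixtures.
func parseConfig(r io.Reader) (*Config, error) {
	var cfg Config
	if err := json.NewDecoder(r).Decode(&cfg); err != nil {
		return nil, fmt.Errorf("decode config: %w", err)
	}
	return &cfg, nil
}

func main() {
	// Production path: an *os.File satisfies io.ReadCloser.
	if f, err := os.Open("config.json"); err == nil {
		defer f.Close()
		if cfg, err := parseConfig(f); err == nil {
			fmt.Printf("%+v\n", cfg)
		}
	}

	// Test path: no file needed, just an in-memory reader.
	cfg, _ := parseConfig(strings.NewReader(`{"name":"demo","port":8080}`))
	fmt.Printf("%+v\n", cfg)
}
```

In tests you can pass strings.NewReader(...) instead of touching the filesystem, which is usually the first benefit such a suggestion brings up in a review.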

The company provided feedback on my code review, highlighting several key aspects they were looking for:

  1. Tone: They appreciated that my comments were friendly and professional, without any harsh or unprofessional language like “WTF DUDE???”.

  2. Suggestions: They valued the constructive suggestions I provided for improving the code.

  3. Standards: They noted their use of the Uber Go Style Guide as a reference. I agree that this guide contains many valuable ideas and rules for Go development.
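To give a flavour of the guide (I’m quoting a couple of rules from memory, so check the guide itself for the exact wording), here is a tiny snippet that follows its “zero-value mutexes are valid”, “defer to clean up”, and “prefer specifying container capacity” advice:

```go
package counter

import "sync"

// Counter uses a zero-value sync.Mutex by value rather than new(sync.Mutex),
// as the style guide recommends.
type Counter struct {
	mu   sync.Mutex
	seen map[string]int
}

func NewCounter() *Counter {
	// Specify a rough capacity when it is known up front.
	return &Counter{seen: make(map[string]int, 64)}
}

func (c *Counter) Inc(key string) int {
	c.mu.Lock()
	defer c.mu.Unlock() // defer keeps Unlock right next to Lock
	c.seen[key]++
	return c.seen[key]
}
```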

Based on my performance in this task, I was invited to the next round of the interview process, which focuses on system design. I plan to discuss my experiences with system design interviews in more detail later.

This approach to technical assessment seems quite comprehensive. It evaluates not just technical skills, but also communication style, knowledge of best practices, and ability to provide constructive feedback - all crucial for a Staff Engineer role.

The use of a code review task instead of (or in addition to) traditional coding challenges is an interesting trend. It more closely mimics real-world scenarios that senior engineers often face.

Live Debugging

Another company took a different approach to assessing candidates. They scheduled a meeting with two of their engineers and shared a repository containing a simple program with a deadlock issue. The key aspects of this task were:

  1. The program was intentionally complex, featuring numerous Go channels, goroutines, and even channels of channels (a simplified sketch of that kind of bug follows this list).
  2. The primary goal wasn’t necessarily to fix the issue, but to observe how I approached and analyzed unfamiliar code.
  3. I was able to identify the issue within about 5 minutes.
  4. I then attempted to fix the problem, which took me approximately 40-45 minutes.
  5. Throughout the process, we engaged in a productive discussion covering various related topics.
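I obviously can’t share their repository, but a heavily simplified, made-up sketch of that kind of bug could look like this: an unbuffered channel of channels where the caller issues a second request before draining the first reply, so every goroutine ends up asleep.

```go
package main

import "fmt"

func main() {
	// A channel of channels: each request is itself a reply channel.
	requests := make(chan chan int)

	// The "worker" serves exactly one request and exits.
	go func() {
		reply := <-requests
		reply <- 42
	}()

	first := make(chan int)
	requests <- first

	// BUG: a second request is issued before the first reply is drained.
	// The worker is blocked on `reply <- 42` (unbuffered, nobody reading),
	// main is blocked here, so no goroutine can make progress.
	requests <- make(chan int)

	fmt.Println(<-first) // never reached
}
```

The Go runtime detects this and aborts with “fatal error: all goroutines are asleep - deadlock!”, which is usually where the investigation starts.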

This round seemed to focus on assessing communication skills and fundamental knowledge of tools commonly used in the role, rather than just coding ability.

My opinion on this approach:

  • I found it to be an excellent and engaging method of assessment.
  • It’s particularly relevant for a senior-level position, where debugging complex concurrent code is likely to be a frequent task.
  • I was very satisfied with this interview format.

This approach indeed seems well-suited for evaluating senior engineers. It tests practical skills like code comprehension, problem-solving, and the ability to discuss technical issues effectively - all crucial for senior roles.

Conclusion

Each of these methods has its merits and may be more or less appropriate depending on the specific role and the candidate’s experience level. The trend towards more practical, real-world assessments is promising, as it allows candidates to demonstrate a broader range of skills that are crucial in senior engineering roles.

Ultimately, the most effective interview process may involve a combination of these approaches, tailored to the specific needs of the company and the nature of the position. As the field of software engineering continues to evolve, so too should our methods of evaluating talent, ensuring that we’re assessing the skills that truly matter in modern development environments.