
Comment by jacobyoder

3 months ago

> CTB: one surprising commonality amongst many of the interviews thus far is the lack of use (or disdain for) debuggers. Almost everyone trots out print statements!

Interesting. I came of age before debuggers were a 'thing' available to common kids at home (80s BASIC, etc.). As I've worked in various environments over the years, I've found that debuggers are nice when available, but they're not always available: the code is only running in production, you don't have full access to a debuggable environment, the company won't pay for tool X, etc. Print/log statements always work, regardless of what tool/language you're working with. Or... 'always work' enough of the time that it's not surprising it's a common theme.

I jump between client projects relatively often (consulting) and while I do use debuggers, I don't use them exclusively (perhaps no one does?). The longer I'm in a codebase, the more likely it will be that I'll end up with a full debugger setup and get comfortable stepping through all the bits.

I've been a developer for 40+ years, and have to agree. I've never found debuggers to be the most efficient way to debug code, and it's not something I'd normally reach for during development.

The only time I normally use a debugger is for post-mortem debugging: looking at a production core file for a multi-threaded process to see where it was in the code (and maybe inspecting a few variables) when it crashed. If the core isn't sufficient to isolate the problem, then I'll add more logging to isolate it the next time it happens.

During development, printf is just so much more convenient. I'll always put print/log statements in my code, both to trace execution flow and to validate assumptions / check that variables hold the kinds of values I'm expecting. Often this sort of pre-emptive debugging is all you need, and if not, it should point to exactly where you need to sprinkle a few more print statements to isolate an issue.

The convenience of print statements is that once you've put them at critical points in your code, printing critical values, they are always there (you can condition them out once the code is working, if wanted), as opposed to having to go into a debugger, navigate to points in the code, set up breakpoints, monitor variables, etc.
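
Roughly the kind of thing I mean, as a toy sketch (the function names and the `DEBUG` flag here are invented, not from any real codebase):

```python
import sys

# Simple switch so the statements can stay in place and be conditioned
# out once the code is working (hypothetical flag).
DEBUG = True

def trace(msg):
    if DEBUG:
        print(f"[trace] {msg}", file=sys.stderr)

def process_order(order):
    # Trace execution flow at a critical point.
    trace(f"process_order: id={order.get('id')} items={len(order.get('items', []))}")

    # Pre-emptive check that a value has the shape I expect.
    total = order.get("total")
    if DEBUG and not isinstance(total, (int, float)):
        print(f"[warn] total has unexpected type {type(total).__name__}: {total!r}",
              file=sys.stderr)

    trace("process_order: done")
    return total
```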

  • I'm of a similar vintage, and I found debuggers valuable when I was debugging Windows 3x and 9x programs, C or MFC C++. You needed to get into the data structures in memory sometimes, and Microsoft made good debuggers at the time.

    For modern development (scripting languages and web development), I rarely use a debugger; setting it up for each of the various languages we jump between is too much bother.

    • I am in a similar boat. When I was an engineer at Intel during the Windows NT days, I used SoftICE every day... but over time I have evolved toward using fewer debugger tools and more emulation/simulation tools to find problems, and even just more time 'thinking' about the code paths and what could go wrong.

    • FWIW, I'm talking about C++ multi-threaded code for Linux (server software with a high variety of requests and workflows).

  • > I'll always put print/log statements in my code as a combination of tracing execution flow and to validate assertions / check that variables have the types of value that I'm expecting.

    This is where Go's superfast compiles shine. Debuggers-B-Gone.

  • Correct me if I'm wrong but you basically just described good logging practices, yea?

    • Maybe? But I was mostly trying to describe how good logging / pre-emptive print statements can avoid much of the need to debug in the first place (as well as why it's more efficient).

      Of course coding discipline / experience plays into writing bug-free or easy-to-debug code too.

The biggest issue for me is working on distributed systems. It's non-trivial to run many different servers and clients through a debugger all at the same time, and components often have timeouts that will trigger due to the delay you'd cause even if you could. Much of the most vexing behavior happens in the networking stack rather than in your own software, and you can't run Linux itself in a debugger while also running your own software on top of that same Linux.

This is why we have logs, metrics, and traces. They give you at least some of the same instrumentation you'd get out of a debugger, but they're on all the time. You lose the flexibility to stop, step, and modify program behavior at runtime, but you gain data that is representative of what happens at full speed, which may be the only time certain bugs manifest anyway.
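
As a rough sketch of what I mean (the field names and the `do_work` call are made up for illustration), structured log lines that carry a correlation/trace id let you follow one request across services at full speed:

```python
import logging
import uuid

logging.basicConfig(format="%(asctime)s %(levelname)s %(message)s",
                    level=logging.INFO)
log = logging.getLogger("orders")

def handle_request(payload):
    # Propagate (or mint) a trace id so the same request can be correlated
    # across every service that logs it.
    trace_id = payload.get("trace_id") or uuid.uuid4().hex
    log.info("recv trace_id=%s size=%d", trace_id, len(payload.get("body", "")))
    try:
        result = do_work(payload)
        log.info("done trace_id=%s status=ok", trace_id)
        return result
    except Exception:
        log.exception("fail trace_id=%s", trace_id)
        raise

def do_work(payload):
    # Stand-in for the real handler.
    return {"ok": True}
```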

> I jump between client projects relatively often

Using debuggers makes a lot of sense in this case. If I had to switch context that frequently between relatively stable applications, it would be helpful to have a debugger framework for doing work.

`printf` statements are helpful when the error exists beneath the debugger: in the application framework itself.

> they're not always available

100% agreed. `printf` is "one tool" I can use to follow the control flow of a function call across frameworks, programming languages, and even operating systems. It's also something I can reliably assume my coworkers have in their tool belt, and it thus provides a common language that even multiple different organizations can use to collaborate on a specific debugging session.

  • The longer I'm on a project, the more inclined I may be to get a debugger going, but often there's much more low-hanging fruit to deal with that doesn't necessarily need a debugger to figure out.

For sure debuggers are great not just for debugging but as a way to reverse engineer a large codebase.

I had noticed the disdain for debuggers, and it used to make me feel a bit inferior for relying on them, but then I found that John Carmack loves them, so I feel better now. Ha!

I started programming with scripting languages, so I mainly had to rely on print statements. I also worked in production support and QA in the past; print statements are the main tools over there. Even though I am now programming in languages like Java/Go/Python that have proper debugger support, all that history has resulted in me relying mainly on print statements. Also, my print statements are now more carefully structured to provide almost a narrative to help support/QA folks (and myself); I don't want them to suffer like I did due to a lack of well-designed log statements.
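
Something along these lines, as a toy sketch (the names and helpers are invented), where the messages read like a short narrative of what happened:

```python
import logging

logging.basicConfig(format="%(levelname)s %(message)s", level=logging.INFO)
log = logging.getLogger("billing")

# Stand-in helpers so the sketch runs on its own (purely hypothetical).
def lookup_card(customer_id):
    return {"last4": "4242"}

def submit_charge(card, amount):
    return True

def charge_customer(customer_id, amount):
    # Each message is a step in the story, so support/QA can follow
    # what happened without opening the code.
    log.info("Starting charge of %s for customer %s", amount, customer_id)
    card = lookup_card(customer_id)
    if card is None:
        log.warning("No card on file for customer %s; skipping charge", customer_id)
        return False
    log.info("Charging card ending in %s", card["last4"])
    ok = submit_charge(card, amount)
    log.info("Charge for customer %s %s", customer_id, "succeeded" if ok else "failed")
    return ok
```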

There's a general disdain for fancy tooling like IDEs. I have to wonder how a lot of these people would think of programming these days, with the rise of AI, ubiquitous code analysis, linting, formatting, CI, and packages. Even in the short amount of time that I've been programming, writing software has changed massively. I have to imagine it'd appear as a completely different enterprise to them.