I disagree. Most of the time, I find a debugger just slows me down. It's super helpful in some cases, but good logs can often pinpoint a problem faster than a debugger can. Also, building in debug mode can change everything, so you may not even catch your bug, especially if it's concurrent in nature.
This idea that debugging with print statements is superior to using a debugger is simply false. Learn to use the debugger for your platform; it will pay huge dividends throughout your career.
I regularly see pairs of println debuggers debate and speculate while the guy with the debugger drills straight down to the issue and fixes it.
I think it is a fallacy to have to choose either printing or debugging. I forget where I read this, but the two techniques are fundamentally different: a debugger lets you stop execution and examine data structures at one point in time, while printing lets you accumulate a log of one particular data structure over a span of time. The techniques are complementary and effective on different kinds of problems.
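To make the distinction concrete, here is a minimal sketch in Java (the class and variable names are mine, purely for illustration): a print inside the loop accumulates the whole history of a value over time, while a breakpoint on the return line would only show its final state.

```java
import java.util.ArrayList;
import java.util.List;

public class TraceVsSnapshot {
    // accumulated "log": one sample of `total` per loop iteration
    static List<Integer> history = new ArrayList<>();

    static int sum(int[] xs) {
        int total = 0;
        for (int x : xs) {
            total += x;
            history.add(total); // printf-style: record the value over time
        }
        return total; // a breakpoint here sees only this one final state
    }

    public static void main(String[] args) {
        int result = sum(new int[]{3, 1, 4});
        System.out.println(history + " -> " + result); // prints "[3, 4, 8] -> 8"
    }
}
```

Which view is more useful depends on the bug: a corrupt value at a known moment favors the snapshot, a value that drifts over many iterations favors the log.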
Yep. They call it “tracing” in debugging, and a good debugger can directly capture and log the values of any variable at a particular line of code.
By using logging instead, you’re reimplementing years of good work done by engineers before you.
First you have to add the print statements (which become useless clutter afterwards), and after decades of programming I find a debugger vastly superior to printing when it comes to actual 'debugging'. (Back in the day, BASIC didn't even have a debugger.)
Also, printing is utterly useless for highly concurrent code, as printing alters memory visibility and usually adds global synchronization.
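A small sketch of that hidden synchronization, assuming OpenJDK's `PrintStream`, whose `println(String)` holds the stream's monitor for the entire write: concurrent printlns never interleave within a line, which is convenient, but it also means every printing thread contends on one shared lock, and that lock can reorder or hide exactly the race you were chasing.

```java
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;
import java.util.List;

public class PrintLockDemo {
    static List<String> run() {
        PrintStream original = System.out;
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        System.setOut(new PrintStream(buf, true)); // capture stdout
        try {
            Runnable printer = () -> {
                // each thread prints 200 copies of its own 20-char line
                String line = Thread.currentThread().getName().repeat(20);
                for (int i = 0; i < 200; i++) {
                    System.out.println(line); // takes the PrintStream's monitor
                }
            };
            Thread a = new Thread(printer, "A");
            Thread b = new Thread(printer, "B");
            a.start();
            b.start();
            a.join();
            b.join();
        } catch (InterruptedException e) {
            throw new RuntimeException(e);
        } finally {
            System.setOut(original); // restore stdout
        }
        return List.of(buf.toString().split("\\R"));
    }

    public static void main(String[] args) {
        // Every captured line is all-As or all-Bs: the hidden lock kept each
        // println atomic, serializing the two threads at every call.
        boolean intact = run().stream()
                .allMatch(s -> s.chars().distinct().count() == 1);
        System.out.println(intact ? "no line was ever interleaved"
                                  : "lines interleaved");
    }
}
```

The lock acquisition is also a memory barrier, which is why adding a println sometimes makes a visibility bug "go away."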
This would depend on the language/compiler/linker. Take Java (which the article is about): attaching a debugger does nothing prior to adding a breakpoint.
The breakpoint would cause the method to be deoptimized and executed in the interpreter; removing the breakpoint would allow the method to be optimized again.
Now, obviously, while stepping in, the thread is blocked and not highly concurrent. However, print statements hamper the concurrency just the same.
I think there are two use cases here: I was referring to debugging during development, while a lot of replies are about troubleshooting an active prod system.
Of course we all hope for well-thought-out logging to troubleshoot issues we're seeing in prod.
I'm referring to a pattern I see with junior devs who simply use "printf debugging" in development instead of learning to use a debugger properly, even with distributed systems.
It depends on whether I have easy access to a debugger. It’s extra work to hook things up to one, and certain restrictions may apply (we may attach too late if the process launch is not under our control, the program may behave differently, etc.). If I do have access to a debugger, often I will just do “printf debugging” there by setting a breakpoint and adding an action to “p someVariable; c”. Usually I treat my debugger as a sort of IPython for statically compiled languages, to mess around with and inspect values as programs are executing.
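For anyone unfamiliar with the trick, it looks roughly like this in lldb (the file, line number, and variable name here are placeholders): set a breakpoint, attach commands that print and continue, and the debugger becomes a logger that never stops the program.

```
(lldb) breakpoint set --file main.c --line 42
(lldb) breakpoint command add 1
> p someVariable
> c
> DONE
```

gdb offers the same thing via `commands` on a breakpoint, and many IDE debuggers expose it as a non-suspending "logpoint."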
Again, attaching a debugger is occasionally not helpful. For example, if you're trying to figure out why your program isn't loading certain plugins at launch, attaching the debugger may happen after that step has already occurred, so you don't get to debug the process at all.
Or if an issue happens in your staging environment but not locally. That happened to me just yesterday, and a simple print statement gave me the information I needed to resolve the issue.
I probably could have attached a remote debugger, and executed the relevant function a few times until my request got routed to the right process in the cluster, but that honestly would have taken me more time than just committing the print statement and letting CI take it away.
Exactly, and in the replies it is really obvious what is coloring the debate.
In many cases there is no debugger available for someone's favorite platform, and so they "hate debugging". Go programmers, JavaScript people (where you can't do client-to-server debugging, but really, really have to), ...
How many Java programmers don't use debuggers? How many C# developers? Those languages have excellent debuggers. Python and C/C++ are decent at best; Go, JavaScript, ... have dismal debugging support.
Fair enough, but in a dev environment, attaching multiple debuggers across several interacting components with conditional breakpoints gets me there faster than incrementally inserting progressively more print statements. Proper logging is a given for working out problems in a production system, which you then verify and correct in dev.
Sometimes having a debugger on prod is the right answer.
"The Remote Agent software, running on a custom port of Harlequin Common Lisp, flew aboard Deep Space 1 (DS1), the first mission of NASA's New Millennium program. Remote Agent controlled DS1 for two days in May of 1999. During that time we were able to debug and fix a race condition that had not shown up during ground testing. (Debugging a program running on a $100M piece of hardware that is 100 million miles away is an interesting experience. Having a read-eval-print loop running on the spacecraft proved invaluable in finding and fixing the problem. The story of the Remote Agent bug is an interesting one in and of itself.)

"The Remote Agent was subsequently named NASA 'Software of the Year'."
No one said otherwise, and the comment you replied to specifies "in a dev environment". The language you are using is unnecessarily combative: this is likely to inhibit the adoption of your ideas.