PHOTOOG Photography writings by Olivier Giroux


Deep C++ & Disassembly

The ability to read disassembly is an essential skill in one's bag of tricks, particularly for C++ developers. When the training wheels come off this powerful language, digging down its rabbit holes will very often take you to the machine interface.

Case in point: do you know what this line of source does?

x = 1;

What about this?

foo( x );

You might have a good guess, but you don't know. Not until you've seen more of the program source: at least everything up to the declaration of x and all declarations of foo, and potentially a great deal more if x is not a built-in type. Then consider Koenig lookup (argument-dependent lookup) and template overload resolution. Finally, pray nobody has a #define somewhere that overrides that type or function in a header far, far away.

Side Question: is that bad taste?

There is certainly a need for restraint when implementing operator overloading, and it is certainly a bad thing to write unobvious code. You don't want to give operator=(int) a semantic other than assignment, for example. But is defining an operator=(int) a Bad Thing(tm) in itself? Not (I believe) if your type looks like an int and quacks like an int - like a counter, or an odd-bit number, or an instrumented variable, etc...

Personal opinions on this vary greatly, and for that reason alone you are likely to encounter overloaded operators even if you despise them.

When badness happens, the disassembly answers all of your code structure questions. The compiler's choices are revealed in the harsh light of machine code.

So far there's been no tool really able to help with this. Intellisense can't look up operators, nor does it let you specify which template parameters to use (or which are relevant) when you look up inside a template. If you're willing to step through a lot of code, you will eventually see the answer in a stack display, but...

...then there's the topic of debugging optimized builds. This is something I do quite a bit of; I think it's actually my most common debugging scenario. I build software that takes 10~20x longer to run in the debug flavor, which means it could take many hours of heavy computation to reproduce a failure that takes a few minutes in the optimized build. When you break and step through an optimized build, odds are you'll be reading disassembly about half the time.

I dare not part with my disassembly reading glasses...

Filed under: Uncategorized
Comments (1)
  1. I definitely agree, and in the game industry this is often why we had an Internal build, which was basically Release + inlined debug symbols.

    Usually in Release builds, if you try to place a breakpoint, it either doesn’t get hit, or when it does and you try to step, it just starts running.

    In Internal builds, the breakpoints worked as expected, and when you hit F10 to step, it’d step (possibly not to the next line, because the optimizer might’ve reordered things), but if you opened the disassembly view and stepped, you’d step one asm instruction at a time, and you’d see your inlined instructions above.

    This also helps with the ‘what does the asm look like for this instruction’ question. The answer is: “It looks like what the internal build says it does.”
