
What I think many people fail to realize is that a large part of the code as a C compiler sees it is machine-generated. While I certainly do not tend to literally write out `if (1 != 0)` conditionals in my code, these and other similarly trivial expressions will be introduced en masse after macro expansion and aggressive inlining.
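To make that concrete, here is a minimal sketch (the `LOG_AT` macro and `LEVEL` constant are hypothetical, not from any real codebase) of how a perfectly ordinary macro produces exactly those trivial, constant conditionals after expansion:

```c
/* Hypothetical logging macro: LEVEL is a compile-time constant, so after
 * preprocessing each call site contains a condition the compiler can fold
 * to a constant, just like a hand-written `if (1 != 0)`. */
#define LEVEL 1
#define LOG_AT(lvl, counter) do { if ((lvl) <= LEVEL) (counter)++; } while (0)

int count_logged(void) {
    int logged = 0;
    LOG_AT(1, logged);  /* expands to: if ((1) <= 1) (logged)++;  always true  */
    LOG_AT(2, logged);  /* expands to: if ((2) <= 1) (logged)++;  always false */
    return logged;
}
```

Nobody would call the expanded `if ((2) <= 1)` a deliberate check worth preserving; it is simply dead code the compiler is expected to remove.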

And in this context of macro-expanded, inlined code, exploiting the assumption that no undefined behavior occurs starts to make a lot more sense. Sure, if you consider something like

    int x = *ptr;
    if (ptr != NULL) {
        *ptr = 1;
    }
you may not unreasonably say that, since there is an explicit ptr != NULL check in there, that check is probably what the programmer intended -- so the compiler should not fold ptr != NULL to a constant true and drop the check merely because the pointer was already dereferenced.

However if the real code is

    int x = *ptr;
    do_something_with_ptr(ptr);
and the do_something_with_ptr() function just happens to contain a NULL check at some point (because it was written for a more generic purpose, not for use at just this call site), then not dropping that NULL check becomes a very significant missed optimization opportunity.
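A minimal sketch of that situation (this particular body for `do_something_with_ptr()` is my illustration, not anything from the original example):

```c
#include <stddef.h>

/* Hypothetical generic helper: the NULL check exists because the function
 * is also used in contexts where the pointer may legitimately be NULL. */
static void do_something_with_ptr(int *ptr) {
    if (ptr != NULL) {
        *ptr = 1;
    }
}

int caller(int *ptr) {
    int x = *ptr;               /* dereference: the compiler may assume ptr != NULL */
    do_something_with_ptr(ptr); /* after inlining, the NULL check is provably
                                   redundant here and can be dropped */
    return x;
}
```

After inlining, `caller()` contains exactly the pattern from the first snippet, and keeping the check would mean a branch on every call for no benefit.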

If the compiler declined to optimize this out of consideration for the possibility that the code was hand-written like the former example, that cost would fall on the programmer. They might have to split do_something_with_ptr() into two variants, one with the NULL check and one without. Or they might have to add explicit compiler assumption / unreachability annotations.
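The annotation workaround might look like this sketch, using the GCC/Clang extension `__builtin_unreachable()` (the function names are again my hypothetical illustrations):

```c
#include <stddef.h>

/* Hypothetical generic helper, as before. */
static void do_something_with_ptr(int *ptr) {
    if (ptr != NULL)
        *ptr = 1;
}

void caller_annotated(int *ptr) {
    if (ptr == NULL)
        __builtin_unreachable();  /* manual promise to the compiler: ptr is
                                     never NULL here, so the inlined check
                                     may be removed */
    do_something_with_ptr(ptr);
}
```

That is extra annotation burden at every such call site, which the UB-based assumption currently provides for free.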

The whole argument cuts both ways: performing optimizations that break already-broken code has a cost (people have to fix that code), and so does declining those optimizations for working code (people have to optimize it by hand).


