
But the definition of differentiable did not change, did it? ReLU is not differentiable (or, in other words, it is differentiable at every point except 0). Not nitpicking, just trying to improve my understanding.
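
As far as I can tell, autodiff frameworks simply pick a subgradient at 0 by convention. A quick PyTorch check, as one example:

    import torch

    x = torch.tensor(0.0, requires_grad=True)
    torch.relu(x).backward()
    print(x.grad)  # tensor(0.) -- at x=0 PyTorch defines the "derivative" to be 0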


> But the definition of differentiable did not change, did it?

It did! The title of this article describes a blatantly non-differentiable function (the rasterizer) with the word "differentiable". This is indeed a new usage of the term, seen only since the advent of automatic differentiation a few years ago.


To be fair, AI researchers used strictly differentiable functions (which back-propagation requires) until recently. For example, LeNet-5 uses the logistic function.

Only in 2011 did some smart-asses [1] :) experiment with rectifier units and discover that they work even better (toy sketch below).

[1] Xavier Glorot, Antoine Bordes and Yoshua Bengio, "Deep Sparse Rectifier Neural Networks" (2011)
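
For a sense of why rectifiers caught on, here is a toy sketch (my own illustration, not code from the paper): the logistic function saturates, so its gradient vanishes for inputs far from 0, while the rectifier's gradient stays at exactly 1 on its active side:

    import torch

    x = torch.linspace(-10, 10, 5, requires_grad=True)

    # Logistic (sigmoid) activation: gradient shrinks toward 0 for large |x|
    torch.sigmoid(x).sum().backward()
    print(x.grad)   # tiny values at the ends: the saturation problem

    x.grad = None
    # Rectifier: gradient is exactly 1 wherever the unit is active
    torch.relu(x).sum().backward()
    print(x.grad)   # 0s and 1s, no saturation on the positive side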


They make the point in their presentation that their method replaces the non-differentiable step function with a differentiable sigmoid, which is what makes the rasterizer differentiable.
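
Roughly the idea, as a minimal sketch with made-up names (this is not the authors' actual code): replace the hard inside/outside test on a pixel's signed distance with a sigmoid, so coverage varies smoothly with the geometry and gradients can flow back through the rasterizer:

    import torch

    def hard_coverage(signed_dist):
        # Classic rasterization: 1 inside, 0 outside -- gradient is 0 almost everywhere
        return (signed_dist > 0).float()

    def soft_coverage(signed_dist, sharpness=20.0):
        # Sigmoid relaxation: smooth in signed_dist, so d(coverage)/d(geometry) exists
        return torch.sigmoid(sharpness * signed_dist)

    d = torch.tensor(0.05, requires_grad=True)  # pixel's signed distance to an edge
    soft_coverage(d).backward()
    print(d.grad)   # nonzero: the pixel "feels" the edge moving

As the sharpness grows, the soft coverage approaches the hard step, which is the usual trade-off with this kind of relaxation.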


How is it non-differentiable? They explicitly point out that after prefiltering (convolution with a pixel filter), it is differentiable.

Moreover, these derivatives can be obtained via methods like Automatic Differentiation, so they are true derivatives, not finite difference approximations: https://en.wikipedia.org/wiki/Automatic_differentiation
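
To make that distinction concrete, a small sketch (any smooth function works here): automatic differentiation applies the chain rule to the program itself and returns the exact derivative, whereas a finite difference only approximates it:

    import torch

    def f(x):
        return torch.sin(x) * x**2

    x = torch.tensor(1.3, requires_grad=True)

    # Automatic differentiation: exact chain-rule derivative of the program
    (ad,) = torch.autograd.grad(f(x), x)

    # Central finite difference: approximation with truncation/round-off error
    h = 1e-4
    fd = (f(x + h) - f(x - h)) / (2 * h)

    print(ad.item(), fd.item())  # agree to several digits, but only AD is exact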



