But the definition of differentiable did not change, did it? ReLU is not differentiable everywhere (in other words, it is differentiable at all points except 0). Not nitpicking, just trying to improve my understanding.
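To make the point concrete, here is a small sketch (my own illustration, not from the article) of ReLU and the derivative convention autodiff frameworks typically apply: at 0, where the classical derivative does not exist, any value in [0, 1] is a valid subgradient, and frameworks simply pick one (commonly 0).

```python
def relu(x):
    return max(0.0, x)

def relu_grad(x):
    # The classical derivative is undefined at x == 0; returning 0 there
    # is a common subgradient convention, not a true derivative.
    return 1.0 if x > 0 else 0.0

print(relu(-2.0), relu(3.0))  # 0.0 3.0
print(relu_grad(0.0))         # 0.0 (a chosen subgradient)
```

In practice this single point almost never matters during training, which is why the looser usage of "differentiable" causes no trouble for ReLU networks.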
> But the definition of differentiable did not change, did it?
It did! The title of this article describes a blatantly non-differentiable function (the rasterizer) using the word "differentiable". This is indeed a new usage of the word, seen only since the advent of automatic differentiation a few years ago.
To be fair, AI researchers used strictly differentiable functions (required for back-propagation) until recently. For example, LeNet-5 uses the logistic function.
Only in 2011 did some smart-asses [1] :) experiment with rectifier units and discover that they work even better.
[1] Xavier Glorot, Antoine Bordes and Yoshua Bengio, "Deep Sparse Rectifier Neural Networks" (AISTATS 2011)
In their presentation they make the point that their method replaces the non-differentiable step function with a differentiable sigmoid, which is what makes the rasterizer differentiable.
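The general idea behind that substitution can be sketched as follows (a hypothetical illustration, not the authors' actual code): a steep sigmoid approximates the hard step while still having a nonzero derivative everywhere, so gradients can flow through it.

```python
import math

def step(x):
    # Hard edge: derivative is 0 everywhere it exists, so no gradient signal.
    return 1.0 if x > 0 else 0.0

def sigmoid(x, k=10.0):
    # Smooth surrogate: as the sharpness k grows, sigmoid(x, k) -> step(x),
    # but unlike step it is differentiable everywhere. k is an illustrative
    # parameter, not something from the article.
    return 1.0 / (1.0 + math.exp(-k * x))

def sigmoid_grad(x, k=10.0):
    s = sigmoid(x, k)
    return k * s * (1.0 - s)

print(step(0.5), sigmoid(0.5))   # hard vs. soft edge value
print(sigmoid_grad(0.0))         # nonzero gradient right at the edge
```

The trade-off is that the surrogate blurs the edge slightly, which is exactly the price paid for getting usable gradients out of the rasterizer.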