A differentiable activation function makes the function computed by a neural network differentiable, so the backpropagation algorithm can be applied to it.

Correct Answer: True
A differentiable activation function makes the function computed by a neural network differentiable (assuming that the integration function at each node is simply the sum of its inputs), since the network computes only compositions of these functions, and a composition of differentiable functions is itself differentiable. The error function therefore also becomes differentiable, so its gradient with respect to the weights can be computed by the chain rule.
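The chain-rule argument can be illustrated with a minimal sketch: a hypothetical two-weight network using the sigmoid activation (whose derivative is s·(1−s)), where the analytic gradient obtained by composing derivatives matches a numerical finite-difference estimate. The network shape and parameter values here are illustrative assumptions, not taken from the original text.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, w1, w2):
    # Tiny chain: sum-of-inputs integration, sigmoid activation at each node.
    # f(x) = sigmoid(w2 * sigmoid(w1 * x))
    h = sigmoid(w1 * x)
    y = sigmoid(w2 * h)
    return h, y

def grad_w1(x, w1, w2):
    # Chain rule through the composition, using sigmoid'(z) = s * (1 - s):
    # dy/dw1 = y*(1-y) * w2 * h*(1-h) * x
    h, y = forward(x, w1, w2)
    return y * (1 - y) * w2 * h * (1 - h) * x

# Check the analytic gradient against a central finite difference.
x, w1, w2 = 0.5, 0.3, -0.7
eps = 1e-6
numeric = (forward(x, w1 + eps, w2)[1] - forward(x, w1 - eps, w2)[1]) / (2 * eps)
print(abs(grad_w1(x, w1, w2) - numeric))
```

Because every factor in the chain exists (each function is differentiable), the analytic and numerical gradients agree; a non-differentiable activation would leave some factor undefined at the kink.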