Question:
Activation functions
Author: Christian N

Answer:
(a) is a step function (threshold function).
(b) is a rectified linear function: ReLU(x) = max(0, x). The smooth, everywhere-differentiable version of ReLU is called softplus: softplus(x) = log(1 + e^x).
Changing the bias weight w0,i moves the threshold location.
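The following is a minimal sketch (not part of the original answer) of the three activation functions mentioned above, written with NumPy; the function names and the sample inputs are illustrative assumptions.

```python
import numpy as np

def step(x, threshold=0.0):
    # Hard threshold (step) function: outputs 1 once the input reaches
    # the threshold, else 0. Shifting the threshold corresponds to
    # changing the bias weight described in the answer.
    return np.where(x >= threshold, 1.0, 0.0)

def relu(x):
    # Rectified linear unit: max(0, x)
    return np.maximum(0.0, x)

def softplus(x):
    # Smooth, everywhere-differentiable version of ReLU: log(1 + e^x)
    # np.log1p improves numerical accuracy for small exp(x)
    return np.log1p(np.exp(x))

if __name__ == "__main__":
    xs = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print("step:    ", step(xs))
    print("ReLU:    ", relu(xs))
    print("softplus:", softplus(xs))
```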