Here, we’ll implement a made-up activation function that we’ll call the Rectified Quadratic Unit (ReQU). Like the sigmoid, ReLU, and several others, it is applied element-wise to all of its inputs:
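Written out, the forward map implemented by the code below is

\[
\mathrm{ReQU}(x) = \begin{cases} x^2 & \text{if } x > 0, \\ 0 & \text{otherwise,} \end{cases}
\]

i.e. a ReLU followed by an element-wise square.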
require 'nn'

-- ReQU(x) = x^2 for x > 0, and 0 otherwise.
local ReQU = torch.class('nn.ReQU', 'nn.Module')

function ReQU:updateOutput(input)
  -- Copy the input, clamp negative entries to zero, then square in place.
  self.output:resizeAs(input):copy(input)
  self.output[torch.lt(self.output, 0)] = 0
  self.output:pow(2)
  return self.output
end
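For the backward pass we need the local derivative of this map. By the chain rule, each element of gradInput is the matching element of gradOutput scaled by that derivative:

\[
\frac{\partial L}{\partial x_i} = \frac{\partial L}{\partial y_i} \cdot \frac{\partial y_i}{\partial x_i}
= \begin{cases} \dfrac{\partial L}{\partial y_i} \, 2 x_i & \text{if } x_i > 0, \\ 0 & \text{otherwise.} \end{cases}
\]

This is exactly what updateGradInput computes below: zero the entries where the input was negative, then multiply the rest by twice the input.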
function ReQU:updateGradInput(input, gradOutput)
  -- Chain rule: gradInput = gradOutput * 2 * input where input > 0, else 0.
  self.gradInput:resizeAs(gradOutput):copy(gradOutput)
  self.gradInput[torch.lt(input, 0)] = 0
  self.gradInput:mul(2):cmul(input)
  return self.gradInput
end
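The gradient computed by updateGradInput can be verified numerically. Here is a standalone NumPy sketch (not Torch; the helper names `requ_forward` and `requ_backward` are my own) that mirrors the two passes and compares the analytic gradient against centered finite differences:

```python
import numpy as np

def requ_forward(x):
    # ReQU(x) = x^2 for x > 0, else 0
    return np.where(x > 0, x * x, 0.0)

def requ_backward(x, grad_output):
    # Local derivative 2x for x > 0, else 0; chain rule scales by grad_output
    return np.where(x > 0, 2.0 * x, 0.0) * grad_output

# Fixed test point, kept away from the kink at 0 so the finite
# difference is well behaved.
x = np.array([-1.5, -0.3, 0.2, 0.8, 1.7, -0.9, 2.4, 0.05, -2.0, 1.1])
grad_out = np.array([0.4, -1.2, 0.7, 0.1, -0.5, 0.9, -0.3, 1.1, 0.6, -0.8])

eps = 1e-6
numeric = np.array([
    (requ_forward(x + eps * e) - requ_forward(x - eps * e)).dot(grad_out) / (2 * eps)
    for e in np.eye(x.size)
])
analytic = requ_backward(x, grad_out)
print(np.max(np.abs(numeric - analytic)))  # prints a tiny finite-difference error
```

This is the same style of check that a gradient checker for an nn module performs: perturb one input entry at a time and compare the directional change in the loss against the analytic gradInput.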