When is a custom pytorch function (not just a module) needed?

PyTorch newbie here! Consider the following custom module:

import torch
import torch.nn as nn

class Testme(nn.Module):
    def __init__(self):
        super(Testme, self).__init__()

    def forward(self, x):
        return x / torch.max(x).expand_as(x)

As far as I understand the documentation, this could also be implemented as a custom Function. A subclass of Function requires a backward() method, but a Module does not. Also, in the doc example for Linear, the Module depends on the Linear Function:

class Linear(nn.Module):
    def __init__(self, input_features, output_features, bias=True):
        ...    
    def forward(self, input):
        return Linear()(input, self.weight, self.bias)

Question: I do not understand the relationship between Module and Function. In the first listing above (the module Testme), should there be an associated Function? If not, then it is possible to implement this without a backward method by subclassing Module, so why does Function always require a backward method?

Perhaps Functions are intended only for operations that are not composed of existing torch functions? Put differently: are Modules preferable whenever forward can be built entirely from existing torch functions?


This information is gathered and summarised from the official PyTorch Documentation.

torch.autograd.Function really lies at the heart of the autograd package in PyTorch. Any graph you build in PyTorch, and any operation you perform on Variables in PyTorch, is based on a Function. A Function requires an __init__(), forward() and backward() method (see more here: http://pytorch.org/docs/notes/extending.html). This is what enables PyTorch to compute results and to compute gradients for Variables.

nn.Module(), in contrast, is really just a convenience for organising your model, your different layers, and so on. For example, it collects all the trainable parameters of your model in .parameters(), and it makes it easy to add another layer to a model. It is not the place to define a backward method, because in forward() you are expected to use operations built from Function() subclasses, for which backward() has already been defined. So once you have specified the order of operations in forward(), PyTorch already knows how to back-propagate gradients.
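To illustrate the point above, here is a minimal sketch (the `Scale` module and its contents are my own toy example, not from the original post): a Module whose forward is composed only of existing differentiable torch operations needs no backward() of its own, and its parameter is picked up by .parameters() automatically.

```python
import torch
import torch.nn as nn

class Scale(nn.Module):
    """Toy module: forward uses only built-in differentiable ops,
    so autograd derives backward for us."""
    def __init__(self):
        super().__init__()
        # trainable parameter, automatically collected by .parameters()
        self.weight = nn.Parameter(torch.ones(3))

    def forward(self, x):
        # autograd records these ops; no custom backward() needed
        return (self.weight * x).sum()

m = Scale()
x = torch.tensor([1.0, 2.0, 3.0])
loss = m(x)
loss.backward()          # gradients flow without any hand-written backward
print(m.weight.grad)     # d(sum(w*x))/dw = x
```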

Now, when should you use which?

If your operation is just a composition of functions already implemented in PyTorch (like the Testme above), there is really no point in subclassing Function(): you can simply stack the operations and build a dynamic graph. It is, however, a sensible idea to bunch those operations together. If the operations involve trainable parameters (for example, the linear layer of a neural network), you should subclass nn.Module() and bunch the operations together in the forward method; that gives you easy access to the parameters (as outlined above), e.g. for use with torch.optim. If there are no trainable parameters, I would probably still bunch the operations together, but a plain Python function, where you take care of instantiating each operation you use, would be fine too.
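For the parameter-free case, both options look like this (a sketch; `max_normalize` and `MaxNormalize` are hypothetical names wrapping the questioner's Testme logic):

```python
import torch
import torch.nn as nn

# Option 1: no trainable parameters -> a plain Python function is enough
def max_normalize(x):
    return x / torch.max(x).expand_as(x)

# Option 2: wrap the same ops in an nn.Module so it composes with
# nn.Sequential etc.; still no backward() needed
class MaxNormalize(nn.Module):
    def forward(self, x):
        return x / torch.max(x).expand_as(x)

x = torch.tensor([1.0, 2.0, 4.0])
print(max_normalize(x))    # -> [0.25, 0.5, 1.0]
print(MaxNormalize()(x))   # same result
```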

If, however, you have a genuinely new operation (say, a new stochastic layer with an exotic sampling procedure), you should subclass Function() and define __init__(), forward() and backward() to tell PyTorch how to compute results and how to compute gradients when this operation is used. Afterwards, you can either create a functional version that takes care of instantiating the Function, or create a Module if your operation has trainable parameters.

