
Pytorch: how do I make sure the model output is not 0 or negative?


I need to calculate a loss on my model, and the loss requires the logarithm of the model's output. (This is for an actor-critic model, for those who want to know.) The network uses ReLU and softmax to keep the values from getting too high or going negative, but they are sometimes exactly 0, which is a problem because I cannot take the log of that.

What can I do to avoid this?

I tried using a custom ReLU function, but for some reason it does not work.

I also tried increasing the value by 0.01 whenever it is 0, but then I get an error complaining about an in-place change.
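
To make this concrete, here is a small standalone sketch of what I mean (the logits and the 1e-8 floor are made-up values, and torch.clamp is just one out-of-place way to apply the same kind of floor as the +0.01 bump):

import torch

# Made-up logits with a large gap: softmax underflows one entry to exactly 0.
logits = torch.tensor([60.0, -60.0, 0.0], requires_grad=True)
P = torch.softmax(logits, dim=0)
print(P)  # the second entry comes out as exactly 0.0

# What I tried (bumping the zeros in place) -- backward() then complains
# that a tensor needed for gradient computation was changed in place:
# P[P == 0] += 0.01

# Out-of-place alternative: build a new tensor instead of editing P.
P_safe = torch.clamp(P, min=1e-8)
loss = -torch.log(P_safe).sum()
loss.backward()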

The loss function looks like this, where P is the output of the model, eta and value_constant are some unimportant constants, and a[t] is the action at time t. Those details don't matter much; the important part is that the output P must not be 0.0.

x = self.eta * P * torch.log(P)
theta_loss += -value_constant * torch.log(P[a[t]]) + torch.sum(x)
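
To show why a single zero entry ruins the whole loss, here is a tiny standalone check with placeholder numbers (eta, value_constant and the action index are made up, not my real values):

import torch

P = torch.tensor([0.7, 0.3, 0.0])           # one action ended up with probability exactly 0
eta, value_constant, action = 0.01, 1.0, 2  # placeholder values

x = eta * P * torch.log(P)                  # last entry is 0 * log(0) = 0 * -inf = nan
theta_loss = -value_constant * torch.log(P[action]) + torch.sum(x)
print(theta_loss)                           # tensor(nan), so backward() would be useless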

This is the custom ReLU function:

class MyReLU(torch.autograd.Function):
    @staticmethod
    def forward(ctx, inp):
        ctx.save_for_backward(inp)
        # out = torch.zeros_like(inp).cuda()
        # out[inp > 0.01] = inp
        # floor every value below 0.01 at 0.01 so the log never sees 0
        return torch.where(inp < 0.01, 0.01, inp)

    @staticmethod
    def backward(ctx, grad_output):
        inp, = ctx.saved_tensors
        # grad_input = grad_output.clone()
        # grad_input[inp < 0.01] = 0
        # gradient is 0 where the input was at or below the floor, 1 elsewhere
        grad = torch.where(inp <= 0.01, 0.0, 1)
        return grad_output * grad
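
In case it helps to reproduce: custom autograd Functions are called via .apply, so I exercise the class like this (the input tensor is made up):

import torch

x = torch.tensor([-1.0, 0.005, 0.5], requires_grad=True)  # made-up input

out = MyReLU.apply(x)  # custom Functions are used via .apply, not called directly
out.sum().backward()

print(out)     # intended: every value below 0.01 floored at 0.01
print(x.grad)  # intended: gradient 0 where the floor applied, 1 elsewhere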
