Torch logical indexing of a tensor

I am looking for an elegant way to select a subset of the torch tensor that satisfies some restrictions. For example, let's say I have:

A = torch.rand(10,2)-1

and S, a 10x1 tensor. Then

sel = torch.ge(S,5) -- this is a ByteTensor

I would like to be able to do logical indexing as follows:

A1 = A[sel]

But that does not work. There is a function index that accepts a LongTensor, but I could not find an easy way to convert sel to a LongTensor, except for the following:

sel = torch.nonzero(sel)

which returns a K x 2 tensor (where K is the number of entries with S >= 5). So I have to convert it to a 1-dimensional array, which finally allows me to index A:

A:index(1,torch.squeeze(sel:select(2,1)))
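Put together, the workaround described above looks like this; a minimal sketch assuming Torch7, with S filled so that the threshold of 5 from the example is meaningful:

```lua
-- Torch7 sketch of the full workaround: mask -> indices -> index().
require 'torch'

local A = torch.rand(10, 2)          -- the data
local S = torch.rand(10, 1) * 10     -- scores in [0, 10)

local sel  = torch.ge(S, 5)          -- ByteTensor mask, 10x1
local idx  = torch.nonzero(sel)      -- K x 2 LongTensor of (row, col) indices
local rows = torch.squeeze(idx:select(2, 1))  -- K-element LongTensor of rows

local A1 = A:index(1, rows)          -- rows of A where S >= 5
print(A1)
```

Note that this fails if no entry satisfies the condition, since torch.nonzero then returns an empty tensor.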

This is very cumbersome. In MATLAB, by comparison, all I need to do is:

A(S>=5,:)

Can anyone suggest a better way?

Answer:

sel = S:ge(5):expandAs(A)   -- now you can use this mask with the [] operator
A1 = A[sel]:unfold(1, 2, 2) -- unfold to get back a 2D tensor

For example:

> A = torch.rand(3,2)-1
-0.0047 -0.7976
-0.2653 -0.4582
-0.9713 -0.9660
[torch.DoubleTensor of size 3x2]

> S = torch.Tensor{{6}, {1}, {5}}
 6
 1
 5
[torch.DoubleTensor of size 3x1]

> sel = S:ge(5):expandAs(A)
1  1
0  0
1  1
[torch.ByteTensor of size 3x2]

> A[sel]
-0.0047
-0.7976
-0.9713
-0.9660
[torch.DoubleTensor of size 4]

> A[sel]:unfold(1, 2, 2)
-0.0047 -0.7976
-0.9713 -0.9660
[torch.DoubleTensor of size 2x2]

Another answer:

  • Use maskedSelect:

    result = A:maskedSelect(your_byte_tensor)

  • Or multiply elementwise by the mask (converted to the same type as A so that cmul accepts it):

    result = torch.cmul(A, S:ge(5):double():expandAs(A))

The second option keeps the shape of A (non-selected entries become zero), which can be useful for backprop. The first option uses a ByteTensor mask and returns only the selected values as a flat 1-D tensor, so you may need to reshape the result afterwards.
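A short sketch contrasting the two options; this assumes Torch7, and uses expandAs because maskedSelect expects a mask with the same shape as A:

```lua
require 'torch'

local A = torch.rand(3, 2)
local S = torch.Tensor{{6}, {1}, {5}}

-- Option 1: maskedSelect returns the selected values as a flat 1-D tensor.
local mask = S:ge(5):expandAs(A)       -- ByteTensor, same shape as A
local flat = A:maskedSelect(mask)      -- 1-D tensor with 4 values (rows 1 and 3)

-- Option 2: cmul keeps the 3x2 shape of A, zeroing the non-selected rows.
-- The mask is converted to DoubleTensor to match A's type.
local zeroed = torch.cmul(A, S:ge(5):double():expandAs(A))

print(flat)
print(zeroed)
```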
