Deep-Learning-with-PyTorch

Tensor metadata: Size, offset, and stride

TIP To help build a solid understanding of the mechanics of tensors, it may be a good idea to grab a pencil and a piece of paper and scribble diagrams like the one in figure 3.5 as we step through the code in this section.

We can easily verify that the two tensors share the same storage

# In[32]:
id(points.storage()) == id(points_t.storage())

# Out[32]:
True

and that they differ only in shape and stride:

# In[33]:
points.stride()

# Out[33]:
(2, 1)

# In[34]:
points_t.stride()

# Out[34]:
(1, 2)

This tells us that increasing the first index by one in points (for example, going from points[0,0] to points[1,0]) skips along the storage by two elements, while increasing the second index (from points[0,0] to points[0,1]) skips along the storage by one. In other words, the storage holds the elements of the tensor sequentially, row by row.
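The stride rule above can be checked directly: the flat storage position of an element is the storage offset plus each index times the corresponding stride. Here is a minimal sketch (assuming PyTorch is installed; the values of points match the example used earlier in the chapter):

```python
import torch

# The 3x2 points tensor from earlier in the chapter
points = torch.tensor([[4.0, 1.0], [5.0, 3.0], [2.0, 1.0]])

# Moving one step along dim 0 jumps stride()[0] elements in storage;
# moving along dim 1 jumps stride()[1] elements.
i, j = 1, 0
offset = (points.storage_offset()
          + i * points.stride()[0]
          + j * points.stride()[1])

# The element at [i, j] lives at that flat position in the storage
print(offset)                        # flat storage index of points[1, 0]
print(points.storage()[offset])      # same value as points[1, 0]
```

With stride (2, 1) and storage offset 0, points[1, 0] sits at flat position 1 * 2 + 0 * 1 = 2 in the storage, which is exactly the row-by-row layout described above.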

We can transpose points into points_t, as shown in figure 3.6, by changing the order of the elements in the stride. After that, increasing the row (the first index of the tensor) skips along the storage by one, just as when we were moving along columns in points. This is the very definition of transposing: no new memory is allocated; transposing is obtained only by creating a new Tensor instance with a different stride ordering than the original.

Figure 3.6 Transpose operation applied to a tensor: the same storage read with stride (3, 1) (+3 to the next row, +1 to the next column) versus stride (1, 3) (+1 to the next row, +3 to the next column)
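A short sketch makes the no-copy claim concrete (assuming PyTorch is installed; points is the same tensor as above): the transpose swaps the stride entries, and because the storage is shared, a write through one view is visible through the other.

```python
import torch

points = torch.tensor([[4.0, 1.0], [5.0, 3.0], [2.0, 1.0]])
points_t = points.t()  # swaps the two dimensions; no data is copied

# Only the stride (and shape) differ between the two tensors
print(points.stride())    # (2, 1)
print(points_t.stride())  # (1, 2)

# Both views are backed by the same memory...
print(points.storage().data_ptr() == points_t.storage().data_ptr())  # True

# ...so mutating one is visible through the other
points[0, 1] = 10.0
print(points_t[1, 0])  # tensor(10.)
```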
