This is a prequel to my previous blog post, My first deep learning model using PyTorch. In this tutorial I will cover basic know-how about PyTorch tensors.

PyTorch is a Python-based deep learning framework developed by Facebook's AI Research lab. It has gained popularity because of its flexibility and speed, and it has been integrated with cloud platforms including Amazon's SageMaker, Google's GCP and Azure Machine Learning service.

So let's start by understanding what exactly tensors are!

What are tensors?

Tensors are containers that can hold data in N dimensions. A general tensor of rank R in N-dimensional space has N^R components, where R is the rank and N is the number of dimensions. From this perspective, a rank-2 tensor (requiring N^2 numbers to describe) is equivalent to an N×N matrix. A scalar can be considered a rank-0 tensor, and a vector a rank-1 tensor.
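
As a quick illustration, PyTorch exposes the rank of a tensor through its `ndim` attribute:

```python
import torch

scalar = torch.tensor(3.14)                  # rank 0: a single number
vector = torch.tensor([1., 2., 3.])          # rank 1: a 1-D array
matrix = torch.tensor([[1., 2.], [3., 4.]])  # rank 2: a 2-D array

print(scalar.ndim, vector.ndim, matrix.ndim)  # 0 1 2
```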


Tensors and NumPy

Tensors are similar to NumPy's ndarrays. PyTorch positions itself as a replacement for NumPy that can use the power of GPUs: by copying an array to GPU memory, performance improves thanks to parallel computing.
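
Moving a tensor to the GPU is a one-line operation. A minimal sketch, guarding on GPU availability so it also runs on CPU-only machines:

```python
import torch

t = torch.rand(3, 3)
if torch.cuda.is_available():  # move to GPU memory only if a GPU is present
    t = t.to("cuda")
print(t.device)                # cpu, or cuda:0 when a GPU is available
```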

1. Tensor Creation

In the tensor creation documentation by PyTorch you will find multiple ways to create tensors. Here, I am discussing a few popular methods.

**a) torch.tensor** — This is used to create a tensor from pre-existing data.
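
For example, passing a nested Python list produces a 2-D tensor, with the dtype inferred from the data:

```python
import torch

t = torch.tensor([[1, 2], [3, 4]])  # build a tensor from a nested list
print(t)
# tensor([[1, 2],
#         [3, 4]])
print(t.dtype)  # torch.int64 (inferred from the integer data)
```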


**b) torch.zeros** — This method returns a tensor filled with zeros, with the shape defined by the arguments.
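
For example:

```python
import torch

z = torch.zeros(2, 3)  # 2 rows, 3 columns, filled with zeros
print(z)
# tensor([[0., 0., 0.],
#         [0., 0., 0.]])
print(z.dtype)  # torch.float32 (the default floating-point dtype)
```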


**c) torch.rand** — This method returns a tensor filled with random numbers from a uniform distribution on the interval [0, 1). Similarly, torch.randn can be used for a standard normal distribution.
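
For example (the exact values will differ on each run, since they are random):

```python
import torch

u = torch.rand(2, 3)   # uniform samples on [0, 1)
n = torch.randn(2, 3)  # samples from a standard normal distribution
print(u)
print(n)
```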


**d) torch.from_numpy** — This creates a tensor from a NumPy ndarray. The returned tensor and the ndarray share the same memory, so modifications to the tensor will be reflected in the ndarray and vice versa.
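
The memory sharing can be seen by mutating the array after the tensor is created:

```python
import numpy as np
import torch

a = np.array([1, 2, 3])
t = torch.from_numpy(a)  # shares memory with a, no copy is made
a[0] = 99                # modify the ndarray ...
print(t)                 # tensor([99,  2,  3]) — the change shows up in the tensor
```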


PyTorch aims to be an efficient library for computation. It avoids memory copies whenever it can, and that is the core difference between creating a tensor from a NumPy array with torch.from_numpy versus torch.tensor: the former shares memory with the source array, while the latter always copies the data.
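
In contrast to torch.from_numpy, torch.tensor makes its own copy, so later changes to the source array do not affect it:

```python
import numpy as np
import torch

a = np.array([1, 2, 3])
copied = torch.tensor(a)  # torch.tensor copies the data
a[0] = 99
print(copied)  # tensor([1, 2, 3]) — unaffected by the change to a
```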

Other methods like torch.randn, torch.as_tensor, torch.empty, etc. can be explored for tensor creation too. You can refer to the full list here.

2. Tensor dimensions

To view the shape of a tensor in PyTorch, both size() and shape can be used. shape is an alias for size(), added to match NumPy closely; it's what NumPy users are accustomed to.
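
For example:

```python
import torch

t = torch.zeros(2, 3, 4)
print(t.size())  # torch.Size([2, 3, 4])
print(t.shape)   # torch.Size([2, 3, 4]) — same result via the alias
```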


Notice that size() and shape produce the same output.

3. Transforming tensor shapes

To change the shape of a tensor, view and reshape can be explored. Let's understand both of these methods in more detail.
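
A minimal sketch of the difference: view reinterprets the same underlying memory and therefore requires a contiguous tensor, while reshape behaves like view when it can but falls back to copying the data when the tensor is not contiguous.

```python
import torch

t = torch.arange(6)   # tensor([0, 1, 2, 3, 4, 5])
v = t.view(2, 3)      # no copy: a new 2x3 view of the same memory
r = t.reshape(3, 2)   # like view here, since t is contiguous
print(v)
print(r)

# view never copies, so it fails where reshape succeeds:
nc = v.t()                     # transpose -> a non-contiguous tensor
print(nc.is_contiguous())      # False
print(nc.reshape(6))           # works (copies the data if needed)
# nc.view(6) would raise a RuntimeError here
```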

#tensor #autograd #pytorch #deep-learning #neural-networks #deep learning

Lighting the PyTorch tensors