In data manipulation, it is typical to reshape or reorder the original data, and each transformation can create a new copy. As the program grows, so does the memory it occupies, and I seldom think to worry about this until an Out Of Memory error happens.
The amazing thing about tensors is that multiple tensors can refer to the same storage (a contiguous chunk of memory holding numbers of a given type). This is managed by the torch.Storage class.
Each tensor has a .storage() method that shows us the tensor content as it is laid out in memory.
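As a minimal sketch of this sharing, the snippet below (assuming a standard PyTorch install) creates a tensor and a reshaped view of it, then checks that both point at the same memory, so a write through one is visible through the other:

```python
import torch

# A tensor and a view of it share the same underlying storage:
# reshaping does not copy the numbers.
t = torch.arange(6)   # tensor([0, 1, 2, 3, 4, 5])
v = t.view(2, 3)      # a 2x3 view over the same six elements

# Both tensors point at the same memory address.
print(t.data_ptr() == v.data_ptr())  # True

# Mutating the view is visible through the original tensor.
v[0, 0] = 99
print(t[0].item())  # 99
```

Because no copy is made, views like this cost essentially no extra memory, which is exactly the property exploited below for memory optimization.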
In the next article I will write about the even more amazing tensor property of tracking ancestor operations, but here I will mostly focus on memory optimization.
Note: the new Vitis AI 1.2 release will be the first to support PyTorch. This article celebrates the addition of support for the popular framework with a Jupyter Notebook example dedicated entirely to PyTorch.