@ragulpr
Last active November 1, 2018 22:46
Memory mechanics in PyTorch: override variable names or make new ones?
import torch

# Note: @profile is injected by memory_profiler when the script is run via
# `python -m memory_profiler`; it is not imported here.

@profile
def keep():
    # A new name per step: every intermediate tensor stays referenced
    # until the function returns.
    x = torch.randn(10000, 100)
    y = x + 1
    z = y ** 2
    w = z ** 2
    return w

# keep()

@profile
def override():
    # Rebinding x drops the only reference to the previous tensor,
    # so its storage can be freed (and reused) immediately.
    x = torch.randn(10000, 100)
    x = x + 1
    x = x ** 2
    x = x ** 2
    return x

override()
[~/Prylar/diverse]$ python -m memory_profiler pytorch-memtest.py
Filename: pytorch-memtest.py
Line #    Mem usage    Increment   Line Contents
================================================
     2   66.578 MiB   66.578 MiB   @profile
     3                             def keep():
     4   70.504 MiB    3.926 MiB       x = torch.randn(10000,100)
     5   74.363 MiB    3.859 MiB       y = x+1
     6   78.219 MiB    3.855 MiB       z = y**2
     7   82.035 MiB    3.816 MiB       w = z**2
     8   82.035 MiB    0.000 MiB       return w
[~/Prylar/diverse]$ python -m memory_profiler pytorch-memtest.py
Filename: pytorch-memtest.py
Line #    Mem usage    Increment   Line Contents
================================================
    11   66.594 MiB   66.594 MiB   @profile
    12                             def override():
    13   70.520 MiB    3.926 MiB       x = torch.randn(10000,100)
    14   74.383 MiB    3.863 MiB       x = x+1
    15   74.422 MiB    0.039 MiB       x = x**2
    16   74.422 MiB    0.000 MiB       x = x**2
    17   74.422 MiB    0.000 MiB       return x
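The mechanic isn't PyTorch-specific: rebinding a name drops the last reference to the old object, so CPython can free it right away, whereas a fresh name per step keeps every intermediate alive until the function returns. A minimal sketch of the same effect using plain Python lists and the standard-library `tracemalloc` (so it runs without torch; `N` and the function names are illustrative):

```python
import tracemalloc

N = 100_000

def keep():
    # Each intermediate gets its own name, so all of them stay
    # referenced until return: peak memory grows with every step.
    x = [0.0] * N
    y = [v + 1 for v in x]
    z = [v ** 2 for v in y]
    return z

def override():
    # Rebinding x drops the last reference to the previous list,
    # so CPython frees it as soon as the assignment completes.
    x = [0.0] * N
    x = [v + 1 for v in x]
    x = [v ** 2 for v in x]
    return x

tracemalloc.start()
keep()
_, peak_keep = tracemalloc.get_traced_memory()
tracemalloc.reset_peak()  # Python 3.9+
override()
_, peak_override = tracemalloc.get_traced_memory()
tracemalloc.stop()

print(peak_override < peak_keep)  # override should peak lower
```

The same logic explains the profiler output above: `override` only pays for two live buffers at a time (the old tensor plus the one being computed), while `keep` accumulates one per line.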