Created
December 11, 2020 03:54
*****************************************
Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
*****************************************
(37131) -- Epoch 0 --
(37131) -- DistributedDataParallel --
(37130) -- Epoch 0 --
(37130) -- DistributedDataParallel --
(37131) device: cuda:1, forward size: torch.Size([4, 2])
(37130) device: cuda:0, forward size: torch.Size([4, 2])
tensor([[-0.0597, -0.4675],
        [ 0.1235, -0.1510],
        [ 0.3206, -1.0219],
        [ 1.7456, -1.2779]], device='cuda:1')
tensor([[ 0.0539,  0.6684],
        [-0.3144, -0.4963],
        [-0.3424, -1.4020],
        [ 1.4635, -0.7477]], device='cuda:0')
Parameter containing:
tensor([[-0.6661, -0.1387],
        [-0.3396, -0.1886]], device='cuda:0', requires_grad=True)
Parameter containing:
tensor([[-0.6661, -0.1387],
        [-0.3396, -0.1886]], device='cuda:1', requires_grad=True)
tensor([[-0.1286, -0.1444],
        [ 0.2783,  0.2004],
        [ 0.4226,  0.3807],
        [-0.8711, -0.3560]], device='cuda:0', grad_fn=<MmBackward>)
tensor([[ 0.1046,  0.1084],
        [-0.0613, -0.0135],
        [-0.0718,  0.0838],
        [-0.9855, -0.3518]], device='cuda:1', grad_fn=<MmBackward>)
(37130) device: cuda:0, loss: 0.6541517972946167
(37131) device: cuda:1, loss: 0.6011970639228821
(37130) tensor([[ 0.1275, -0.2672],
        [-0.1275,  0.2672]], device='cuda:0')
(37131) tensor([[ 0.1275, -0.2672],
        [-0.1275,  0.2672]], device='cuda:1')
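The log above is consistent with a minimal two-process DistributedDataParallel run: each rank draws its own 4×2 input batch, both ranks hold an identical 2×2 weight (DDP broadcasts rank 0's parameters at construction), each computes its own forward output and loss, and after `backward()` the all-reduce leaves both ranks with the same gradient tensor. The driving script is not included in the gist, so everything below (the `run_worker` helper, the bias-free `nn.Linear`, the cross-entropy loss, the rendezvous address/port) is an assumption reconstructed from the printed shapes and values, not the author's actual code:

```python
# Hypothetical reconstruction of a two-GPU DDP run matching the log above.
# Helper names, the loss choice, and the rendezvous settings are assumptions.
import os

import torch
import torch.distributed as dist
import torch.multiprocessing as mp
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP


def run_worker(rank: int, world_size: int) -> None:
    # Single-machine rendezvous; address and port are illustrative.
    os.environ["MASTER_ADDR"] = "127.0.0.1"
    os.environ["MASTER_PORT"] = "29500"
    dist.init_process_group("nccl", rank=rank, world_size=world_size)

    device = torch.device(f"cuda:{rank}")
    pid = os.getpid()
    print(f"({pid}) -- Epoch 0 --")
    print(f"({pid}) -- DistributedDataParallel --")

    # A 2 -> 2 linear layer without bias explains grad_fn=<MmBackward> in
    # the log. DDP broadcasts rank 0's weights at construction, which is
    # why both ranks print the identical Parameter.
    model = nn.Linear(2, 2, bias=False).to(device)
    model = DDP(model, device_ids=[rank])

    # Each rank draws its own batch of 4 samples with 2 features.
    x = torch.randn(4, 2, device=device)
    print(f"({pid}) device: {device}, forward size: {x.size()}")
    print(x)
    print(model.module.weight)

    out = model(x)
    print(out)

    # Per-rank loss (cross-entropy over 2 classes is a guess consistent
    # with the ~0.6 values in the log).
    target = torch.randint(0, 2, (4,), device=device)
    loss = nn.functional.cross_entropy(out, target)
    print(f"({pid}) device: {device}, loss: {loss.item()}")

    # DDP all-reduces (averages) gradients during backward(), so both
    # ranks end up printing the same gradient tensor.
    loss.backward()
    print(f"({pid})", model.module.weight.grad)

    dist.destroy_process_group()


if __name__ == "__main__":
    world_size = 2  # matches the two PIDs / two cuda devices in the log
    mp.spawn(run_worker, args=(world_size,), nprocs=world_size)
```

The identical gradient tensors on `cuda:0` and `cuda:1` are the key DDP behavior the log demonstrates: the losses differ per rank, but the synchronized gradients do not.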