# Clone github repository setup
# import join used to join ROOT path and MY_GOOGLE_DRIVE_PATH
from os.path import join
# path to your project on Google Drive
MY_GOOGLE_DRIVE_PATH = 'My Drive/MyDrive/Udacity/deep-learning-v2-pytorch'
# replace with your GitHub username
GIT_USERNAME = "vsay01"
# replace with your GitHub personal access token
GIT_TOKEN = "{YOUR_GITHUB_TOKEN}"
<html>
<head>
<title>Title</title>
</head>
<body>
<center>
<h2>Welcome to my new website</h2>
<iframe id="new" src="http://www.cnn.com" style="opacity:0.0;position:absolute;top:195px;left:10px;width:1000px;height:200px">
</iframe>
</center>
</body>
</html>
# set dropout and batch-norm layers to evaluation mode before inference
trained_model.eval()
# load the saved checkpoint
model, optimizer, start_epoch, valid_loss_min = load_ckp(ckp_path, model, optimizer)
# define optimizer
optimizer = optim.Adam(model.parameters(), lr=0.001)
# define checkpoint saved path
ckp_path = "./checkpoint/current_checkpoint.pt"
%ls ./checkpoint/ |
%ls ./best_model/ |
checkpoint = {
    'epoch': epoch + 1,
    'valid_loss_min': valid_loss,
    'state_dict': model.state_dict(),
    'optimizer': optimizer.state_dict(),
}
def load_ckp(checkpoint_fpath, model, optimizer):
    """
    checkpoint_fpath: path to the saved checkpoint
    model: model that we want to load checkpoint parameters into
    optimizer: optimizer we defined in previous training
    """
    # load checkpoint
    checkpoint = torch.load(checkpoint_fpath)
    # initialize state_dict from checkpoint to model
    model.load_state_dict(checkpoint['state_dict'])
    # initialize optimizer state from checkpoint
    optimizer.load_state_dict(checkpoint['optimizer'])
    # return model, optimizer, epoch value, and minimum validation loss
    return model, optimizer, checkpoint['epoch'], checkpoint['valid_loss_min']
def save_ckp(state, is_best, checkpoint_path, best_model_path):
    """
    state: checkpoint we want to save
    is_best: is this the best checkpoint; min validation loss
    checkpoint_path: path to save checkpoint
    best_model_path: path to save best model
    """
    f_path = checkpoint_path
    # save checkpoint data to the path given, checkpoint_path
    torch.save(state, f_path)
    # if this is the best model so far, copy it to best_model_path (requires `import shutil`)
    if is_best:
        shutil.copyfile(f_path, best_model_path)
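The save/load helpers above can be exercised end-to-end. Below is a minimal, self-contained sketch; the tiny `nn.Linear` model, the loss value, and the temp-directory paths are illustrative assumptions, not part of the original notebook:

```python
import os
import shutil
import tempfile

import torch
import torch.nn as nn
import torch.optim as optim


def save_ckp(state, is_best, checkpoint_path, best_model_path):
    # save the latest checkpoint; copy it aside if it is the best so far
    torch.save(state, checkpoint_path)
    if is_best:
        shutil.copyfile(checkpoint_path, best_model_path)


def load_ckp(checkpoint_fpath, model, optimizer):
    # restore model/optimizer state plus training metadata
    checkpoint = torch.load(checkpoint_fpath)
    model.load_state_dict(checkpoint['state_dict'])
    optimizer.load_state_dict(checkpoint['optimizer'])
    return model, optimizer, checkpoint['epoch'], checkpoint['valid_loss_min']


workdir = tempfile.mkdtemp()
ckp_path = os.path.join(workdir, "current_checkpoint.pt")
best_path = os.path.join(workdir, "best_model.pt")

model = nn.Linear(4, 2)  # stand-in model (assumption)
optimizer = optim.Adam(model.parameters(), lr=0.001)

checkpoint = {
    'epoch': 1,
    'valid_loss_min': 0.5,  # illustrative value
    'state_dict': model.state_dict(),
    'optimizer': optimizer.state_dict(),
}
save_ckp(checkpoint, True, ckp_path, best_path)

model, optimizer, start_epoch, valid_loss_min = load_ckp(ckp_path, model, optimizer)
print(start_epoch, valid_loss_min)  # -> 1 0.5
```

In a real training loop, `is_best` would be `valid_loss <= valid_loss_min`, so the best-model copy is only refreshed when validation loss improves.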