
Last active September 17, 2019 17:10
Transfer Learning using Keras
from keras import applications
from keras.preprocessing.image import ImageDataGenerator
from keras import optimizers
from keras.models import Sequential, Model
from keras.layers import Input, Conv2D, MaxPooling2D, Dropout, Flatten, Dense, GlobalAveragePooling2D
from keras import backend as k
from keras.callbacks import ModelCheckpoint, LearningRateScheduler, TensorBoard, EarlyStopping
img_width, img_height = 256, 256
### Build the network
img_input = Input(shape=(256, 256, 3))
x = Conv2D(64, (3, 3), activation='relu', padding='same', name='block1_conv1')(img_input)
x = Conv2D(64, (3, 3), activation='relu', padding='same', name='block1_conv2')(x)
x = MaxPooling2D((2, 2), strides=(2, 2), name='block1_pool')(x)
# Block 2
x = Conv2D(128, (3, 3), activation='relu', padding='same', name='block2_conv1')(x)
x = Conv2D(128, (3, 3), activation='relu', padding='same', name='block2_conv2')(x)
x = MaxPooling2D((2, 2), strides=(2, 2), name='block2_pool')(x)
model = Model(inputs=img_input, outputs=x)
Layer (type) Output Shape Param #
input_1 (InputLayer) (None, 256, 256, 3) 0
block1_conv1 (Conv2D) (None, 256, 256, 64) 1792
block1_conv2 (Conv2D) (None, 256, 256, 64) 36928
block1_pool (MaxPooling2D) (None, 128, 128, 64) 0
block2_conv1 (Conv2D) (None, 128, 128, 128) 73856
block2_conv2 (Conv2D) (None, 128, 128, 128) 147584
block2_pool (MaxPooling2D) (None, 64, 64, 128) 0
Total params: 260,160.0
Trainable params: 260,160.0
Non-trainable params: 0.0
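The parameter counts in the summary above can be checked by hand: a Conv2D layer with a k_h x k_w kernel, c_in input channels and c_out filters has (k_h * k_w * c_in + 1) * c_out parameters (the +1 is the per-filter bias), and pooling layers have none.

```python
def conv2d_params(kh, kw, c_in, c_out):
    # kh*kw*c_in weights per filter, plus one bias per filter
    return (kh * kw * c_in + 1) * c_out

block1 = conv2d_params(3, 3, 3, 64) + conv2d_params(3, 3, 64, 64)      # 1792 + 36928
block2 = conv2d_params(3, 3, 64, 128) + conv2d_params(3, 3, 128, 128)  # 73856 + 147584
print(block1 + block2)  # 260160, matching "Total params" above
```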
layer_dict = dict([(layer.name, layer) for layer in model.layers])
[layer.name for layer in model.layers]
import h5py
weights_path = 'vgg19_weights.h5'  # path to the pretrained VGG19 weights file
f = h5py.File(weights_path, 'r')
# list all the layer names which are in the model.
layer_names = [layer.name for layer in model.layers]
# Here we are extracting model_weights for each and every layer from the .h5 file
>>> f["model_weights"]["block1_conv1"].attrs["weight_names"]
array([b'block1_conv1/kernel:0', b'block1_conv1/bias:0'], dtype='|S21')
# we are assiging this array to weight_names below
>>> f["model_weights"]["block1_conv1"]["block1_conv1/kernel:0"]
<HDF5 dataset "kernel:0": shape (3, 3, 3, 64), type "<f4">
# The list comprehension below collects the kernel and bias of the layer into `weights`
>>> weight_names = f["model_weights"]["block1_conv1"].attrs["weight_names"]
>>> weights = [f["model_weights"]["block1_conv1"][j] for j in weight_names]
>>> model.layers[1].set_weights(weights)
# This will set the weights for that particular layer.
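To see how this layout fits together without downloading the real VGG19 file, here is a minimal sketch that writes a throwaway .h5 file with the same model_weights/<layer>/<weight name> structure and reads it back. The file name and the zero-filled arrays are illustrative only, not the real pretrained weights.

```python
import h5py
import numpy as np

# Write a toy file mimicking the layout of a full-model Keras .h5 save.
with h5py.File("toy_weights.h5", "w") as f:
    g = f.create_group("model_weights/block1_conv1")
    g.create_dataset("block1_conv1/kernel:0", data=np.zeros((3, 3, 3, 64), np.float32))
    g.create_dataset("block1_conv1/bias:0", data=np.zeros(64, np.float32))
    g.attrs["weight_names"] = np.array([b"block1_conv1/kernel:0",
                                        b"block1_conv1/bias:0"])

# Read it back the same way the gist reads the real file.
with h5py.File("toy_weights.h5", "r") as f:
    grp = f["model_weights"]["block1_conv1"]
    weight_names = grp.attrs["weight_names"]
    weights = [grp[n][()] for n in weight_names]  # [()] reads into numpy arrays

print([w.shape for w in weights])  # [(3, 3, 3, 64), (64,)]
```

The `weight_names` attribute fixes the order (kernel first, then bias), which is exactly the order `set_weights` expects.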
With a for loop we can set the weights for the entire network:
for i in layer_dict.keys():
    weight_names = f["model_weights"][i].attrs["weight_names"]
    weights = [f["model_weights"][i][j] for j in weight_names]
    index = layer_names.index(i)
    model.layers[index].set_weights(weights)
import cv2
import numpy as np
import pandas as pd
from tqdm import tqdm
import itertools
import glob
files_location = glob.glob("images/*.jpg")  # adjust the pattern to your image folder
features = []
for i in tqdm(files_location):
    im = cv2.imread(i)
    im = cv2.resize(cv2.cvtColor(im, cv2.COLOR_BGR2RGB), (256, 256)).astype(np.float32) / 255.0
    im = np.expand_dims(im, axis=0)
    outcome = model_final.predict(im)
    features.append(outcome)
## collect these features, create a dataframe, and train a classifier on top of it.
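As a sketch of that last step, here is a shallow classifier trained on top of extracted features. Synthetic random vectors stand in for the model_final.predict outputs, and a simple nearest-centroid rule stands in for whichever classifier you actually pick (logistic regression, SVM, etc.).

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for pooled CNN features: 40 "images", 128-d vectors, 2 classes.
feats = np.vstack([rng.normal(0.0, 1.0, (20, 128)),
                   rng.normal(2.0, 1.0, (20, 128))])
labels = np.array([0] * 20 + [1] * 20)

# Nearest-centroid classifier: one mean feature vector per class.
centroids = np.stack([feats[labels == c].mean(axis=0) for c in (0, 1)])

def predict(x):
    # assign to the class whose centroid is closest in Euclidean distance
    d = np.linalg.norm(centroids - x, axis=1)
    return int(np.argmin(d))

preds = np.array([predict(v) for v in feats])
print((preds == labels).mean())  # training accuracy; near 1.0 for well-separated classes
```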


At this line I get an error.

KeyError: "Unable to open object (Object 'model_weights' doesn't exist)"
I am sure the path is correctly set.


ghost commented Jul 3, 2017

I too faced the same issue, and it is a strange error. I checked the complete Keras code: 'model_weights' is defined, but it is wrapped up inside the load_weights() function. Hence I used load_weights() directly instead of h5py.File(weights_path), and the code below worked for me.

Working code:

import h5py
weights_path = '/home/ubuntu/<<model>>_weights_tf_dim_ordering_tf_kernels.h5'
model.load_weights(weights_path, by_name=True)
for layer in model.layers:
    weights = layer.get_weights()
    print("[INFO] Layer config: " + str(layer.get_config()) + ", weights: " + str(weights))
print("[INFO] The total number of layers is : " + str(len(model.layers)))
for i in layer_dict.keys():
    index = a.index(i)

where a = [layer.name for layer in model.layers]


@H-Cognitum Thanks for the workaround, it saved me a lot of time.


@H-Cognitum Thanks 👍


yustiks commented Aug 2, 2018

@H-Cognitum I tried to implement the solution, but how do I write things for a?
It is not clear to me how to work with this: 'where a = [ for layer in model.layers]'


jaelim commented Dec 5, 2018

I think it's because 'model_weights' doesn't exist in your .h5 file. You can do a simple check with:
for key in f.keys(): print(key)
So, either load .h5 with load_weights() or work around to ignore model_weights.
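A minimal illustration of that check, using two throwaway files that mimic the two layouts (group names are illustrative, not real Keras save files): a full-model save nests everything under 'model_weights', while a weights-only save puts the layer groups at the top level.

```python
import h5py

# Toy file mimicking a full-model save (layers nested under 'model_weights').
with h5py.File("full_model_style.h5", "w") as f:
    f.create_group("model_weights/block1_conv1")

# Toy file mimicking a weights-only save (layer groups at the top level).
with h5py.File("weights_only_style.h5", "w") as f:
    f.create_group("block1_conv1")

with h5py.File("full_model_style.h5", "r") as f:
    full_keys = list(f.keys())
with h5py.File("weights_only_style.h5", "r") as f:
    wo_keys = list(f.keys())

print(full_keys)  # ['model_weights']
print(wo_keys)    # ['block1_conv1']
```

If the key printout shows layer names rather than 'model_weights', the h5py indexing in the gist will fail exactly as reported, and load_weights() is the simpler route.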
