
@javedqadruddin
Created November 18, 2016 03:58
double 'connected to' problem
@markddesimone

Hi, I am experiencing the same problem. I have seen the discussion between you and Jeremy on the forums:

Jeremy: You didn't pop the last layer and replace it with one with the correct number of outputs, prior to creating conv_model. I'm not sure why it is happening - but my guess is that that's the cause. Keras often gets confused when copying layers between models - recently I've started writing code that instead copies the layer config and weights separately.

You: Got it working. Copying the config and the weights separately did the trick. Thanks!

However, I don't understand what is meant here. How do I copy the config? Do I need to create the model from scratch, i.e. add all the layers in Keras? If so, how would I handle vgg_preprocess?
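To make my question concrete, here's my guess at what "copying the config and the weights separately" could look like — a minimal sketch with a toy Dense model standing in for VGG (I don't know what your actual code was, so this is an assumption about the pattern, not your solution):

```python
import numpy as np
from keras import Input, Sequential, layers

# Toy stand-in for the VGG model in the discussion
model = Sequential([Input(shape=(8,)),
                    layers.Dense(4, activation='relu'),
                    layers.Dense(2, activation='softmax')])

# Rebuild the conv part layer by layer: get_config() gives the
# architecture, get_weights()/set_weights() copies the parameters,
# so no Layer objects are shared between the two models.
conv_model = Sequential([Input(shape=(8,))])
for layer in model.layers[:-1]:  # everything up to (not incl.) the last layer
    new_layer = layer.__class__.from_config(layer.get_config())
    conv_model.add(new_layer)    # adding builds the layer, so weights exist
    new_layer.set_weights(layer.get_weights())
```

Is that the idea — fresh layer objects from the config, then the weight arrays copied across afterwards?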

I have tried simply using the existing model and deleting all layers after the last conv layer, i.e.

num_del = len(model.layers) - last_conv_idx - 1
for i in range(num_del):
    model.pop()

(Note: I create the FC model first so I don't lose the FC weights.) This seems to produce the correct conv model, as the weights are intact. But I'm very interested to know what your/Jeremy's solution is.
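For anyone else reading along, the save-the-FC-weights-before-popping pattern I'm describing looks roughly like this (toy Dense model again; last_conv_idx is just a hypothetical index into model.layers, following the course notebooks' naming):

```python
from keras import Input, Sequential, layers

# Toy stand-in: two "conv-side" layers followed by two FC layers
model = Sequential([Input(shape=(8,)),
                    layers.Dense(16, activation='relu'),
                    layers.Dense(16, activation='relu'),
                    layers.Dense(8, activation='relu'),
                    layers.Dense(2, activation='softmax')])
last_conv_idx = 1  # hypothetical: index of the last conv-side layer

# Grab the FC weights BEFORE popping, so they aren't lost
fc_weights = [layer.get_weights()
              for layer in model.layers[last_conv_idx + 1:]]

# Then pop everything after the last conv-side layer
num_del = len(model.layers) - last_conv_idx - 1
for _ in range(num_del):
    model.pop()
```

After this, model is the truncated conv model with its original weights intact, and fc_weights can be loaded into a separately built FC model.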

Thanks for any help
