@ilyakava
Created September 30, 2015 13:00
Cannot copy param 1 weights from layer 'conv1'; shape mismatch. Source param shape is 1 1 1 96 (96); target param shape is 96 (96).
import os

import numpy as np
import caffe

MODEL_FILE = '../val.prototxt'
PRETRAINED = '../food_alexnet_train_iter_25000.caffemodel'
IMAGE_MEAN = '../imagenet_mean.binaryproto'
# expanduser so the '~' in the path resolves to the home directory
INPUT_IMAGE = os.path.expanduser('~/code/fundus/data/train/cent_crop_227/1000016.png')

# Fails with the shape-mismatch error above while copying the trained weights
net = caffe.Classifier(MODEL_FILE, PRETRAINED, image_dims=(256, 256))
# net = caffe.Classifier(MODEL_FILE, PRETRAINED, image_dims=(227, 227))
# Loading the mean image and passing it in (with mean=) leads to the same error.
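The error reports that param 1 of 'conv1' is stored in the .caffemodel as a legacy 4-D blob of shape (1, 1, 1, 96), while the layer in val.prototxt declares a 1-D blob of shape (96,). The element counts match; only the declared shapes differ, which is what Caffe's weight-copying step rejects. A minimal NumPy sketch of that check (the shapes are taken from the error message; the variable names are illustrative, not Caffe's):

```python
import numpy as np

# Shape of param 1 of 'conv1' as stored in the .caffemodel (legacy 4-D layout)
source = np.zeros((1, 1, 1, 96))
# Shape the target layer in val.prototxt declares for the same param
target_shape = (96,)

# The copy is rejected because the shapes differ as tuples,
# even though the total element counts are identical.
assert source.shape != target_shape
assert source.size == np.prod(target_shape)

# Flattening the legacy 4-D blob yields the shape the target expects.
flattened = source.reshape(target_shape)
assert flattened.shape == target_shape
```

This suggests the prototxt and the caffemodel come from Caffe versions with different blob-shape conventions; regenerating the caffemodel with the same Caffe build that reads the prototxt (or editing the blob shapes to match) would be one way to reconcile them.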
@csandmann

I have stumbled upon the same problem: I trained a network using DIGITS and tried to deploy it with the happynear build on Windows. For some reason, initialization of the BatchNorm layer fails with the exact same error (the only difference: 64 parameters instead of 96). Did you by any chance resolve this issue?

@lin1000

lin1000 commented Dec 24, 2017

Hi check0104, I am facing the same difficulty. Did you resolve it? I need your insight.
