
@chrisdinant
Created March 14, 2018 08:54
from keras.layers import (Input, Conv2D, BatchNormalization, Activation,
                          MaxPooling2D, Add, GlobalAveragePooling2D, Dense)

filters = [32, 64, 128]
input_img = Input(shape=(61, 75, 1))

def block(filters, inp):
    # Pre-activation residual block: BN -> ReLU -> Conv, applied twice.
    layer_1 = BatchNormalization()(inp)
    act_1 = Activation('relu')(layer_1)
    conv_1 = Conv2D(filters, (3, 3), padding='same')(act_1)
    layer_2 = BatchNormalization()(conv_1)
    act_2 = Activation('relu')(layer_2)
    conv_2 = Conv2D(filters, (3, 3), padding='same')(act_2)
    return conv_2
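The `Add()` merges below require the block's output to have the same shape as its input, which holds here because every conv in `block` uses `padding='same'` and the same filter count as the incoming tensor. A minimal NumPy sketch of that skip-connection arithmetic (a toy stand-in, not the Keras graph):

```python
import numpy as np

def toy_block(x):
    # Stand-in for block(): any shape-preserving transform;
    # here a fixed elementwise function instead of BN/ReLU/Conv.
    return 0.1 * x

x = np.ones((4, 4, 32))      # feature map with matching channels
y = toy_block(x) + x         # what Add()([block(f, x), x]) computes
assert y.shape == x.shape
```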
x = Conv2D(filters[0], (3, 3), padding='same')(input_img)
y = MaxPooling2D(padding='same')(x)

# Three residual blocks at 32 filters.
x = Add()([block(filters[0], y), y])
y = Add()([block(filters[0], x), x])
x = Add()([block(filters[0], y), y])

# Downsample and widen to 64 filters.
x = Conv2D(filters[1], (3, 3), strides=(2, 2), padding='same',
           activation='relu')(x)
y = Add()([block(filters[1], x), x])
x = Add()([block(filters[1], y), y])
y = Add()([block(filters[1], x), x])

# Downsample and widen to 128 filters.
y = Conv2D(filters[2], (3, 3), strides=(2, 2), padding='same',
           activation='relu')(y)
x = Add()([block(filters[2], y), y])
y = Add()([block(filters[2], x), x])
x = Add()([block(filters[2], y), y])

x2 = GlobalAveragePooling2D()(x)
x2 = Dense(len(classes), activation='softmax')(x2)  # `classes` is assumed defined elsewhere
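As a quick sanity check on the spatial dimensions (a sketch, not part of the gist): with `padding='same'`, each stride-2 layer produces an output of size `ceil(n / stride)` regardless of kernel size, so the 61x75 input shrinks through the pooling layer and the two stride-2 convolutions as follows:

```python
import math

def same_out(n, stride):
    # With padding='same', output size is ceil(n / stride),
    # independent of kernel size.
    return math.ceil(n / stride)

h, w = 61, 75                          # input_img shape (61, 75, 1)
h, w = same_out(h, 2), same_out(w, 2)  # MaxPooling2D            -> (31, 38)
h, w = same_out(h, 2), same_out(w, 2)  # stride-2 Conv2D, 64 ch  -> (16, 19)
h, w = same_out(h, 2), same_out(w, 2)  # stride-2 Conv2D, 128 ch -> (8, 10)
print(h, w)  # 8 10
```

GlobalAveragePooling2D then collapses the final 8x10x128 feature map to a 128-dimensional vector before the softmax.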