@manashmandal
Created February 18, 2018 15:37
from keras.models import Sequential
from keras.layers import Dense, Dropout, Flatten
from keras.layers import Conv2D, MaxPooling2D, Activation
batch_size = 100
num_classes = 5
epochs = 50
# input image dimensions
img_rows, img_cols = 100, 100
channels = 3
model = Sequential()
model.add(Conv2D(32, kernel_size=(3, 3), padding='same', input_shape=(img_rows, img_cols, channels)))
model.add(Activation('relu'))
model.add(Conv2D(32, (3, 3)))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
model.add(Conv2D(64, (3, 3), padding='same'))
model.add(Activation('relu'))
model.add(Conv2D(64, (3, 3)))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(512))
model.add(Activation('relu'))
model.add(Dropout(0.5))
model.add(Dense(num_classes))
model.add(Activation('sigmoid'))
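# --- Note: multitask_loss is used below but never defined in the original gist.
# A minimal stand-in (an assumption, not the author's code): treat each of the
# num_classes sigmoid outputs as an independent binary label and average the
# per-label binary cross-entropies over the classes.
from keras import backend as K

def multitask_loss(y_true, y_pred):
    # Per-sample mean of the per-label binary cross-entropies.
    return K.mean(K.binary_crossentropy(y_true, y_pred), axis=-1)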
model.compile(loss=multitask_loss,
              optimizer='adam',
              metrics=['accuracy'])
# x_train, y_train, x_test, y_test are not created in this gist;
# see the shape sketch below for what the model expects.
model.fit(x_train, y_train,
          batch_size=batch_size,
          epochs=epochs,
          verbose=1,
          validation_data=(x_test, y_test))
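The training and test arrays are not prepared in the gist; they are assumed to exist already. A purely illustrative sketch of the shapes this model expects, using hypothetical random placeholder arrays:

import numpy as np

# Hypothetical placeholders, only to illustrate shapes: images of shape
# (samples, 100, 100, 3) scaled to [0, 1] and multi-hot label vectors of
# length num_classes.
x_train = np.random.rand(800, img_rows, img_cols, channels).astype('float32')
y_train = np.random.randint(0, 2, size=(800, num_classes)).astype('float32')
x_test = np.random.rand(200, img_rows, img_cols, channels).astype('float32')
y_test = np.random.randint(0, 2, size=(200, num_classes)).astype('float32')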
alyato commented Apr 16, 2018

Thanks. But I'm still confused about the loss.
Can the loss function be replaced by binary_crossentropy?
If yes, what is the difference between the multi-task loss and binary_crossentropy?
Thanks.
