@uhmseohun
Created April 6, 2020 08:00
Simple MNIST example
import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Flatten, Dense

# Standalone softmax helper (the Keras model below applies softmax through
# its output layer instead). Subtracting the max keeps exp() from overflowing.
def softmax(x):
    e = np.exp(x - np.max(x))
    return e / np.sum(e, axis=0)

# Load MNIST and scale pixel values from [0, 255] to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
gray_scale = 255.0
x_train /= gray_scale
x_test /= gray_scale

# Flatten each 28x28 image, pass it through two hidden layers, and emit
# a probability distribution over the 10 digit classes.
model = Sequential([
    Flatten(input_shape=(28, 28)),
    Dense(256, activation='sigmoid'),
    Dense(128, activation='sigmoid'),
    Dense(10, activation='softmax')
])

# Integer labels (0-9), so use sparse categorical cross-entropy.
model.compile(
    optimizer='adam',
    loss='sparse_categorical_crossentropy',
    metrics=['accuracy']
)

model.fit(
    x_train, y_train,
    epochs=15,
    batch_size=2000,
    validation_split=0.2
)

# evaluate() returns [loss, accuracy]; loss is not a percentage.
test_loss, test_acc = model.evaluate(x_test, y_test, verbose=0)
print('Test loss: %.4f' % test_loss)
print('Test accuracy: %.2f%%' % (test_acc * 100))
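The standalone softmax helper defined above mirrors what the model's output layer computes. A minimal sketch of it in isolation, using hypothetical example logits:

```python
import numpy as np

def softmax(x):
    # Subtract the max before exponentiating for numerical stability.
    e = np.exp(x - np.max(x))
    return e / np.sum(e, axis=0)

logits = np.array([2.0, 1.0, 0.1])  # example scores for three classes
probs = softmax(logits)
print(probs)        # probabilities, largest for the largest logit
print(probs.sum())  # the distribution sums to 1.0
```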