GitHub gists by Krunal Kapadiya (krunal3kapadiya)
krunal3kapadiya / gist:6ae2db7cdf3ef9f94a7248d670d4b3b5
Created May 30, 2020 05:38 — forked from rxaviers/gist:7360908
Complete list of GitHub markdown emoji markup

People

:bowtie: :bowtie: 😄 :smile: 😆 :laughing:
😊 :blush: 😃 :smiley: ☺️ :relaxed:
😏 :smirk: 😍 :heart_eyes: 😘 :kissing_heart:
😚 :kissing_closed_eyes: 😳 :flushed: 😌 :relieved:
😆 :satisfied: 😁 :grin: 😉 :wink:
😜 :stuck_out_tongue_winking_eye: 😝 :stuck_out_tongue_closed_eyes: 😀 :grinning:
😗 :kissing: 😙 :kissing_smiling_eyes: 😛 :stuck_out_tongue:
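Each of these codes works anywhere GitHub renders markdown (issues, pull requests, comments, READMEs): typing :smile: produces 😄. A few, such as :bowtie:, are GitHub-only custom images, which is why they have no plain-Unicode emoji shown next to them above.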
workflow "Branch notification" {
on = "push"
resolves = ["Add Comment"]
}
action "Jira Login" {
uses = "atlassian/krunal3kapadiya-login@v1.0.0"
secrets = ["JIRA_API_TOKEN", "krunal3kapadiya@gmail.com", "https://krunal3kapadiya.atlassian.net/"]
}
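For context, this is the original HCL (main.workflow) syntax GitHub Actions used before the YAML format. on = "push" runs the workflow on every push, and resolves names the final action to execute; GitHub pulls in that action's dependencies through each action's needs attribute. So for this snippet to be complete, an "Add Comment" action would have to be defined in the same file, typically declaring needs = ["Jira Login"].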
krunal3kapadiya / automobile.ipynb
Created February 26, 2018 10:18 — forked from martinwicke/automobile.ipynb
Estimator demo using Automobile dataset
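The notebook preview is not available in this export. As a rough sketch of what an Estimator demo on a tabular automobile dataset typically involves (the file path and column names below are hypothetical, not taken from the notebook), the TF 1.x flow is: build feature columns, wrap the data in an input function, then train a canned estimator such as LinearRegressor.

import pandas as pd
import tensorflow as tf

# Hypothetical local copy of the Automobile dataset with illustrative column names
df = pd.read_csv("automobile.csv")

# Describe the numeric input features to the estimator
feature_columns = [
    tf.feature_column.numeric_column("horsepower"),
    tf.feature_column.numeric_column("curb_weight"),
]

# Wrap the pandas data in an input function that batches and shuffles it
input_fn = tf.estimator.inputs.pandas_input_fn(
    x=df[["horsepower", "curb_weight"]],
    y=df["price"],
    batch_size=32,
    num_epochs=None,
    shuffle=True,
)

# Train a canned linear regression estimator
model = tf.estimator.LinearRegressor(feature_columns=feature_columns)
model.train(input_fn=input_fn, steps=1000)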
print("Optimization Finished!")
training_cost = sess.run(cost, feed_dict={X: train_X, Y: train_Y})
print("Training cost=", training_cost, "W=", sess.run(W), "b=", sess.run(b), '\n')
# Graphic display
plt.plot(train_X, train_Y, 'ro', label='Original data')
plt.plot(train_X, sess.run(W) * train_X + sess.run(b), label='Fitted line')
plt.legend()
plt.show()
# Start training
with tf.Session() as sess:
# Run the initializer
sess.run(init)
# Fit all training data
for epoch in range(training_epochs):
for (x, y) in zip(train_X, train_Y):
sess.run(optimizer, feed_dict={X: x, Y: y})
# Initialize the variables (i.e. assign their default value)
init = tf.global_variables_initializer()
# Gradient descent
# Note, minimize() knows to modify W and b because Variable objects are trainable=True by default
optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
# Mean squared error
cost = tf.reduce_sum(tf.pow(pred-Y, 2))/(2*n_samples)
# Construct a linear model
pred = tf.add(tf.multiply(X, W), b)
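Note that the training loop feeds one (x, y) pair per optimizer step (plain stochastic gradient descent); the full training set is passed in a single feed_dict only once at the end, to evaluate the final cost and draw the fitted line.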