Alex Egg (eggie5)
💭 Working on TensorFlow Ranking

@fchollet
fchollet / classifier_from_little_data_script_3.py
Last active September 13, 2023 03:34
Fine-tuning a Keras model. Updated to the Keras 2.0 API.
'''This script goes along the blog post
"Building powerful image classification models using very little data"
from blog.keras.io.
It uses data that can be downloaded at:
https://www.kaggle.com/c/dogs-vs-cats/data
In our setup, we:
- created a data/ folder
- created train/ and validation/ subfolders inside data/
- created cats/ and dogs/ subfolders inside train/ and validation/
- put the cat pictures index 0-999 in data/train/cats
- put the cat pictures index 1000-1400 in data/validation/cats
- put the dog pictures index 0-999 in data/train/dogs
- put the dog pictures index 1000-1400 in data/validation/dogs
So that we have 1000 training examples for each class, and 400 validation
examples for each class.
'''
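
The preview cuts off before the model code. The core fine-tuning pattern the script implements looks roughly like this; a sketch, assuming the Keras 2 API and the 150x150 input size from the blog post, with illustrative layer indices:

```
from keras.applications.vgg16 import VGG16
from keras.models import Model
from keras.layers import Flatten, Dense, Dropout
from keras.optimizers import SGD

# VGG16 convolutional base with a small binary-classification head on top
base = VGG16(weights='imagenet', include_top=False, input_shape=(150, 150, 3))
x = Flatten()(base.output)
x = Dense(256, activation='relu')(x)
x = Dropout(0.5)(x)
out = Dense(1, activation='sigmoid')(x)
model = Model(base.input, out)

# Freeze everything below the last conv block so only it and the head train
for layer in model.layers[:15]:
    layer.trainable = False

# Fine-tune with a low learning rate to avoid wrecking the pretrained weights
model.compile(optimizer=SGD(lr=1e-4, momentum=0.9),
              loss='binary_crossentropy',
              metrics=['accuracy'])
```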

FWIW: I (@rondy) am not the creator of the content shared here, which is an excerpt from Edmond Lau's book. I simply copied and pasted it from another location and saved it as a personal note before it gained popularity on news.ycombinator.com. Unfortunately, I cannot recall the exact origin of the original source, nor was I able to find the author's name, so I can't provide the appropriate credits.


Effective Engineer - Notes

What's an Effective Engineer?

@shagunsodhani
shagunsodhani / Learning to Generate Reviews and Discovering Sentiment.md
Last active January 30, 2020 22:27
Notes for "Learning to Generate Reviews and Discovering Sentiment" paper

Learning to Generate Reviews and Discovering Sentiment

Summary

The authors train a character-level RNN (using mLSTM units) on Amazon Product Reviews (82 million reviews) and use the char-RNN as a feature extractor for sentiment analysis. These unsupervised features beat state-of-the-art results on that dataset, while being outperformed by supervised approaches on other datasets. The most important observation is that the authors find a single neuron (the "sentiment neuron") which alone achieves a test accuracy of 92.3%, suggesting that the sentiment concept has been captured in that single neuron. Switching this neuron on (or off) during the generative process produces positive (or negative) reviews.
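
A rough sketch of that evaluation setup (the `char_rnn` object, its `final_hidden_state` method, and `train_texts`/`train_labels` are hypothetical stand-ins for the trained mLSTM and the labeled corpus):

```
import numpy as np
from sklearn.linear_model import LogisticRegression

def extract_features(texts, char_rnn):
    # Read each review byte-by-byte; the final hidden state is the feature vector
    return np.stack([char_rnn.final_hidden_state(t) for t in texts])

X_train = extract_features(train_texts, char_rnn)
clf = LogisticRegression(penalty='l1', solver='liblinear')
clf.fit(X_train, train_labels)

# The "sentiment neuron" is the single hidden unit carrying most of the signal
sentiment_unit = int(np.argmax(np.abs(clf.coef_)))
```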

Notes

  • The paper aims to evaluate whether the low-level features captured by a char-RNN can support the learning of high-level representations.
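
The generative intervention mentioned in the summary, sketched under the same assumptions (the mLSTM's `zero_state`/`step` API and the `sample_from_logits` helper are hypothetical):

```
def sample_review(char_rnn, sentiment_unit, value, length=200):
    # Clamp the sentiment unit of the hidden state at every step while sampling
    h = char_rnn.zero_state()
    text, byte = '', ord('\n')
    for _ in range(length):
        h, logits = char_rnn.step(h, byte)
        h[sentiment_unit] = value          # e.g. +1.0 -> positive, -1.0 -> negative
        byte = sample_from_logits(logits)  # hypothetical sampling helper
        text += chr(byte)
    return text
```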
@omoindrot
omoindrot / tensorflow_finetune.py
Last active February 25, 2024 15:00
Example TensorFlow script for fine-tuning a VGG model (uses tf.contrib.data)
"""
Example TensorFlow script for finetuning a VGG model on your own data.
Uses tf.contrib.data module which is in release v1.2
Based on PyTorch example from Justin Johnson
(https://gist.github.com/jcjohnson/6e41e8512c17eae5da50aebef3378a4c)
Required packages: tensorflow (v1.2)
Download the weights trained on ImageNet for VGG:
```
wget http://download.tensorflow.org/models/vgg_16_2016_08_28.tar.gz
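
For reference, the input-pipeline pattern the script builds with tf.contrib.data looks roughly like this (a sketch for TF 1.2; the file paths, labels, and sizes are illustrative):

```
import tensorflow as tf

filenames = tf.constant(['data/img0.jpg', 'data/img1.jpg'])  # illustrative paths
labels = tf.constant([0, 1])

def _parse_function(filename, label):
    # Decode a JPEG and resize it to the VGG input resolution
    image = tf.image.decode_jpeg(tf.read_file(filename), channels=3)
    image = tf.image.resize_images(image, [224, 224])
    return image, label

dataset = tf.contrib.data.Dataset.from_tensor_slices((filenames, labels))
dataset = dataset.map(_parse_function)
dataset = dataset.shuffle(buffer_size=100).batch(32)
iterator = dataset.make_initializable_iterator()
images, batch_labels = iterator.get_next()
```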
@ed-alertedh
ed-alertedh / validate_tfrecords.py
Last active April 16, 2024 18:57
Utility functions to check for corruption in tfrecord files
import tensorflow as tf
def validate_dataset(filenames, reader_opts=None):
    """
    Attempt to iterate over every record in the supplied iterable of TFRecord filenames
    :param filenames: iterable of filenames to read
    :param reader_opts: (optional) tf.python_io.TFRecordOptions to use when constructing the record iterator
    """
    i = 0
    for fname in filenames:
        # tf_record_iterator raises once it hits a corrupt or truncated record
        record_iterator = tf.python_io.tf_record_iterator(path=fname, options=reader_opts)
        try:
            for _ in record_iterator:
                i += 1
        except Exception as e:
            print('error in {} at record {}: {}'.format(fname, i, e))
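
Usage is just a call over the shard paths (paths here are illustrative):

```
import glob

# Scan every training shard and report the file/record where reading fails
validate_dataset(glob.glob('data/train-*.tfrecord'))
```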
@lioutasb
lioutasb / mrr_metric.py
Created July 26, 2018 23:09
Tensorflow implementation of Mean Reciprocal Rank (mrr) metric compatible with tf.Estimator
import tensorflow as tf
def mrr_metric(labels, predictions, weights=None,
               metrics_collections=None,
               updates_collections=None,
               name=None):
    # Minimal completion of the truncated preview (extra args kept for API compatibility)
    with tf.name_scope(name, 'mrr_metric', [predictions, labels, weights]):
        # 1-based rank of every item when predictions are sorted descending
        rank = tf.argsort(tf.argsort(predictions, direction='DESCENDING'), axis=-1) + 1
        # mean reciprocal rank over the relevant items (labels == 1)
        rel_rank = tf.boolean_mask(rank, tf.cast(labels, tf.bool))
        return tf.metrics.mean(1.0 / tf.cast(rel_rank, tf.float32))
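
Inside an Estimator's model_fn, the metric plugs into eval_metric_ops like any tf.metrics function (a hypothetical usage sketch):

```
# In model_fn, for EstimatorSpec(mode=tf.estimator.ModeKeys.EVAL, ...):
eval_metric_ops = {'mrr': mrr_metric(labels, predictions)}
```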
@peter0749
peter0749 / AttentionLoss.py
Created August 18, 2018 11:23
Keras Layer/Function of Learning a Deep Listwise Context Model for Ranking Refinement
import tensorflow as tf
from keras import backend as K

def att_loss(y_true, y_pred):
    def att_(x):
        # Masked softmax: exponentiate only the positive scores, zero out the rest
        a = tf.where(x > 0, K.exp(x), K.zeros_like(x))
        return a / (K.sum(a, axis=-1, keepdims=True) + K.epsilon())
    # Turn both the true and predicted score lists into attention distributions
    y_true_a = att_(y_true)
    y_pred_a = att_(y_pred)
    # Cross-entropy between the two distributions, averaged over the list
    loss = K.mean(K.binary_crossentropy(y_true_a, y_pred_a), axis=-1)
    return loss
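
A hypothetical usage: compile a listwise ranking model whose output is one score per document in the list with this loss:

```
# model outputs shape (batch, list_size): one relevance score per document
model.compile(optimizer='adam', loss=att_loss)
```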