from gensim.models import Word2Vec

# train a skip-gram (word2vec) model on the random walks
# random_walks: an iterable of walks, each a list of node-ID strings, built earlier
model = Word2Vec(window=4, sg=1, hs=0,
                 negative=10,               # negative sampling with 10 noise words
                 alpha=0.03, min_alpha=0.0007,
                 seed=14)
model.build_vocab(random_walks, progress_per=2)
model.train(random_walks, total_examples=model.corpus_count,
            epochs=20, report_delay=1)