DeepNLP

  1. Intro

    • backprop,
    • theano,
    • mnist classification;
  2. Embeddings

    • Text classification: bag of words, TF-IDF;
    • NLTK: lemmatization, stemming;
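The TF-IDF weighting mentioned in the Embeddings topic can be sketched from scratch (a minimal illustration, not the course's own pipeline; in practice scikit-learn's TfidfVectorizer does this):

```python
import math
from collections import Counter

def tfidf(docs):
    """Compute TF-IDF weights for a list of tokenized documents."""
    n = len(docs)
    # document frequency: in how many documents each term appears
    df = Counter(t for doc in docs for t in set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        total = len(doc)
        # term frequency scaled by inverse document frequency
        weights.append({t: (c / total) * math.log(n / df[t])
                        for t, c in tf.items()})
    return weights

docs = [["cat", "sat"], ["cat", "ran"], ["dog", "ran"]]
w = tfidf(docs)
# "cat" occurs in 2 of 3 docs, so its weight in doc 0 is (1/2) * log(3/2)
```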
#include <iostream>
#include <algorithm>
#include <stack>
#include <stdexcept>
#include <map>
#include <sstream>
#include <utility>
#include <climits>
using namespace std;
import os
src_path = '/Users/fogside/Desktop/all_photos/Camera/'
dst_path = '/Users/fogside/Desktop/all_photos/'
###############################################################################
# Creates folders named <year-month> in dst_path and moves photos
# from all folders in src_path into the corresponding <year-month> folder.
# Filenames must be like:
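A minimal sketch of the routine the banner describes. The exact filename pattern is truncated in the original, so the `YYYYMMDD` date prefix assumed here (e.g. `IMG_20170512_123456.jpg`) is hypothetical:

```python
import os
import re
import shutil

def organize_photos(src_path, dst_path):
    """Move photos into <year-month> folders derived from their filenames.

    Assumes (hypothetically) names containing a YYYYMMDD date, e.g.
    'IMG_20170512_123456.jpg'; the original gist's pattern is not shown.
    """
    date_re = re.compile(r'(\d{4})(\d{2})\d{2}')
    for root, _dirs, files in os.walk(src_path):
        for name in files:
            m = date_re.search(name)
            if not m:
                continue  # skip files without a recognizable date
            folder = os.path.join(dst_path, f'{m.group(1)}-{m.group(2)}')
            os.makedirs(folder, exist_ok=True)
            shutil.move(os.path.join(root, name),
                        os.path.join(folder, name))
```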
## Task 1 -- the assignment was simply to use just one matrix multiplication in the LSTM cell implementation:
in_mtx = tf.Variable(tf.truncated_normal([vocabulary_size, num_nodes*4], -0.1, 0.1))
out_mtx = tf.Variable(tf.truncated_normal([num_nodes, num_nodes*4], -0.1, 0.1))
b_vec = tf.Variable(tf.zeros([1, num_nodes*4]))
# Variables saving state across unrollings.
saved_output = tf.Variable(tf.zeros([batch_size, num_nodes]), trainable=False)
saved_state = tf.Variable(tf.zeros([batch_size, num_nodes]), trainable=False)
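The variables above pack the input, forget, candidate, and output transforms side by side into `[*, num_nodes*4]` matrices, so the whole cell needs only one multiply per connection. A NumPy sketch of the same trick (shapes only; the function and variable names here are my own, not the gist's):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell(x, h_prev, c_prev, in_mtx, out_mtx, b_vec):
    """One LSTM step with a single fused matmul per connection.

    All four gate transforms live side by side in in_mtx/out_mtx,
    so one product yields input, forget, candidate, output at once.
    """
    gates = x @ in_mtx + h_prev @ out_mtx + b_vec  # (batch, 4*num_nodes)
    i, f, g, o = np.split(gates, 4, axis=1)
    c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)
    h = sigmoid(o) * np.tanh(c)
    return h, c

batch_size, vocabulary_size, num_nodes = 2, 5, 3
h, c = lstm_cell(np.zeros((batch_size, vocabulary_size)),
                 np.zeros((batch_size, num_nodes)),
                 np.zeros((batch_size, num_nodes)),
                 np.random.randn(vocabulary_size, 4 * num_nodes),
                 np.random.randn(num_nodes, 4 * num_nodes),
                 np.zeros((1, 4 * num_nodes)))
```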
from itertools import compress
data_index = 0
def generate_batch_cbow(batch_size, num_skips, skip_window):
    '''
    Batch generator for CBOW (Continuous Bag of Words).
    batch should have shape (batch_size, num_skips).

    Parameters
    ----------
    batch_size: number of words in each mini-batch
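The generator above is cut off mid-docstring. A self-contained sketch in the same spirit (unlike the original, this version takes `data` as a parameter instead of using the global `data_index`; the real gist's body is not shown):

```python
import itertools
import random

def generate_batch_cbow(data, batch_size, num_skips, skip_window):
    """Return (batch, labels): num_skips context words per center word.

    data is a list of word ids; a sliding window of 2*skip_window + 1
    supplies each batch row (context) and its label (the center word).
    """
    assert num_skips <= 2 * skip_window
    positions = range(skip_window, len(data) - skip_window)
    batch, labels = [], []
    for pos in itertools.cycle(positions):
        if len(batch) == batch_size:
            break
        window = data[pos - skip_window: pos + skip_window + 1]
        context = window[:skip_window] + window[skip_window + 1:]
        batch.append(random.sample(context, num_skips))
        labels.append(data[pos])
    return batch, labels
```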
#!/usr/local/bin/python
# -*- coding: utf-8 -*-
"""
Summer School problem.
usage:
http letnyayashkola.org/api/v1.0/workshop-programs/ | ./script.py
"""
import numpy as np
import math
from scipy import stats
from sklearn.neighbors import KDTree
import warnings
warnings.filterwarnings("ignore", category=DeprecationWarning)
from sklearn.metrics import confusion_matrix
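The imports above point at a KDTree-backed nearest-neighbour classifier. A brute-force NumPy sketch of the k-NN step that `sklearn.neighbors.KDTree` makes sub-linear (function and variable names here are my own):

```python
import numpy as np

def knn_predict(train_X, train_y, query_X, k=3):
    """Classify each query point by majority vote among its k nearest
    training points (brute force; a KDTree would speed up this search)."""
    # pairwise squared Euclidean distances, shape (n_query, n_train)
    d2 = ((query_X[:, None, :] - train_X[None, :, :]) ** 2).sum(-1)
    nearest = np.argsort(d2, axis=1)[:, :k]   # indices of the k closest
    votes = train_y[nearest]                  # (n_query, k) label matrix
    return np.array([np.bincount(v).argmax() for v in votes])

train_X = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [0.9, 1.1]])
train_y = np.array([0, 0, 1, 1])
preds = knn_predict(train_X, train_y,
                    np.array([[0.05, 0.05], [1.0, 0.9]]), k=3)
# → array([0, 1]): each query adopts its local majority label
```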