Tom Grek (tomgrek): public GitHub gists
@tomgrek
tomgrek / docs.md
Created January 8, 2024 04:40
Embedding Windows Python in C++ (Visual Studio project)

Quick note on how to embed Python in a C++ executable built with Visual Studio 2022, using Python installed via the Windows Store.

  1. Locate Python installation

Easiest way I found was from PowerShell: run python, then import sys; print(sys.executable) to print the interpreter path (see the sketch after this list for the include and lib paths that step 2 needs).

Because my installation was from the Windows Store, it lived under C:\Program Files\WindowsApps, which was not accessible to my user. From PowerShell (run as Admin) I had to run takeown /f "C:\Program Files\WindowsApps" /r.

  2. Add includes to the Visual Studio project
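
The preview ends here. As a starting point for finding the header and library directories that step 2 needs, a minimal Python sketch (not from the gist; the libs location is an assumption about a standard Windows layout):

import sys, sysconfig

print(sys.executable)                       # interpreter (python.exe) location
print(sysconfig.get_paths()["include"])     # directory containing Python.h
print(sys.prefix + r"\libs")                # pythonXY.lib usually lives here on Windows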
@tomgrek
tomgrek / gist:61b0454e3cb3e010382e192c3982049f
Created August 27, 2020 17:27
Mask a sequence in Pytorch

I do this so many times, might as well make a gist.

Start with an example 4 (batch size) x 2 (sequence length) x 3 (embedding dim) tensor.

a = torch.tensor([[[1,2,3],[4,5,6]],[[10,11,12],[13,14,15]],[[6,7,8],[9,10,11]],[[12,13,14],[15,16,17]]])

tensor([[[ 1,  2,  3],
         [ 4,  5,  6]],
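
The preview cuts off the rest of the printout and the masking itself. A minimal sketch of one common way to mask, assuming per-example valid lengths (the lengths below are made up, not from the gist):

lengths = torch.tensor([2, 1, 2, 1])                                 # valid timesteps per batch element
mask = torch.arange(a.size(1)).unsqueeze(0) < lengths.unsqueeze(1)   # (4, 2) boolean mask
masked = a * mask.unsqueeze(-1)                                      # zero out the padded positions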
@tomgrek
tomgrek / life.py
Created January 27, 2020 02:56
Game of Life in Pytorch as a convolution
import cv2
import numpy as np
import sys
import torch
import torch.nn.functional as F
from PIL import Image
BOARD_HEIGHT = 200
BOARD_WIDTH = 300
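
The rest of the gist is cut off in this preview. A minimal sketch of one Life update step as a convolution, using the imports and constants above (my reconstruction, not necessarily the gist's code): count each cell's live neighbours with a 3x3 kernel, then apply the birth/survival rules.

kernel = torch.ones(1, 1, 3, 3)
kernel[0, 0, 1, 1] = 0                                   # count neighbours, not the cell itself
board = torch.randint(0, 2, (1, 1, BOARD_HEIGHT, BOARD_WIDTH)).float()

def life_step(board):
    neighbours = F.conv2d(board, kernel, padding=1)
    survive = (board == 1) & ((neighbours == 2) | (neighbours == 3))
    born = (board == 0) & (neighbours == 3)
    return (survive | born).float()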
@tomgrek
tomgrek / mlq_worker.py
Created November 25, 2018 03:06
A very simple MLQ worker
import asyncio
from mlq.queue import MLQ

mlq = MLQ('example', 'localhost', 6379, 0)

def some_listener_func(params, *args):
    return params

def main():
    print("Worker starting")
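    # The preview ends here; a guessed continuation (an assumption based on the
    # MLQ README, not part of this gist): register the callback as a listener,
    # then keep the event loop running so the worker keeps consuming jobs.
    mlq.create_listener(some_listener_func)

if __name__ == '__main__':
    main()
    asyncio.get_event_loop().run_forever()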
@tomgrek
tomgrek / default
Created November 17, 2018 03:29
nginx front and backend config on same server
server {
    root /var/www/xxx;
    index index.html;
    server_name xxx.xxx;

    if ($request_method = 'OPTIONS') {
        add_header 'Access-Control-Allow-Origin' '*';
        add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';
        add_header 'Access-Control-Allow-Headers' 'DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range';
        add_header 'Access-Control-Max-Age' 1728000;
        add_header 'Content-Type' 'text/plain; charset=utf-8';
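        # The preview ends mid-block. A hedged sketch of how this might continue
        # (the /api path and port 8080 are hypothetical, not from the gist):
        # finish the CORS preflight response, then proxy API calls to the backend.
        return 204;
    }

    location /api/ {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
    }
}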
@tomgrek
tomgrek / Policy.py
Last active September 1, 2018 20:53
My neural network for actor critic financial trading
class Policy(nn.Module):
    def __init__(self):
        super(Policy, self).__init__()
        self.input_layer = nn.Linear(8, 128)
        self.hidden_1 = nn.Linear(128, 128)
        self.hidden_2 = nn.Linear(32, 31)
        self.hidden_state = torch.tensor(torch.zeros(2, 1, 32), requires_grad=False).cuda()
        self.rnn = nn.GRU(128, 32, 2)
        self.action_head = nn.Linear(31, 5)
        self.value_head = nn.Linear(31, 1)
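
    # The preview stops before forward(); a sketch of one plausible wiring,
    # inferred only from the layer shapes above (an assumption, not the
    # author's actual forward).
    def forward(self, x):
        x = torch.relu(self.input_layer(x))
        x = torch.relu(self.hidden_1(x))
        x, self.hidden_state = self.rnn(x.view(1, 1, -1), self.hidden_state)
        x = torch.relu(self.hidden_2(x.squeeze()))
        return torch.softmax(self.action_head(x), dim=-1), self.value_head(x)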
@tomgrek
tomgrek / index.html
Last active May 3, 2018 17:09 — forked from mourner/index.html
Mapbox GL JS Puppeteer benchmark
<!doctype html>
<meta charset="utf-8">
<title>Benchmark</title>
<body></body>
<style>html, body, #map { height: 100%; margin: 0; } </style>
<div id="map"></div>
<script src='https://api.tiles.mapbox.com/mapbox-gl-js/v0.40.0/mapbox-gl.js'></script>
<!-- <script src="mapbox-gl.js"></script> -->
@tomgrek
tomgrek / 2_b.py
Last active March 23, 2018 02:49
2_b
class BotBrain(nn.Module):
    def __init__(self):
        super().__init__()
        self.embedding = nn.Embedding(len(words), 10)
        self.rnn = nn.LSTM(20, 30, 2, dropout=0.5)
        # both halves of the LSTM state need to live on the GPU
        self.h = (Variable(torch.zeros(2, 1, 30)).cuda(), Variable(torch.zeros(2, 1, 30)).cuda())
        self.l_out = nn.Linear(30, len(words))

    def forward(self, cs):
        inp = (self.embedding(cs)).view(1, -1)
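        # The preview ends mid-forward(); a guessed continuation (an assumption,
        # not necessarily the article's code): run the LSTM and project back to
        # the vocabulary.
        outp, self.h = self.rnn(inp.view(1, 1, -1), self.h)
        return torch.log_softmax(self.l_out(outp.view(-1, 30)), dim=-1)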
@tomgrek
tomgrek / 2_a.py
Created March 23, 2018 02:36
chatbot article part 2_a
class DataGenerator():
    def __init__(self, dset):
        self.dset = dset
        self.len = len(self.dset)
        self.idx = 0

    def __len__(self):
        return self.len

    def __iter__(self):
        return self

    def __next__(self):
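        # The preview cuts off here; a guessed body (an assumption, not the
        # article's actual code): yield (current word, next word) pairs, since
        # the training loop gist below unpacks x, y from this generator.
        if self.idx >= self.len - 1:
            self.idx = 0
            raise StopIteration
        x, y = self.dset[self.idx], self.dset[self.idx + 1]
        self.idx += 1
        return x, y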
@tomgrek
tomgrek / trainingloop.py
Last active June 22, 2020 17:11
A PyTorch training loop
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(m.parameters(), lr=0.01)
for epoch in range(0, 300):
    gen = DataGenerator([words[word.lower()] for word in ' '.join(sentences).replace('?', ' <unk>').split(' ')])
    for x, y in gen:
        m.zero_grad()
        output = m(x)
        loss = criterion(output, y)
        loss.backward()
        optimizer.step()