# Win32 type aliases for ccall (Julia 1.0+ syntax)
const HANDLE = Ptr{Cvoid}
const DWORD  = UInt32
const WORD   = UInt16
const LPTSTR = Ptr{Cwchar_t}  # pointer to a wide string
const LPBYTE = Ptr{UInt8}

struct STARTUPINFO
    cb::DWORD
    lpReserved::LPTSTR
    lpDesktop::LPTSTR
@ggggggggg
ggggggggg / buffered_hdf5.jl
Created October 20, 2017 19:10
buffered HDF5 dataset in Julia
using HDF5
"HDF5 appears to be inefficient for small writes, so this is a simple buffer that
allows writing to HDF5 only once per unit time (typically one second) to
limit the number of small writes."
mutable struct BufferedHDF5Dataset{T}
    ds::HDF5Dataset
    v::Vector{T}
    lasti::Int64       # last index in hdf5 dataset
    timeout_s::Float64 # interval in seconds at which to transfer data from v to ds
@fguevaravas
fguevaravas / MinimalBuild.jl
Created November 28, 2017 14:44
Minimal Julia Compilation
## Better way to create standalone EXE files using Julia,
## Taken from: https://github.com/JuliaComputing/static-julia
## Assumptions:
## 1. g++ / x86_64-w64-mingw32-gcc is available and is in path
## 2. patchelf is in the path
module MinimalBuild
# Assumption: the application main module is in MyApplication.jl
# this module must export a function called julia_main that is ccallable
appname = "MyApplication"
julia> solve(m)
******************************************************************************
This program contains Ipopt, a library for large-scale nonlinear optimization.
Ipopt is released as open source code under the Eclipse Public License (EPL).
For more information visit http://projects.coin-or.org/Ipopt
******************************************************************************
Internal error 2 in QAMD : Schur size expected: 0 Real: 1
** MPI_ABORT called
@sminot
sminot / test_sparse_dataframe_creation.ipy
Created October 26, 2017 17:52
Profiling sparse DataFrame creation
#!/usr/local/bin/ipython
import pandas as pd
from collections import defaultdict
from random import choice
alph = ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j']
# Function to make some test data
def make_dat(nrows=1000, ncols=1000, nvals=1000):
@PatWie
PatWie / update-tensorflow.sh
Last active December 4, 2019 18:24
simple cronjob script to automatically rebuild TensorFlow from source
#!/bin/bash
# Patrick Wieschollek
# =============================================================
# UPDATE SOURCE
# =============================================================
git checkout -- .
git pull origin master
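To run a build script like this on a schedule, a crontab entry along these lines would work; the path, schedule, and log file are placeholders, not from the gist:

```shell
# crontab entry: rebuild TensorFlow every Sunday at 03:00
# m h dom mon dow  command
0 3 * * 0 /home/user/tensorflow/update-tensorflow.sh >> /home/user/tf-build.log 2>&1
```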
from keras.models import Sequential
from keras.layers import Dense
from keras.utils.io_utils import HDF5Matrix
import numpy as np
def create_dataset():
    import h5py
    X = np.random.randn(200, 10).astype('float32')
    y = np.random.randint(0, 2, size=(200, 1))
    f = h5py.File('test.h5', 'w')
    f.create_dataset('X', data=X)
    f.create_dataset('y', data=y)
    f.close()
@AmericanEnglish
AmericanEnglish / README.md
Last active May 26, 2020 22:56
How to compile TensorFlow 2.0 from source in an HPC environment that uses EasyBuild for CentOS 7.x

How to compile TensorFlow 2.0 from source in an HPC environment that uses EasyBuild for CentOS 7

Some background on why this is yet another gist about compiling TensorFlow without root

If you're like me and you don't have root access to an HPC system, and your system administrators use EasyBuild combined with Lmod, you are in for a bad time. Our CentOS 7 install did not ship out of the box capable of compiling TensorFlow 2.0 from source without problems. I scoured several closed and recently opened issues to write a script that produces a fully functional .whl for TensorFlow 2.0. Note that our cluster has only two different sets of GPUs: one with compute capability 7.0 (V100) and one with compute capability 3.5 (K20m). Many issues came up:

  1. because Bazel can't handle symlinks for gcc leading to errors like
@wassname
wassname / augumented_hdf5_matrix.py
Last active October 11, 2020 23:32
How to do data augmentation on a keras HDF5Matrix
"""Another way; note that this one will load the whole array into memory."""
from keras.preprocessing.image import ImageDataGenerator
import h5py
from keras.utils.io_utils import HDF5Matrix
seed = 0
batch_size = 32
# we create two instances with the same arguments
data_gen_args = dict(
    rotation_range=90.,