kovasb / gist:9a4cda4b7f24be39db787248e2022e7f
#!/bin/bash
set -eu
wget -nc https://github.com/rapidsai/notebooks-extended/raw/master/utils/env-check.py
echo "Checking for GPU type:"
python env-check.py
if [ ! -f Miniconda3-4.5.4-Linux-x86_64.sh ]; then
echo "Removing conflicting packages, will replace with RAPIDS compatible versions"
kovasb / gist:ced7061fd0c895c0782dd5fa28de1728
# Example training loop for fitting a line from 2 points
# A line is defined as a*x+b
# Want machine to learn what a and b are.
# Important thing to note is the overall structure of components
# 1. Batch of training data
# 1A: inputs used to generate predictions
x_values = tf.convert_to_tensor([0.0, 1.0])
# 1B: desired outputs or 'labels'
y_values = tf.convert_to_tensor([0.0, 3.0])
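The preview stops after the training data; below is a minimal sketch of how the remaining numbered components (model, loss, training loop) can be filled in, assuming TensorFlow 2.x eager execution. The SGD optimizer, learning rate, and step count are illustrative choices, not taken from the original gist.

import tensorflow as tf  # assumes TensorFlow 2.x, eager execution

# 1. Batch of training data (as in the gist)
x_values = tf.convert_to_tensor([0.0, 1.0])
y_values = tf.convert_to_tensor([0.0, 3.0])

# 2. Model: trainable variables a and b, prediction is a*x + b
a = tf.Variable(0.0)
b = tf.Variable(0.0)

# 3. Loss: mean squared distance between predictions and labels
def loss_fn():
    return tf.reduce_mean(tf.square(a * x_values + b - y_values))

# 4. Training loop: repeatedly nudge a and b down the gradient of the loss
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)
for step in range(500):
    with tf.GradientTape() as tape:
        loss = loss_fn()
    grads = tape.gradient(loss, [a, b])
    optimizer.apply_gradients(zip(grads, [a, b]))

# For the points (0, 0) and (1, 3), a should approach 3 and b should approach 0.
print(a.numpy(), b.numpy())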
kovasb / gist:27d87c5210354389cbd2a1881310ad02
# recall the training loop pattern
# 1. Batch of training data
# 1A: inputs used to generate predictions
x_values = tf.range(0,1,0.01)
# 1B: desired outputs or 'labels'
y_values = 3*x_values + 1 + tf.random_uniform(x_values.shape)
# 2. Model
# 2A: Variables that will be trained
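This preview also cuts off at the trainable variables. A minimal sketch of the rest follows, written against TensorFlow 2.x (where the gist's tf.random_uniform is tf.random.uniform) and using a hand-written gradient-descent update instead of an optimizer object, to show what the update step does; the learning rate and step count are illustrative.

import tensorflow as tf  # assumes TensorFlow 2.x

# 1. Batch of training data: a noisy line, as in the gist
x_values = tf.range(0.0, 1.0, 0.01)
y_values = 3 * x_values + 1 + tf.random.uniform(x_values.shape)

# 2A. Variables that will be trained
a = tf.Variable(0.0)
b = tf.Variable(0.0)

learning_rate = 0.5
for step in range(1000):
    with tf.GradientTape() as tape:
        predictions = a * x_values + b                             # 2B: model output
        loss = tf.reduce_mean(tf.square(predictions - y_values))   # 3: loss
    grad_a, grad_b = tape.gradient(loss, [a, b])
    a.assign_sub(learning_rate * grad_a)   # 4: gradient-descent update, done by hand
    b.assign_sub(learning_rate * grad_b)

# a should approach 3; b absorbs the intercept 1 plus the mean of the uniform noise (about 0.5).
print(a.numpy(), b.numpy())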
kovasb / gist:b1f8f9f89da767c2f66c9bd37a25bd7e
______________________________________
user> ERROR: Unhandled REPL handler exception processing message {:file (ns cursive.repl.runtime
(:import [java.lang.reflect Method Field Constructor])
(:require [clojure.reflect :as reflect]
[clojure.set :as set]))
(def mappings {:macro identity
:ns (comp name ns-name)
:name name
:arglists #(mapv str %)
kovasb / gist:9b625b8e82b0aa08c1c9 (last active Jan 13, 2016)
Assessing Spark & Flink from Clojure POV
**Concerns
- Interactivity
-- Incremental extension or modification of running system
- Modularity
-- Pluggable serialization, storage, conveyance, scheduling, lifecycle hooks etc etc
**Spark Summary
- RDDs: represent a lazily-computed distributed dataset
-- each dataset is broken up into 'partitions' that individually sit at different machines
-- just a DAG with edges annotated to describe the relationship between partitions of parent and partitions of child (see the PySpark sketch after these notes)
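To make the RDD points above concrete, here is a small hedged PySpark sketch (the numbers and names are illustrative): transformations only add nodes to the DAG, the data stays split across partitions, and nothing executes until an action runs.

from pyspark import SparkContext

sc = SparkContext("local[4]", "rdd-demo")

# An RDD is a lazily computed, partitioned dataset: nothing has run yet.
rdd = sc.parallelize(range(1000), numSlices=4)   # 4 partitions
doubled = rdd.map(lambda x: x * 2)               # adds a node to the DAG, still lazy
evens = doubled.filter(lambda x: x % 4 == 0)     # another DAG node, still lazy

print(evens.getNumPartitions())   # partitions carry over from parent to child
print(evens.toDebugString())      # the lineage: this RDD and its parents in the DAG
print(evens.count())              # an action finally forces the computation

sc.stop()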
kovasb / gist:843f2fea05a9da3ec4fd (created Mar 24, 2015)
piggieback+weasel+cursive
WARNING: No such namespace: clojure.tools.nrepl.middleware.load-file, could not locate clojure/tools/nrepl/middleware/load_file.cljs at line 1 <cljs repl>
clojure.lang.ExceptionInfo: Unable to resolve var: file-contents in this context at line 1 <cljs repl> {:tag :cljs/analysis-error, :file "<cljs repl>", :line 1, :column 390}
kovasb / gist:a641c42ce0bd120296cb
(require '[clojure.data.fressian :as fress])
(require '[clojure.java.io :as io])
(import 'java.util.zip.GZIPInputStream)
(defn fress-seq* [reader]
  ;; Read one fressian object; treat any exception as end-of-stream.
  (let [x (try (fress/read-object reader) (catch Exception e :eof))]
    (if (= :eof x)
      (do (.close reader) nil)
      (cons x (lazy-seq (fress-seq* reader))))))
kovasb / gist:6a9bb54316ff0e0e051e
/*static*/ v8::Handle<v8::Value> Java::bufferFromDirect(const v8::Arguments& args) {
  v8::HandleScope scope;
  // Unwrap the C++ Java instance behind the JavaScript `this` object.
  Java* self = node::ObjectWrap::Unwrap<Java>(args.This());
  // Make sure the JVM has been created; a non-undefined result signals an error.
  v8::Handle<v8::Value> ensureJvmResults = self->ensureJvm();
  if(!ensureJvmResults->IsUndefined()) {
    return v8::False();
  }