
#!/usr/bin/python
import numpy as np
import random

# first condition: average: 4.5
def crossed1(p1, p2):
    x1, y1 = p1
    x2, y2 = p2
    # p1 should be within;

Neural Style Transfer

Neural Style Transfer is the process of using Deep Neural Networks to migrate the semantic content of one image to different styles.

Usage

This gist implements NST in Owl and provides a simple interface. Here is an example:

#zoo "6f28d54e69d1a19c1819f52c5b16c1a1"

open Owl
open Algodiff.D

let rec desc ?(eta=F 0.01) ?(eps=1e-6) f x =
  let g = (diff f) x in
  if (unpack_flt g) < eps then x
  else desc ~eta ~eps f Maths.(x - eta * g)
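
As a small hedged illustration (not part of the gist), the desc routine above can minimise any differentiable scalar function, for instance f(x) = x * x starting from x = 3.0:

(* minimal sketch: gradient descent on f(x) = x * x, starting at x = 3.0 *)
let f x = Maths.(x * x)
let x_min = desc ~eta:(F 0.1) f (F 3.)
let () = Printf.printf "argmin is near %f\n" (unpack_flt x_min)

With a step size of 0.1 the iterates shrink geometrically, so the recursion terminates once the derivative drops below eps.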
jzstark / inception_example.ml
Last active January 11, 2020 15:37
An Inception Example in Owl
#!/usr/bin/env owl
open Owl

(* Import the InceptionV3 library *)
#zoo "9428a62a31dbea75511882ab8218076f"

(* Path to your image; here we use the "panda.png"
 * in this gist as an example. *)
let img = "/path/to/your/image.png";;

let _ =
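  (* A hedged continuation sketch: it assumes the zoo gist exposes an
   * InceptionV3 module with infer and to_json functions, as in the Owl
   * image-classification tutorial; these names are assumptions and are
   * not shown in this preview. *)
  let labels = InceptionV3.infer img in
  InceptionV3.to_json ~top:5 labels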
jzstark / Readme.md
Last active September 25, 2019 23:17
Simple MNIST neural network (MirageOS)

Files:

  • config.ml: MirageOS configuration file (a minimal sketch is given after this list).
  • simple_mnist.ml: main logic of MNIST neural network.
  • simple_mnist_weight.ml: pre-trained weights of the neural network.

Information:

  • Weight file size: 146KB.
  • Model test accuracy: 92%.
  • MirageOS: compiled with the Unix backend; the generated binary is 10MB. The other backends have not been tested yet.
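
For orientation, here is a minimal sketch of what config.ml could look like, assuming the Mirage 3 configuration DSL and a console-only unikernel; the Unikernel.Main name and the device list are assumptions, not taken from this gist:

(* config.ml: hypothetical minimal MirageOS configuration *)
open Mirage

(* the unikernel entry point takes a console and runs as a plain job *)
let main = foreign "Unikernel.Main" (console @-> job)

(* register the unikernel under the name "simple_mnist" *)
let () = register "simple_mnist" [ main $ default_console ]

With the Unix backend mentioned above, the usual build sequence is mirage configure -t unix, followed by make depend and make.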
jzstark / Readme.md
Created September 17, 2019 17:14
Owl-Tensorflow Converter Examples
jzstark / Readme.md
Last active September 16, 2019 12:15
Simple MNIST neural network

Files:

  • simple_mnist.ml: main logic of MNIST neural network.
  • simple_mnist_weight.ml: pre-trained weights of the neural network.

Information:

  • Weight file size: 146KB.
  • Model test accuracy: 92%.
  • MirageOS: compiled with the Unix backend; the generated binary is 10MB. The other backends have not been tested yet. Here is the MirageOS version of this simple MNIST.

Usage:
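
A minimal sketch of loading the two files above in an Owl-enabled OCaml toplevel (e.g. utop), assuming they sit in the current directory; the inference entry point is not shown in this preview, so none is assumed:

(* one possible load order: weights first, then the network logic *)
#use "simple_mnist_weight.ml";;
#use "simple_mnist.ml";;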

How I think Algodiff Works

To train a network, we first call train_generic:

let train_generic ?state ?params ?(init_model=true) nn x y =
    if init_model = true then init nn;
    let f = forward nn in   (* forward pass of the network *)
    let b = backward nn in  (* backward pass: compute gradients *)
    let u = update nn in    (* update the network weights *)
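
For intuition, here is a small standalone sketch (not from the gist) of the Algodiff primitive that these forward and backward passes are built on, using Owl's Algodiff.D module:

(* derivative of f(x) = x * sin x at x = 1.5, via algorithmic differentiation *)
open Owl
open Algodiff.D

let f x = Maths.(x * sin x)
let g = (diff f) (F 1.5)
let () = Printf.printf "f'(1.5) = %f\n" (unpack_flt g)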

Owl-Tensorflow Converter Example: MNIST CNN Training

This example trains a CNN on the MNIST dataset.

  • Step 1: run the OCaml script tfgraph_train.ml, which generates a file tf_convert_mnist.pbtxt in the current directory.
  • Step 2: make sure tf_convert_mnist.pbtxt and tfgraph_train.py are in the same directory, and that TensorFlow, NumPy, etc. are installed.
  • Step 3: execute python tf_converter_mnist.py; the expected output on screen is the training progress. Every 100 steps, the loss value and model accuracy are shown.

Here we only assume that the Python script writer knows where to find the output node (in the collection "result") and the placeholder name (x:0).
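
For context, here is a hedged sketch of the kind of MNIST CNN the Step 1 script could define with Owl's Neural.S.Graph API; the actual network in tfgraph_train.ml is not shown in this preview and may differ:

(* a typical MNIST CNN in Owl; the layer sizes here are illustrative only *)
open Owl
open Neural.S
open Neural.S.Graph

let make_network () =
  input [|28;28;1|]
  |> conv2d [|5;5;1;32|] [|1;1|] ~act_typ:Activation.Relu
  |> max_pool2d [|2;2|] [|2;2|]
  |> fully_connected 1024 ~act_typ:Activation.Relu
  |> linear 10 ~act_typ:Activation.(Softmax 1)
  |> get_network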