John Myles White johnmyleswhite

# Iterate f starting from x0 until the iterates stop changing (compared with ===),
# returning the fixed point together with the number of iterations taken.
function fixed_point(f, x0)
    x, x_old = f(x0), x0
    n = 1
    while x !== x_old
        x, x_old = f(x), x
        n += 1
    end
    (x, n)
end
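As an illustrative usage example (my addition, not from the original gist): iterating cos converges to the Dottie number, roughly 0.739085. Note that the loop terminates only if the floating-point iterates eventually repeat exactly; on most systems this example does settle after a few dozen iterations.

x_star, n_iters = fixed_point(cos, 1.0)  # x_star ≈ 0.739085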

Univariate API

A new univariate distribution type should implement all of the following methods (a minimal sketch follows the list):

  • Core constructors
    • MyDistribution{T}(args[...])
    • We still need to decide whether constructors should perform input validation; there are use cases in which callers want to avoid it.
  • params(d::MyDistribution{T})::Tuple: A tuple of the distribution's parameters in our canonical order.
  • minimum(d::MyDistribution{T})::T: The lowest value in the support of MyDistribution.
  • maximum(d::MyDistribution{T})::T: The highest value in the support of MyDistribution.
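
As a hedged sketch of what satisfying this list might look like, here is a hypothetical continuous distribution on an interval; the type name MyUniform and its fields are invented for illustration and are not part of Distributions.jl:

using Distributions

# Hypothetical example type (illustrative only).
struct MyUniform{T<:Real} <: ContinuousUnivariateDistribution
    a::T
    b::T
end

# Parameters in canonical order: (lower endpoint, upper endpoint).
Distributions.params(d::MyUniform) = (d.a, d.b)

# Endpoints of the support.
Base.minimum(d::MyUniform) = d.a
Base.maximum(d::MyUniform) = d.b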
library("pwr")
library("ggplot2")
n_sims <- 1000L
n <- 10L
mu <- 0.5
sigma <- 1
true_power <- power.t.test(
  n = n,
  delta = mu,       # remaining arguments assumed: effect size,
  sd = sigma,       # noise standard deviation,
  sig.level = 0.05  # and significance level
)$power
johnmyleswhite / JuliaGlobals
Last active August 10, 2018 10:58
Naively counting Pythagorean triples in Python and Julia
total = 0
N = 300
start_time = time()
for a in 0:(N - 1)
    for b in 0:(N - 1)
        for c in 0:(N - 1)
            if a^2 + b^2 == c^2
                # `total` is a top-level global: the performance trap this gist
                # illustrates (the `global` keyword is required on Julia 1.0+).
                global total = total + 1
            end
        end
    end
end
println("Found $total triples in $(time() - start_time) seconds")
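
The gist's name (JuliaGlobals) suggests the point is global-variable performance; for comparison, here is a sketch of the same count wrapped in a function so everything is local (function name chosen here for illustration):

# Same computation with no globals; wrapping the loop in a function lets the
# compiler infer concrete types and is typically much faster.
function count_triples(N)
    total = 0
    for a in 0:(N - 1), b in 0:(N - 1), c in 0:(N - 1)
        if a^2 + b^2 == c^2
            total += 1
        end
    end
    return total
end

@time count_triples(300)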
johnmyleswhite / gist:5248212
Created March 26, 2013 19:06
The Joys of Sparsity: Forward Stagewise Regression
# Generate (x, y) data with a sparse set of active predictors
# prob controls the frequency of predictors having zero effect
function simulate_data(n::Integer, p::Integer, prob::Real)
    x = randn(n, p)
    beta = randn(p)
    for j in 1:p
        if rand() < prob
            beta[j] = 0.0
        end
    end
    y = x * beta + randn(n)  # response: linear signal plus unit-variance noise
    return x, y, beta
end
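
The title names forward stagewise regression, but the snippet stops at the data generator. Below is a hedged sketch of the standard algorithm (not the gist's code; the step size and iteration count are placeholders): at each step, find the predictor most correlated with the current residual and nudge its coefficient by a small amount in that direction.

# Minimal forward stagewise regression sketch.
function forward_stagewise(x, y; epsilon = 0.01, n_steps = 10_000)
    p = size(x, 2)
    beta = zeros(p)
    r = copy(y)                    # current residual
    for _ in 1:n_steps
        scores = x' * r            # inner products of predictors with the residual
        j = argmax(abs.(scores))   # most correlated predictor
        step = epsilon * sign(scores[j])
        beta[j] += step            # take a small step in that coordinate
        r -= step * x[:, j]        # update the residual
    end
    return beta
end

x_sim, y_sim, beta_true = simulate_data(100, 10, 0.5)
beta_hat = forward_stagewise(x_sim, y_sim)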
johnmyleswhite / missing_values.jl
Created June 27, 2018 12:28
Perf of Missing Values in Julia 0.7
# 0.7 Approach: a Vector{Union{Float64, Missing}} built with a comprehension
using BenchmarkTools
n = 10_000_000
x = rand(n)
y = [ifelse(iseven(i), missing, x[i]) for i in 1:n]
sum(x)
sum(y)
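BenchmarkTools is loaded but the timing calls are not shown; one typical way to time the two sums (my addition, not necessarily what the gist does):

@btime sum($x)
@btime sum($y)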
# 0.6 Approach
using BenchmarkTools
using NullableArrays
n = 10_000_000
x = rand(n)
y = NullableArray{Float64}(n)
for i in 1:n
    if !iseven(i)
        y[i] = x[i]  # odd entries get values; even entries are left as null
    end
end
# Standard deviation and skewness of Bernoulli(p) over a fine grid of p
using Distributions
using PyPlot
ϵ = 0.0001
n_grid = 10_000
ps = linspace(ϵ, 1 - ϵ, n_grid)
ss = [std(Bernoulli(p)) for p in ps]
sks = [skewness(Bernoulli(p)) for p in ps]
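PyPlot is loaded but the plotting calls fall outside the snippet; a minimal sketch of how these series might be plotted (my addition):

plot(collect(ps), ss, label = "std")
plot(collect(ps), sks, label = "skewness")
xlabel("p")
legend()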
using Distributions
using HypothesisTests
n_sims = 1_000_000
n = 2
x = Array{Float64}(n)
y = Array{Float64}(n)
p_d = Array{Float64}(n_sims)
for s in 1:n_sims
    for i in 1:n
        # Assumed data-generating process: two independent standard normal samples.
        x[i] = rand(Normal(0, 1))
        y[i] = rand(Normal(0, 1))
    end
    # Assumed test: record the two-sample t-test p-value for this simulation.
    p_d[s] = pvalue(EqualVarianceTTest(x, y))
end
using Distributions
using HypothesisTests
n_sims = 1_000_000
n = 20
x = Array{Float64}(n)
y = Array{Float64}(n)
p_d = Array{Float64}(n_sims)
for s in 1:n_sims
    for i in 1:n
        # Assumed data-generating process: two independent standard normal samples.
        x[i] = rand(Normal(0, 1))
        y[i] = rand(Normal(0, 1))
    end
    # Assumed test: record the two-sample t-test p-value for this simulation.
    p_d[s] = pvalue(EqualVarianceTTest(x, y))
end
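
With the p-values collected, the usual summary is the rejection rate at a chosen significance level (my addition; requires using Statistics on Julia 0.7 and later):

mean(p_d .< 0.05)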