I hereby claim:
- I am sdan on github.
- I am sdan (https://keybase.io/sdan) on keybase.
- I have a public key ASCNnyRpRtiqmD03S9pjbggODAOBS4dFZGzajZri0yLWPgo
To claim this, I am signing this object:
From: Elon Musk <[MASK]>
To: Greg Brockman <[MASK]>
CC: Sam Altman <[MASK]>
Date: Sun, Nov 22, 2015 at 7:48 PM
Subject: follow up from call

Blog sounds good, assuming adjustments for neutrality vs being YC-centric.
I'd favor positioning the blog to appeal a bit more to the general public -- there is a lot of value to having the public root for us to succeed -- and then having a longer, more detailed and inside-baseball version for recruiting, with a link to it at the end of the general public version.
1. Fork https://github.com/chroma-core/chroma ; for some reason you can't deploy images on Railway.
Ensure you add the PORT=8000 env var in Railway.
2. Client side:
Railway doesn't allow plain-HTTP connections, so you'll need to open an SSL connection from the client side, like so (for some reason this isn't properly documented):
# Set up the ChromaDB client (legacy REST client; the host below is a placeholder --
# replace it with your own Railway domain)
client = chromadb.Client(Settings(
    chroma_api_impl="rest",
    chroma_server_host="your-app.up.railway.app",
    chroma_server_http_port="443",
    chroma_server_ssl_enabled=True,
))
#!/bin/sh
set -x
# == Swarm training (alpha release) ==
# Setup:
#
# git clone https://github.com/shawwn/gpt-2
# cd gpt-2
# git checkout dev-shard
The command line, in short…
wget -k -K -E -r -l 10 -p -N -F --restrict-file-names=windows -nH http://website.com/
…and the options explained
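Each flag in the mirror command above breaks down as follows (per the GNU Wget manual; note that -F only has an effect when URLs are read from an input file with -i):

```
# -k   (--convert-links)        rewrite links in downloaded pages so they work locally
# -K   (--backup-converted)     keep the original file with a .orig suffix before converting
# -E   (--adjust-extension)     append .html to downloaded files that lack it
# -r   (--recursive)            follow links recursively
# -l 10                         limit recursion depth to 10 levels
# -p   (--page-requisites)      also fetch images, CSS, etc. needed to render each page
# -N   (--timestamping)         skip files that are not newer than the local copy
# -F   (--force-html)           treat an input file (given with -i) as HTML
# --restrict-file-names=windows escape characters that are illegal in Windows filenames
# -nH  (--no-host-directories)  don't create a top-level directory named after the host
```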
import numpy as np
import tensorflow as tf
import gym
import time
import spinup.algos.ppo.core as core
from spinup.utils.logx import EpochLogger
from spinup.utils.mpi_tf import (
    MpiAdamOptimizer, sync_all_params, MpiAdadeltaOptimizer, MpiAdagradOptimizer,
    MpiFtrlOptimizer, MpiGradientDescentOptimizer, MpiMomentumOptimizer,
    MpiProximalAdagradOptimizer, MpiProximalGradientDescentOptimizer,
    MpiRMSPropOptimizer, MpiAdaMaxOptimizer, MpiAdamGSOptimizer, MpiAdamWOptimizer,
    MpiAddSignOptimizer, MpiGGTOptimizer, MpiLARSOptimizer, MpiLazyAdamGSOptimizer,
    MpiLazyAdamOptimizer, MpiMomentumWOptimizer, MpiNadamOptimizer,
    MpiPowerSignOptimizer, MpiShampooOptimizer,
)
from spinup.utils.mpi_tools import mpi_fork, mpi_avg, proc_id, mpi_statistics_scalar, num_procs
import numpy as np
import math
import tensorflow as tf
from mpi4py import MPI
from spinup.utils.mpi_tools import broadcast

def flat_concat(xs):
    return tf.concat([tf.reshape(x, (-1,)) for x in xs], axis=0)
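The same flattening trick, sketched in NumPy for illustration (flat_concat itself operates on TF tensors; the function name here is mine, not spinup's):

```python
import numpy as np

def flat_concat_np(xs):
    # Flatten each array to 1-D and join them into a single vector,
    # mirroring what flat_concat does with tf.reshape + tf.concat.
    return np.concatenate([np.reshape(x, -1) for x in xs], axis=0)

# A (2, 2) matrix and a length-3 vector flatten into one length-7 vector.
params = [np.arange(4).reshape(2, 2), np.zeros(3)]
flat = flat_concat_np(params)
print(flat.shape)  # (7,)
```

This is the usual first step for MPI parameter syncing: flatten every variable into one contiguous buffer so a single broadcast can ship all of them at once.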
from spinup.user_config import DEFAULT_DATA_DIR, FORCE_DATESTAMP, \
    DEFAULT_SHORTHAND, WAIT_BEFORE_LAUNCH
from spinup.utils.logx import colorize
from spinup.utils.mpi_tools import mpi_fork, msg
from spinup.utils.serialization_utils import convert_json
import base64
from copy import deepcopy
import cloudpickle
import json
import numpy as np