This gist details the following:
- Converting a Subversion (SVN) repository into a Git repository
- Purging the resultant Git repository of large files
- Retrieving a list of SVN commit usernames
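The username list matters because `git svn` typically wants an authors file mapping SVN usernames to Git identities. Below is a minimal sketch of extracting unique usernames from `svn log -q` output; the canned log text stands in for a real repository, and the field layout assumed by the `awk` is the standard `svn log -q` format:

```shell
# Sample `svn log -q` output (a stand-in for a real repository)
svn_log_sample='------------------------------------------------------------------------
r2 | alice | 2015-01-02 10:00:00 +0000 (Fri, 02 Jan 2015)
------------------------------------------------------------------------
r1 | bob | 2015-01-01 09:00:00 +0000 (Thu, 01 Jan 2015)
------------------------------------------------------------------------'

# Split on "|", take the author field, trim surrounding whitespace, de-duplicate
authors=$(printf '%s\n' "$svn_log_sample" \
  | awk -F '|' '/^r[0-9]+ /{ gsub(/^[ \t]+|[ \t]+$/, "", $2); print $2 }' \
  | sort -u)
echo "$authors"
```

Against a real repository you would replace the sample with `svn log -q <repo-url>` and feed the result into an authors file.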
echo 'export PATH=$HOME/local/bin:$PATH' >> ~/.bashrc
. ~/.bashrc
mkdir ~/local
mkdir ~/node-latest-install
cd ~/node-latest-install
curl http://nodejs.org/dist/node-latest.tar.gz | tar xz --strip-components=1
./configure --prefix=~/local
make install # ok, fine, this step probably takes more than 30 seconds...
curl https://www.npmjs.org/install.sh | sh
public class SomeFragment extends Fragment {
    MapView mapView;
    GoogleMap map;

    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
        View v = inflater.inflate(R.layout.some_layout, container, false);
        // Grab the MapView from the layout and forward the lifecycle call it needs
        mapView = (MapView) v.findViewById(R.id.map);
        mapView.onCreate(savedInstanceState);
        return v;
    }
}
var Promise = require('bluebird');

var promiseWhile = function(condition, action) {
    var resolver = Promise.defer();
    var loop = function() {
        // When the condition fails, settle the outer promise and stop looping
        if (!condition()) return resolver.resolve();
        return Promise.cast(action())
            .then(loop)
            .catch(resolver.reject);
    };
    loop();
    return resolver.promise;
};
sudo openvpn --config *.ovpn
apt-get update
apt-get install vim
wget http://d3kbcqa49mib13.cloudfront.net/spark-1.3.0-bin-hadoop2.4.tgz
tar zxf spark-1.3.0-bin-hadoop2.4.tgz
hadoop fs -mkdir /spark
hadoop fs -put spark-1.3.0-bin-hadoop2.4.tgz /spark
hadoop fs -du -h /spark
cp spark-env.sh.template spark-env.sh
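The copied `spark-env.sh` then needs cluster-specific settings. A minimal sketch, where the `HADOOP_CONF_DIR` path and memory value are placeholders (not from the gist) to adjust for your cluster:

```shell
# Append assumed settings to the freshly copied spark-env.sh
# (the `touch` only makes the snippet standalone; the `cp` step creates the file in a real setup)
touch spark-env.sh
cat >> spark-env.sh <<'EOF'
export HADOOP_CONF_DIR=/etc/hadoop/conf
export SPARK_EXECUTOR_MEMORY=1g
EOF
grep HADOOP_CONF_DIR spark-env.sh
```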
## Delete a remote branch
$ git push origin --delete <branch> # Git version 1.7.0 or newer
$ git push origin :<branch>         # Git versions older than 1.7.0

## Delete a local branch
$ git branch --delete <branch>
$ git branch -d <branch> # Shorter version
$ git branch -D <branch> # Force delete un-merged branches

## Delete a local remote-tracking branch
$ git branch --delete --remotes <remote>/<branch>
$ git branch -dr <remote>/<branch> # Shorter
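As a quick sanity check, the local-deletion commands can be exercised in a throwaway repository; this is just a sketch, and the branch name `feature` is arbitrary:

```shell
# Create a disposable repo, make a branch, then delete it with -d
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m init
git branch feature
git branch -d feature
git branch --list feature   # prints nothing once the branch is gone
```

`-d` succeeds here because `feature` points at a commit reachable from the current branch; an unmerged branch would need `-D`.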
> brew install hadoop

This installs Hadoop at /usr/local/Cellar/hadoop/2.7.3

Set JAVA_HOME in etc/hadoop/hadoop-env.sh:

> cd /usr/local/Cellar/hadoop/2.7.3
> vim etc/hadoop/hadoop-env.sh

JAVA_HOME should be set as below in that file:

export JAVA_HOME="$(/usr/libexec/java_home)"
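Instead of editing the file in vim, the same line can be appended from the shell. A sketch assuming the Homebrew layout above; the `mkdir -p` only exists so the snippet runs standalone:

```shell
# Append the JAVA_HOME export to hadoop-env.sh without opening an editor
HADOOP_ENV=etc/hadoop/hadoop-env.sh
mkdir -p "$(dirname "$HADOOP_ENV")"   # the directory already exists in a real install
echo 'export JAVA_HOME="$(/usr/libexec/java_home)"' >> "$HADOOP_ENV"
grep JAVA_HOME "$HADOOP_ENV"
```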
I recently had several days of extremely frustrating experiences with service workers. Here are a few things I've since learned that would have made my life much easier but which aren't particularly obvious from most of the blog posts and videos I've seen.
I'll add to this list over time – suggested additions welcome in the comments or via twitter.com/rich_harris.
Chrome 51 has some pretty wild behaviour related to console.log in service workers. Canary doesn't, and it has a load of really good service worker related stuff in devtools.
<template>
  <span id="time" v-html="time"></span>
</template>

<style>
</style>

<script>
module.exports = {
  // Minimal completion of the truncated component: `time` backs the v-html binding above
  data: function () {
    return { time: new Date().toLocaleTimeString() }
  }
}
</script>
import org.apache.spark.streaming.{Seconds, StreamingContext}

// "local[2]": at least two threads, one for the receiver and one for processing
val ssc = new StreamingContext("local[2]", "datastream", Seconds(15))
// create an InputDStream, e.g. from a TCP text source (host/port are placeholders)
val stream = ssc.socketTextStream("localhost", 9999)
// interact with the stream
stream.print()
ssc.start()
ssc.awaitTermination()