
@shivaram
shivaram / dataframe_example.R
Created June 2, 2015 23:54
DataFrame example in SparkR
# Download Spark 1.4 from http://spark.apache.org/downloads.html
#
# Download the nyc flights dataset as a CSV from https://s3-us-west-2.amazonaws.com/sparkr-data/nycflights13.csv
# Launch SparkR using
# ./bin/sparkR --packages com.databricks:spark-csv_2.10:1.0.3
# The Spark SQL context should already be created for you as sqlContext
sqlContext
# Java ref type org.apache.spark.sql.SQLContext id 1
shivaram / cdh5-SparkR.md
Last active May 17, 2017 09:09
Installing SparkR + CDH5 on EC2

On the master node

wget http://archive.cloudera.com/cdh5/one-click-install/redhat/6/x86_64/cloudera-cdh-5-0.x86_64.rpm
sudo yum --nogpgcheck localinstall cloudera-cdh-5-0.x86_64.rpm
sudo yum clean all
sudo yum install hadoop-hdfs-namenode
sudo yum install R git
sudo yum install spark-core spark-master spark-python

cd
shivaram / rstudo-sparkr.R
Created July 15, 2015 18:17
RStudio local setup
Sys.setenv(SPARK_HOME="/Users/shivaram/spark-1.4.1")
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
library(SparkR)
sc <- sparkR.init(master="local")
sqlContext <- sparkRSQL.init(sc)
df <- createDataFrame(sqlContext, faithful)
# Select one column
head(select(df, df$eruptions))

Keybase proof

I hereby claim:

  • I am shivaram on github.
  • I am shivaram (https://keybase.io/shivaram) on keybase.
  • I have a public key ASAcRXjNGcE3Wivd9PLqf-4EpoDcjMUuhxSEANR88silxQo

To claim this, I am signing this object:

shivaram / sbt-test.txt
Created July 8, 2013 17:55
output from sbt test
[info] ReplSuite:
[info] - simple foreach with accumulator
[info] - external vars
[warn] /home/shivaram/projects/spark/core/src/test/scala/spark/FileServerSuite.scala:49: method toURL in class File is deprecated: see corresponding Javadoc for more information.
[warn] val partitionSumsWithSplit = nums.mapPartitionsWithSplit {
[warn] ^
[info] - external classes
[info] - external functions
[warn] Note: /home/shivaram/projects/spark/streaming/src/test/java/spark/streaming/JavaAPISuite.java uses unchecked or unsafe operations.
[warn] Note: Recompile with -Xlint:unchecked for details.
shivaram / fair-share.json
Created March 9, 2013 09:20
JSON example
{
  "pools": {
    "sample_pool": {
      "minMaps": 5,
      "maxMaps": 25,
      "maxReduces": 25,
      "minSharePreemptionTimeout": 300
    }
  },
  "fairSharePreemptionTimeout": 600,
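The JSON above is cut off by the gist preview. As a quick sanity check, a complete object of the same shape can be validated with Python's built-in `json.tool` module (a sketch, not part of the gist; it assumes `python3` is on the PATH, and the closing braces are just the natural completion of the structure shown):

```shell
# Pipe a complete fair-share config through json.tool; "valid" is printed
# only if the document parses as JSON.
cat <<'EOF' | python3 -m json.tool >/dev/null && echo valid
{
  "pools": {
    "sample_pool": {
      "minMaps": 5,
      "maxMaps": 25,
      "maxReduces": 25,
      "minSharePreemptionTimeout": 300
    }
  },
  "fairSharePreemptionTimeout": 600
}
EOF
```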
shivaram / fair-share.yaml
Created March 9, 2013 09:19
YAML example
# Fair Scheduler Pools
pools:
  sample_pool:
    minMaps: 5
    maxMaps: 25
    maxReduces: 25
    minSharePreemptionTimeout: 300
# User limits
users:
shivaram / sparkr-demo
Created July 15, 2015 18:17
SparkR 1.4.1 Demo
# If you are using Spark 1.4, launch SparkR with
#
#   ./bin/sparkR --packages com.databricks:spark-csv_2.10:1.0.3
#
# because the `sparkPackages=` argument was only added in Spark 1.4.1.
# The call below works in Spark 1.4.1.
sc <- sparkR.init(spark_link, sparkPackages = "com.databricks:spark-csv_2.10:1.0.3")
sqlContext <- sparkRSQL.init(sc)
flights <- read.df(sqlContext, "s3n://sparkr-data/nycflights13.csv", "com.databricks.spark.csv", header = "true")
shivaram / map-ec2-hosts.sh
Created October 13, 2012 22:46
Map internal to external hostnames on ec2 while using Mesos + Spark
#!/bin/bash
for i in `cat /root/mesos-ec2/slaves`;
do
ssh $i 'echo -n `hostname`" "; nslookup `hostname` | grep Address | grep 10';
done | awk '{print $1" "$3}' > /tmp/internal
for i in `cat /root/mesos-ec2/slaves`;
do
ssh $i "echo -n $i' '; nslookup $i | grep Address | grep 10";
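The `awk '{print $1" "$3}'` step in the first loop relies on each combined line looking like `<hostname> Address: <ip>`, so fields 1 and 3 are the hostname/IP pair. A minimal sketch with a made-up sample line (the hostname and address are illustrative, not from the gist):

```shell
# Each loop iteration emits: "<internal-hostname> Address: <internal-ip>".
# awk keeps fields 1 and 3, dropping the "Address:" label in between.
sample='ip-10-0-0-1.ec2.internal Address: 10.0.0.1'
echo "$sample" | awk '{print $1" "$3}'
# prints: ip-10-0-0-1.ec2.internal 10.0.0.1
```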
shivaram / merge_spark_pr.sh
Created April 24, 2015 00:36
Prompt for jira password
#!/bin/bash
FWDIR="$(cd "`dirname "$0"`"; pwd)"
pushd "$FWDIR" >/dev/null
echo -n "JIRA Password:";
read -s password
echo ""
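The `FWDIR="$(cd "`dirname "$0"`"; pwd)"` line resolves the directory containing the script, no matter where it is invoked from. The same idiom can be sketched on its own (the `resolve_dir` helper name is mine, not from the script):

```shell
# Same cd + pwd trick as FWDIR, wrapped in a subshell so the caller's
# working directory is left untouched.
resolve_dir() { (cd "$(dirname "$1")" && pwd); }
resolve_dir /usr/bin/env
# prints /usr/bin on a typical Linux layout
```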