Aravind Yarram (yaravind)
💭 Constraints Liberate. Liberties Constrain.
yaravind / spark-rest-submit.sh
Last active October 30, 2020 02:43
Submit apps (SparkPi as an example) to a Spark cluster using the REST API
curl -X POST http://master-host:6066/v1/submissions/create --header "Content-Type: application/json" --data '{
  "action": "CreateSubmissionRequest",
  "appResource": "hdfs://localhost:9000/user/spark-examples_2.11-2.0.0.jar",
  "clientSparkVersion": "2.0.0",
  "appArgs": [ "10" ],
  "environmentVariables": {
    "SPARK_ENV_LOADED": "1"
  },
  "mainClass": "org.apache.spark.examples.SparkPi",
  "sparkProperties": {
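The snippet above is cut off at `sparkProperties`. For reference, a complete `CreateSubmissionRequest` payload might look like the following sketch; the property values shown are illustrative assumptions, not taken from the gist:

```json
{
  "action": "CreateSubmissionRequest",
  "appResource": "hdfs://localhost:9000/user/spark-examples_2.11-2.0.0.jar",
  "clientSparkVersion": "2.0.0",
  "mainClass": "org.apache.spark.examples.SparkPi",
  "appArgs": [ "10" ],
  "environmentVariables": { "SPARK_ENV_LOADED": "1" },
  "sparkProperties": {
    "spark.app.name": "SparkPi",
    "spark.master": "spark://master-host:7077",
    "spark.jars": "hdfs://localhost:9000/user/spark-examples_2.11-2.0.0.jar",
    "spark.submit.deployMode": "cluster"
  }
}
```

On success, the REST server responds with a `submissionId` that can be polled at `/v1/submissions/status/<submissionId>`.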
yaravind / KMeansSparkMLToMLLib.scala
Last active July 3, 2020 23:16
SparkML to MLlib conversion to run BisectingKMeans clustering
import org.apache.spark.mllib.clustering.BisectingKMeans
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.linalg.Vector
// std_features column is of type Vector
scaledFeatures.select($"std_features").printSchema()
val tempFeatureRdd = scaledFeatures.select($"std_features").rdd
import scala.reflect.runtime.universe._
from pyspark.sql import SparkSession
from pyspark.sql.functions import *
from pyspark.sql import Row
from pyspark.sql.types import IntegerType

# Create the Spark session
spark = SparkSession.builder \
    .master("local") \
    .config("spark.sql.autoBroadcastJoinThreshold", -1) \
    .config("spark.executor.memory", "500mb") \
    .getOrCreate()
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.PipelineStage
import org.apache.spark.ml.Transformer
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.feature.LabeledPoint
import org.apache.spark.ml.linalg.DenseVector
import org.apache.spark.ml.linalg.Vectors
import org.apache.spark.ml.param.ParamMap
import org.apache.spark.sql.Dataset
import org.apache.spark.sql.Row
yaravind / WikiPageClustering.java
Created April 28, 2020 18:04 — forked from Jeffwan/WikiPageClustering.java
Machine Learning Pipeline
package com.diorsding.spark.ml;
import java.util.Arrays;
import java.util.List;
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.ml.Pipeline;
import org.apache.spark.ml.PipelineModel;
import org.apache.spark.ml.PipelineStage;
import java.util.Arrays;
import java.util.List;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.MapFunction;
import org.apache.spark.ml.Pipeline;
import org.apache.spark.ml.PipelineModel;
import org.apache.spark.ml.PipelineStage;
yaravind / DataFrameWithFileName.scala
Created April 15, 2020 03:22 — forked from satendrakumar/DataFrameWithFileName.scala
Add file name as Spark DataFrame column
import org.apache.spark.sql.functions._
import org.apache.spark.sql.SparkSession
object DataFrameWithFileNameApp extends App {

  val spark: SparkSession =
    SparkSession
      .builder()
      .appName("DataFrameApp")
      .config("spark.master", "local[*]")
      .getOrCreate()
yaravind / spark-shell-init-load-file
Last active April 10, 2020 01:38
Init file to load Spark imports, etc. You can run it with the :load macro in the shell.
:paste
import org.apache.spark.sql.types._
import com.databricks.spark.xml._
import org.apache.spark.sql.functions._
// For implicit conversions like converting RDDs to DataFrames
import spark.implicits._
# Root logger option
log4j.rootLogger=INFO, stdout
# Redirect log messages to console
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{ISO8601} %-5p %t %c:%L - %m%n
log4j.logger.com.ncr.eda=INFO
yaravind / Product Association Recommender
Last active February 6, 2019 15:19
Product Association Recommender
= Product Association Recommender
Aravind R. Yarram <yaravind@gmail.com>
v1.0, 30-Sep-2013
== Domain Model
Association Rules: Support, Confidence and Lift
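The three association-rule metrics named above have standard definitions: support(A) is the fraction of transactions containing itemset A; confidence(A → B) = support(A ∪ B) / support(A); lift(A → B) = confidence(A → B) / support(B), where a value above 1 suggests a positive association. A minimal pure-Python sketch (the transaction data is made up for illustration, not from this document):

```python
# Hypothetical transaction data for illustration only.
transactions = [
    {"bread", "milk"},
    {"bread", "diapers", "beer"},
    {"milk", "diapers", "beer"},
    {"bread", "milk", "diapers"},
    {"bread", "milk", "beer"},
]

def support(itemset, transactions):
    """Fraction of transactions containing every item in `itemset`."""
    itemset = set(itemset)
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    """confidence(A -> B) = support(A | B) / support(A)."""
    joint = set(antecedent) | set(consequent)
    return support(joint, transactions) / support(antecedent, transactions)

def lift(antecedent, consequent, transactions):
    """lift(A -> B) = confidence(A -> B) / support(B)."""
    return confidence(antecedent, consequent, transactions) / support(consequent, transactions)

print(support({"bread", "milk"}, transactions))       # 0.6
print(confidence({"bread"}, {"milk"}, transactions))  # 0.75
print(lift({"bread"}, {"milk"}, transactions))        # 0.9375
```

Here lift is below 1, meaning bread and milk co-occur slightly less often than independence would predict.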
== Setup