Aravind Yarram (yaravind)
Constraints Liberate. Liberties Constrain.
yaravind / Sports Leagues
Last active December 22, 2015 14:38
Sports Leagues version 1.0
= Models Sports Leagues
Aravind R. Yarram <yaravind@gmail.com>
v1.0, 08-Sep-2013
== Domain Model
Each *League* has multiple *Level*s, such as playoffs, quarter-finals, and so on. The levels are ordered: the first is playoffs, +NEXT+ is quarter-finals, +NEXT+ is semi-finals, and the last is the finals. The ordering is represented using a http://docs.neo4j.org/chunked/milestone/cookbook-linked-list.html[linked-list].
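The +NEXT+ ordering above can be sketched outside the graph as a minimal singly linked list (a Python illustration of the technique, not the Neo4j model itself):

```python
class Level:
    """A league level; `next` plays the role of the +NEXT+ relationship."""
    def __init__(self, name):
        self.name = name
        self.next = None

def link(names):
    """Build a linked list of levels in order and return the head."""
    head = prev = None
    for name in names:
        node = Level(name)
        if head is None:
            head = node
        else:
            prev.next = node
        prev = node
    return head

def ordered_names(head):
    """Walk the +NEXT+ chain from the head, collecting level names."""
    out = []
    while head is not None:
        out.append(head.name)
        head = head.next
    return out

levels = link(["playoffs", "quarter-finals", "semi-finals", "finals"])
```

Walking the chain from the head recovers the same order the list was built with.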
A *Player* can play for more than one team over multiple leagues, but can only play for a single team in a given league. This is captured by the +PLAYED_IN_FOR_LEAGUE+ http://docs.neo4j.org/chunked/milestone/cypher-cookbook-hyperedges.html[hyperedge] between player, team and league, using the http://docs.neo4j.org/chunked/milestone/cypher-cookbook-hyperedges.html[hypernode] *PlayerTeamLeague*. A team can register in a new league under a different name, in which case we want to know what it was +PREVIOUSLY_KNOWN_AS+. The fact that a player played for a given team (irrespective of which league) is captured
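The one-team-per-league constraint that the *PlayerTeamLeague* hypernode enforces can be sketched as a mapping keyed by (player, league) — a Python illustration with made-up player and team names, not actual gist data:

```python
# Each PlayerTeamLeague hypernode corresponds to one entry keyed by
# (player, league): a player maps to exactly one team within a league,
# but may map to different teams across leagues.
played_in_for_league = {}

def register(player, team, league):
    """Record a PLAYED_IN_FOR_LEAGUE fact, rejecting a second team in the same league."""
    key = (player, league)
    if key in played_in_for_league and played_in_for_league[key] != team:
        raise ValueError(
            f"{player} already plays for {played_in_for_league[key]} in {league}"
        )
    played_in_for_league[key] = team

register("Alice", "Hawks", "League A")
register("Alice", "Owls", "League B")  # same player, different league: allowed
```

Attempting to register the same player with a second team in League A raises an error, which is exactly the invariant the hypernode models.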
yaravind / Mahabharata Family Tree
Last active December 22, 2015 15:09
Mahabharata Family Tree
= Mahābhārata Family Tree
Aravind R. Yarram <yaravind@gmail.com>
v1.0, 08-Sep-2013
== Domain Model
Person-[:PARENT]->Person
Use a sunburst graph (http://stackoverflow.com/questions/8742950/sunburst-data-visualization-additional-ring) to show the Mahābhārata lineage.
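In a sunburst, each ring corresponds to a generation, so the data preparation step is computing a depth per person from the PARENT edges. A minimal sketch, using a few well-known relationships as sample data:

```python
# PARENT edges as (parent, child) pairs -- a small illustrative sample.
edges = [("Pandu", "Arjuna"), ("Pandu", "Bhima"), ("Arjuna", "Abhimanyu")]

def ring_index(edges):
    """Return a dict mapping each person to their sunburst ring (root = ring 0)."""
    parents = {p for p, _ in edges}
    children = {c for _, c in edges}
    kids = {}
    for p, c in edges:
        kids.setdefault(p, []).append(c)
    depth = {}

    def visit(person, d):
        # Keep the deepest generation seen if a person is reachable twice.
        depth[person] = max(d, depth.get(person, 0))
        for c in kids.get(person, []):
            visit(c, d + 1)

    for root in parents - children:  # roots have no parent in the edge list
        visit(root, 0)
    return depth
```

The resulting depths feed directly into a sunburst layout: ring 0 is the innermost circle, ring 1 the next ring out, and so on.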
yaravind / Product Association Recommender
Last active February 6, 2019 15:19
Product Association Recommender
= Product Association Recommender
Aravind R. Yarram <yaravind@gmail.com>
v1.0, 30-Sep-2013
== Domain Model
Association Rules: Support, Confidence and Lift
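The three metrics named above can be computed directly from transaction data. A minimal sketch over a toy basket set (the baskets are illustrative, not gist data):

```python
# Toy transaction data: each basket is a set of purchased items.
baskets = [
    {"milk", "bread"},
    {"milk", "bread", "butter"},
    {"bread", "butter"},
    {"milk"},
]

def support(itemset):
    """Fraction of baskets containing every item in the itemset."""
    return sum(itemset <= b for b in baskets) / len(baskets)

def confidence(antecedent, consequent):
    """P(consequent | antecedent): support of the union over support of the antecedent."""
    return support(antecedent | consequent) / support(antecedent)

def lift(antecedent, consequent):
    """Confidence normalized by the consequent's baseline support (>1 means positive association)."""
    return confidence(antecedent, consequent) / support(consequent)
```

For the rule milk → bread here: support({milk, bread}) = 2/4, confidence = (2/4)/(3/4) = 2/3, and lift = (2/3)/(3/4) = 8/9, i.e. slightly below 1.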
== Setup
yaravind / Product Catalog
Last active December 24, 2015 08:29
Product Catalog
= Product Catalog
Aravind R. Yarram <yaravind@gmail.com>
v1.0, 17-Sep-2013
== Domain
A product catalog is a collection of products, their categories, and their manufacturers, along with pricing information. Products can be sold separately, included in one or more catalogs, or used as substitute products.
You can perform the following operations on a product:

* Create
* Update
* Delete (not recommended; deactivate instead)
* Deactivate (deactivate products that are no longer available instead of deleting them, because they may be included in past orders, quotes, or opportunities)
* Search
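The deactivate-instead-of-delete rule above can be sketched as follows — a minimal Python illustration with hypothetical class and field names, not an actual catalog API:

```python
class Product:
    """Deactivation keeps the record so past orders and quotes still resolve."""
    def __init__(self, sku, name):
        self.sku = sku
        self.name = name
        self.active = True

    def deactivate(self):
        self.active = False

class Catalog:
    def __init__(self):
        self._products = {}

    def create(self, product):
        self._products[product.sku] = product

    def search(self, term, include_inactive=False):
        """Match by name; inactive products are hidden unless explicitly requested."""
        return [
            p for p in self._products.values()
            if term.lower() in p.name.lower()
            and (include_inactive or p.active)
        ]

catalog = Catalog()
widget = Product("SKU-1", "Widget")
catalog.create(widget)
widget.deactivate()
```

After deactivation the product no longer appears in default searches, but it is still in the catalog for historical lookups.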
yaravind / Legislative System
Last active January 1, 2016 15:59
Legislative System
= Legislative System
Aravind R. Yarram <yaravind@gmail.com>
v1.0, 30-Dec-2013
== About Me
Aravind R. Yarram <yaravind@gmail.com>
Sr. Principal Innovation - Research and Development, Equifax Inc.
yaravind / first.go
Last active December 7, 2016 19:56
First Go Package & Command
package main

import (
	"fmt"

	"github.com/yaravind/example"
)

func main() {
	fmt.Println("Example - Hello World")
	// Use the example package so the import compiles;
	// NameAndAge is the function exercised in first_test.go below.
	name, age := example.NameAndAge()
	fmt.Println(name, age)
}
yaravind / first_test.go
Created December 7, 2016 20:09
First Go Test
package example

import "testing"

func TestNameAndAge(t *testing.T) {
	n, a := NameAndAge()
	if n != "Esha" || a != 7 {
		t.Errorf("Expected 'Esha' and '7' but got %s and %d", n, a)
	}
}
Apache Spark CLI
Usage:
spark-cli apps [--completed | --running]
spark-cli app <app-id>
Options:
-h, --help show help
-v, --version show version
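The usage spec above could be wired up with argparse along these lines — a sketch only, since spark-cli itself is just the design in this gist (the version string is a placeholder):

```python
import argparse

def build_parser():
    """Mirror the spark-cli usage: 'apps [--completed | --running]' and 'app <app-id>'."""
    parser = argparse.ArgumentParser(prog="spark-cli")
    parser.add_argument("-v", "--version", action="version",
                        version="spark-cli 0.1")  # placeholder version
    sub = parser.add_subparsers(dest="command", required=True)

    apps = sub.add_parser("apps", help="list applications")
    group = apps.add_mutually_exclusive_group()
    group.add_argument("--completed", action="store_true")
    group.add_argument("--running", action="store_true")

    app = sub.add_parser("app", help="show one application")
    app.add_argument("app_id")
    return parser

args = build_parser().parse_args(["apps", "--running"])
```

The mutually exclusive group rejects `--completed --running` together, matching the `[--completed | --running]` notation in the usage line.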
# - Prints all options available to start the worker
%SPARK_HOME%\bin\spark-class.cmd org.apache.spark.deploy.worker.Worker --help
# - Prints all options available to start the master
%SPARK_HOME%\bin\spark-class.cmd org.apache.spark.deploy.master.Master --help
# - Start master - Web UI on http://localhost:8080
%SPARK_HOME%\bin\spark-class.cmd org.apache.spark.deploy.master.Master
# - Start worker with 1 core and 64 MB RAM - Web UI on http://localhost:8081
# (the master URL below assumes the master runs locally on the default port 7077)
%SPARK_HOME%\bin\spark-class.cmd org.apache.spark.deploy.worker.Worker --cores 1 --memory 64M spark://localhost:7077
TITLE Launcher - Spark Master, 1 Worker and History Server
set SPARK_HOME=C:\aravind\sw\spark-2.0.2-bin-hadoop2.7
:: - Find the IP Address and set it to IP_ADDR env var and reuse it while launching Worker
for /f "tokens=1-2 delims=:" %%a in ('ipconfig^|find "IPv4"') do set ip=%%b
set IP_ADDR=%ip:~1%
echo %IP_ADDR%
:: - Start master
START "Spark Master" %SPARK_HOME%\bin\spark-class.cmd org.apache.spark.deploy.master.Master