

Rajkumar Singh (rajkrrsingh)

rajkrrsingh / Create MapR-Db Table
Created August 9, 2015 03:42
Code snippet to create a MapR-DB table
private void createTable(final String tableName, final List<String> cfList)
        throws IOException {
    try {
        ugi.doAs(new PrivilegedExceptionAction<Void>() {
            public Void run() throws Exception {
                if (!admin.tableExists(tableName)) {
                    // tail of the preview reconstructed with the standard HBase Admin API:
                    // one column family per entry, then create the table
                    HTableDescriptor desc = new HTableDescriptor(TableName.valueOf(tableName));
                    for (String cf : cfList) desc.addFamily(new HColumnDescriptor(cf));
                    admin.createTable(desc);
                }
                return null;
            }
        });
    } catch (InterruptedException e) { throw new IOException(e); }
}
rajkrrsingh / gist:a86ff86f73e351bdf86c
Created November 23, 2015 16:56
Redis Server installation on CentOS
# tar xzf redis-3.0.2.tar.gz
# cd redis-3.0.2
# make
Run Redis Server
rajkrrsingh / HiveServer2 connection with MySQL DB as metastore over SSL
Last active July 18, 2016 06:07
quick guide to connect HS2 to MySQL DB metastore over SSL

Setting up MySQL SSL

# Create clean environment
shell> rm -rf newcerts
shell> mkdir newcerts && cd newcerts

# Create CA certificate
shell> openssl genrsa 2048 > ca-key.pem
shell> openssl req -new -x509 -nodes -days 3600 \
         -key ca-key.pem -out ca.pem
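The preview stops after the CA certificate. A sketch of the usual next steps, following the standard MySQL SSL setup (subject names are placeholder assumptions, and the CA is recreated here non-interactively so the block is self-contained):

```shell
# Recreate the CA without prompts, then create and sign the server certificate
mkdir -p newcerts && cd newcerts
openssl genrsa 2048 > ca-key.pem
openssl req -new -x509 -nodes -days 3600 -key ca-key.pem -out ca.pem \
        -subj "/CN=mysql-ca"
# Server certificate: request, strip the passphrase, sign with the CA
openssl req -newkey rsa:2048 -nodes -keyout server-key.pem \
        -out server-req.pem -subj "/CN=mysql-server"
openssl rsa -in server-key.pem -out server-key.pem
openssl x509 -req -in server-req.pem -days 3600 -CA ca.pem -CAkey ca-key.pem \
        -set_serial 01 -out server-cert.pem
# Sanity check: the server certificate must verify against the CA
openssl verify -CAfile ca.pem server-cert.pem
```

The same pattern (with `/CN=mysql-client`) produces the client certificate that HS2 presents to MySQL.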
rajkrrsingh / Reading parquet files using the parquet tools
Created August 10, 2015 22:04
Reading parquet files and inspecting their metadata with parquet-tools
// Build parquet-tools
git clone
cd parquet-mr/parquet-tools/
mvn clean package -Plocal
// Print the schema of a parquet file
java -jar parquet-tools-1.6.0rc3-SNAPSHOT.jar schema sample.parquet
// Print the contents of a parquet file
java -jar parquet-tools-1.6.0rc3-SNAPSHOT.jar cat sample.parquet
rajkrrsingh /
Created November 22, 2016 09:46
Java program to run a Sqoop command using the SSHXCUTE SSH framework
import net.neoremind.sshxcute.core.SSHExec;
import net.neoremind.sshxcute.core.ConnBean;
import net.neoremind.sshxcute.task.CustomTask;
import net.neoremind.sshxcute.task.impl.ExecCommand;
public class RunSqoopCommand {
    public static void main(String args[]) throws Exception {
        // tail of the preview reconstructed with the SSHXCUTE API;
        // host, credentials and the sqoop command are placeholders
        SSHExec ssh = SSHExec.getInstance(new ConnBean("hostname", "username", "password"));
        ssh.connect();
        CustomTask task = new ExecCommand("sqoop version");
        ssh.exec(task);
        ssh.disconnect();
    }
}
rajkrrsingh / SparkStreamingSampleApp
Created November 27, 2016 11:01
Spark Streaming sample program in Scala
mkdir spark-streaming-example
cd spark-streaming-example/
mkdir -p src/main/scala
cd src/main/scala
vim TestStreaming.scala
Add the following code to TestStreaming.scala:
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.streaming.StreamingContext._
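The preview stops at the imports; compiling the app with sbt also needs a build definition. A minimal `build.sbt` sketch (the version numbers are assumptions, align them with your cluster):

```scala
name := "spark-streaming-example"
version := "1.0"
scalaVersion := "2.10.5"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.1" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.6.1" % "provided"
)
```

`provided` keeps the Spark jars out of the assembly, since spark-submit supplies them at runtime.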
rajkrrsingh / Atlas_Rest_API
Last active November 27, 2016 14:53
Quick reference guide to the Atlas REST API
[root@rksnode ~]# curl http://rksnode:21000/api/atlas/admin/version
{"Version":"","Name":"apache-atlas","Description":"Metadata Management and Data Governance Platform over Hadoop"}[root@rksnode ~]#
[root@rksnode ~]# curl http://rksnode:21000/api/atlas/types
{"results":["DataSet","hive_order","Process","hive_table","hive_db","hive_process","hive_principal_type","hive_resource_type","hive_object_type","Infrastructure","hive_index","hive_column","hive_resourceuri","hive_storagedesc","hive_role","hive_partition","hive_serde","hive_type"],"count":18,"requestId":"qtp1286783232-60 - 0128be6a-076e-4ad3-972a-58783a1f7180"}[root@rksnode ~]#
[root@rksnode ~]# curl http://rksnode:21000/api/atlas/types/hive_process
{"typeName":"hive_process","definition":"{\n \"enumTypes\":[\n \n ],\n \"structTypes\":[\n \n ],\n \"traitTypes\":[\n \n ],\n \"classTypes\":[\n {\n \"superTypes\":[\n
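Atlas returns single-line JSON; piping the response through `python3 -m json.tool` makes it readable. A sketch (the payload here is a trimmed copy of the `/api/atlas/types` response above, so the block runs without a live Atlas server; on a cluster, pipe `curl -s http://rksnode:21000/api/atlas/types` instead of the echo):

```shell
# Pretty-print an Atlas REST response with Python's built-in JSON formatter
echo '{"results":["DataSet","hive_table","hive_db"],"count":3}' | python3 -m json.tool
```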
rajkrrsingh / Spark2StarterApp
Created November 30, 2016 17:48
Sample Spark 2 application in Scala
mkdir Spark2StarterApp
cd Spark2StarterApp/
mkdir -p src/main/scala
cd src/main/scala
vim Spark2Example.scala
import org.apache.spark.sql.SparkSession
object Spark2Example {
rajkrrsingh / Spark2DataSetDemo
Created November 30, 2016 18:08
Sample Spark 2 application demonstrating the Dataset API
[root@rkk1 Spark2StarterApp]# /usr/hdp/current/spark2-client/bin/spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
16/11/30 18:01:48 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
Spark context Web UI available at
Spark context available as 'sc' (master = local[*], app id = local-1480528906336).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
rajkrrsingh / Benchmark-results.txt
Created December 4, 2016 15:42 — forked from saurabhmishra/Benchmark-results.txt
Kafka Benchmark Commands
End-to-end Latency
0.0543 ms (median)
0.003125 ms (99th percentile)
5 ms (99.9th percentile)
Producer and consumer
Producer - 1431170.2 records/sec (136.49 MB/sec)
Consumer - 3276754.7021 records/sec (312.4957 MB/sec)