Sankar (sankars) · Dubai
@sankars
sankars / hbase_jruby_get.rb
Last active December 11, 2015 11:48
HBase Get command in JRuby, usable from the HBase shell
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.hbase.client.HTable
import org.apache.hadoop.hbase.client.Get

conf = HBaseConfiguration.new
tablename = "Customer"
table = HTable.new(conf, tablename)

# Fetch the row with row key '0001'
get = Get.new(Bytes.toBytes('0001'))
data = table.get(get)

# Read a single cell out of the Result; substitute your own
# column family and qualifier names for 'cf' and 'col':
# value = Bytes.toString(data.getValue(Bytes.toBytes('cf'), Bytes.toBytes('col')))
sankars / hbase_shell_filter.rb
Created February 22, 2013 07:52
Filter rows from the HBase shell using SingleColumnValueFilter.
import org.apache.hadoop.hbase.filter.CompareFilter
import org.apache.hadoop.hbase.filter.SingleColumnValueFilter
import org.apache.hadoop.hbase.filter.SubstringComparator
import org.apache.hadoop.hbase.util.Bytes

# Scan 't1', keeping only rows where family:qualifier contains 'somevalue'
scan 't1', { COLUMNS => 'family:qualifier',
  FILTER => SingleColumnValueFilter.new(
    Bytes.toBytes('family'),
    Bytes.toBytes('qualifier'),
    CompareFilter::CompareOp.valueOf('EQUAL'),
    SubstringComparator.new('somevalue')) }
sankars / gist:5064830
Last active December 14, 2015 09:28 — forked from jorisbontje/gist:5056544
Hive with Avro
0) Download the avro-tools jar file from avro.apache.org
1) Extract the Avro schema using avro-tools.jar
java -jar avro-tools*.jar getschema file.avro > file.avsc
2) Upload the Avro schema to HDFS
hadoop fs -put file.avsc /user/training/file.avsc
-- This is a Hive program. Hive is an SQL-like language that compiles
-- into Hadoop Map/Reduce jobs. It's very popular among analysts at
-- Facebook, because it allows them to query enormous Hadoop data
-- stores using a language much like SQL.
-- Our logs are stored on the Hadoop Distributed File System, in the
-- directory /logs/randomhacks.net/access. They're ordinary Apache
-- logs in *.gz format.
--
-- We want to pretend that these gzipped log files are a database table,
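The idea the comments describe can be sketched as follows. This is an illustrative assumption, not the gist's own DDL: the table and column names (`raw_access_log`, `line`) are invented here; only the log path comes from the comments. Hive decompresses `*.gz` text files transparently, so a one-column external table over the log directory is enough to query the files in place:

```shell
# Sketch only (table/column names are assumptions): expose the gzipped
# logs under /logs/randomhacks.net/access as a one-column external
# table, then query it like any other Hive table.
hive -e "
CREATE EXTERNAL TABLE IF NOT EXISTS raw_access_log (line STRING)
LOCATION '/logs/randomhacks.net/access';
SELECT COUNT(*) FROM raw_access_log;
"
```

Dropping the table later with `DROP TABLE raw_access_log` leaves the underlying log files untouched, since the table is external.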
sankars / liferay-hook.xml
Created March 12, 2013 06:04
Override Liferay Login
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE hook PUBLIC "-//Liferay//DTD Hook 6.1.0//EN" "http://www.liferay.com/dtd/liferay-hook_6_1_0.dtd">
<hook>
<portal-properties>portal.properties</portal-properties>
<struts-action>
<struts-action-path>/login/login</struts-action-path>
<struts-action-impl>com.test.CustomLoginAction</struts-action-impl>
</struts-action>
</hook>
package com.test;
import javax.portlet.ActionRequest;
import javax.portlet.ActionResponse;
import javax.portlet.PortletConfig;
import javax.portlet.RenderRequest;
import javax.portlet.RenderResponse;
import com.liferay.portal.kernel.struts.BaseStrutsPortletAction;
import com.liferay.portal.kernel.struts.StrutsPortletAction;
CREATE EXTERNAL TABLE GameDataAvro
ROW FORMAT SERDE 'com.linkedin.haivvreo.AvroSerDe'
-- haivvreo reads the column layout from the Avro schema, so no column
-- list is given; point schema.url at the .avsc uploaded to HDFS
WITH SERDEPROPERTIES ('schema.url'='hdfs:///user/training/file.avsc')
STORED AS INPUTFORMAT 'com.linkedin.haivvreo.AvroContainerInputFormat'
OUTPUTFORMAT 'com.linkedin.haivvreo.AvroContainerOutputFormat'
sankars / Zookeeper.sh
Created March 19, 2013 07:11
Zookeeper Testing
# Ask ZooKeeper whether it is running; a healthy server replies "imok"
echo ruok | nc localhost 2181
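`ruok` is one of ZooKeeper's four-letter diagnostic words. Assuming a ZooKeeper 3.4+ server listening on localhost:2181 (as above), two more words that are often useful when testing:

```shell
# Print server status: mode (leader/follower/standalone), latency,
# and connected clients
echo stat | nc localhost 2181

# Machine-readable monitoring output, one key-value pair per line
echo mntr | nc localhost 2181
```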
sankars / PigToHbase.pig
Created March 19, 2013 14:32
Pig script to fetch data from HBase
-- Load rows from the HBase table 'Customer'. 'D:N' maps column N of
-- family D; '-loadKey true' exposes the row key as the first field.
a = LOAD 'hbase://Customer' USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('D:N', '-loadKey true') AS (key:chararray, name:chararray);
b = ORDER a BY name PARALLEL 1;
DUMP b;
sankars / ComputeEnergyForCustomer.java
Last active April 2, 2019 10:06
MapReduce job to process data stored in MySQL.
/* hadoop jar mysqlmapr.jar org.jm.samples.ComputeEnergyForCustomer -libjars /path/mysql-connector-java-5.1.13-bin.jar */
package org.jm.samples;
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import java.sql.PreparedStatement;
import java.sql.ResultSet;