mvn install:install-file -Dfile=/Users/abhishek/code/gerrit/sthiraank4j/target/sthiraank4j-1.0-SNAPSHOT.jar -DgroupId=com.helpshift.sthiraank -DartifactId=sthiraank4j -Dversion=1.0-SNAPSHOT -Dpackaging=jar
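Once the jar is installed into the local repository, a project can depend on it via its pom.xml. A minimal sketch, using the coordinates from the command above:

```xml
<dependency>
  <groupId>com.helpshift.sthiraank</groupId>
  <artifactId>sthiraank4j</artifactId>
  <version>1.0-SNAPSHOT</version>
</dependency>
```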
Install virtualenv using pip.
$ pip install --user virtualenv
Create a virtual environment and name it.
$ virtualenv ~/cli-ve
Alternatively, you can use the -p option to specify a version of Python other than the default.
$ virtualenv -p /usr/bin/python3.4 ~/cli-ve
Activate your new virtual environment.
$ source ~/cli-ve/bin/activate
Do Hive ACID tables (Hive 1.2) have the capability of being read into Apache Pig using HCatLoader (or other means), or into Spark using SQLContext (or other means)?
For Spark, it seems it is only possible to read ACID tables if the table is fully compacted, i.e. no delta folders exist in any partition. Details in the following JIRA:
https://issues.apache.org/jira/browse/SPARK-15348
However, I wanted to know whether reading Hive ACID tables is supported at all in Apache Pig.
# GC logging options
time java -Xms4G -Xmx4G -XX:+UseParallelOldGC -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps AL
time java -Xms4G -Xmx4G -XX:+UseConcMarkSweepGC -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps AL
time java -Xms4G -Xmx4G -XX:+UseG1GC -XX:+PrintGC -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps AL
java -Xmx100m -XX:NewSize=10m -XX:+PrintGCDetails -XX:+HeapDumpBeforeFullGC -XX:+PrintHeapAtGC AL
MOBY_ENV=staging storm jar bitter-0.44.0-standalone.jar bitter.topologies.core predict staging -c nimbus.host=10.0.3.140 -c nimbus.port=6627
$ lein pom
$ mvn dependency:tree -Dverbose=true
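With the verbose flag, dependency:tree marks duplicate artifacts that Maven dropped with the phrase "omitted for conflict with", so the conflicting versions can be filtered out with grep. A sketch, with a sample verbose-output line piped in for illustration (the artifact and versions are made up):

```shell
# Filter the verbose tree for conflict markers; the echo stands in for mvn output
echo '[INFO] |  \- (org.clojure:clojure:jar:1.8.0:compile - omitted for conflict with 1.9.0)' \
  | grep 'omitted for conflict'
```

In a real project this would be `mvn dependency:tree -Dverbose=true | grep 'omitted for conflict'`.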
Bitter steps:
1. Rsync
2. Change ownership
3. hbase-site.xml copy from backup
4. local.clj copy from backup
create_namespace 'dataplat_analytics'
create 'dataplat_analytics:agent_issue_events', {NAME => 'raw', VERSIONS => 1, COMPRESSION => 'SNAPPY'}, {REGION_REPLICATION => 2}, METADATA => {'table-version' => 1}
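The four numbered steps above might be scripted roughly as follows. This is only a sketch: the host, user, and backup paths are hypothetical, not from the original notes, and the `run` wrapper just prints each command (a dry run) instead of executing it.

```shell
# Dry-run sketch of the "Bitter steps"; replace run() with direct execution for real use
run() { echo "+ $*"; }

run rsync -avz bitter-0.44.0-standalone.jar deploy@storm-host:/opt/bitter/  # 1. Rsync
run ssh deploy@storm-host chown -R storm:storm /opt/bitter                  # 2. Change ownership
run cp /backup/hbase-site.xml /opt/bitter/conf/                             # 3. hbase-site.xml copy from backup
run cp /backup/local.clj /opt/bitter/conf/                                  # 4. local.clj copy from backup
```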
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.HTable
import org.apache.hadoop.hbase.client.Scan
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.hbase.client.ResultScanner
import org.apache.hadoop.hbase.client.Result
import java.util.ArrayList

# Scan all rows of the given table and return them as a list (JRuby / HBase shell script)
def get_result(table)
  htable = HTable.new(HBaseConfiguration.new, table)
  scanner = htable.getScanner(Scan.new)
  results = ArrayList.new
  scanner.each { |r| results.add(r) }
  scanner.close
  results
end
# Form post
curl -X POST -H "Content-Type: application/x-www-form-urlencoded" -d "name=Abhishek&email=abhilater@gmail.com&comments=Hi bhai" http://ayaneshuadvisors.com/php/contact.php |
Average of a column:
cat joom_12.txt | cut -d',' -f3 | cut -d')' -f1 | awk '{ sum += $1; n++ } END { if (n > 0) print sum / n }'
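The averaging part of the pipeline on its own, run over self-contained sample numbers instead of joom_12.txt (the `cut` stages above are specific to that file's format):

```shell
# Average the first field of the input lines with awk
printf '10\n20\n30\n40\n' | awk '{ sum += $1; n++ } END { if (n > 0) print sum / n }'
# prints 25
```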