Kirill Pavlov pavlov99

@pavlov99
pavlov99 / configure-mac-os.sh
Last active May 10, 2017 09:47
Vim configure flags with Python 2 support
./configure \
--enable-perlinterp \
--enable-pythoninterp \
--enable-rubyinterp \
--enable-luainterp \
--enable-fail-if-missing \
--enable-cscope \
--enable-gui=auto \
--enable-gtk2-check \
--enable-gnome-check \
@pavlov99
pavlov99 / gist:222f379ca275a2b3f512
Created March 24, 2015 05:03
Jira: set remaining estimate to zero on issue close (needed for an hour-based burndown setup)
https://confluence.atlassian.com/display/JIRA/Logging+Work+on+an+Issue#LoggingWorkonanIssue-Loggingworkwhileresolvingorclosinganissue
https://confluence.atlassian.com/display/JIRAKB/Set+Remaining+Estimate+to+0+on+post+function
@pavlov99
pavlov99 / README.txt
Last active August 29, 2015 14:17
Countries with their timezones.
The list of currencies is generated using Python.
Dependencies
============
pip install pycountry pytz requests
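A minimal sketch of the generation step, using only pytz's built-in ISO-3166 tables (pycountry provides richer country metadata, and requests can fetch extra data; both are outside this sketch):

```python
import pytz

# pytz ships an ISO-3166 alpha-2 code -> [timezone, ...] mapping,
# plus a code -> country name mapping; join them by code.
country_timezones = {
    pytz.country_names[code]: list(zones)
    for code, zones in pytz.country_timezones.items()
}
print(country_timezones["Singapore"])
```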
@pavlov99
pavlov99 / deque.awk
Last active July 1, 2023 00:16
AWK data structures
function deque_init(d) {d["+"] = d["-"] = 0}
function deque_is_empty(d) {return d["+"] == d["-"]}
function deque_push_back(d, val) {d[d["+"]++] = val}
function deque_push_front(d, val) {d[--d["-"]] = val}
function deque_back(d) {return d[d["+"] - 1]}
function deque_front(d) {return d[d["-"]]}
# NULL is not defined in awk (it is just an empty variable); return "" explicitly,
# and declare i, x as extra parameters so they stay local to the function.
function deque_pop_back(d,   i, x) {if (deque_is_empty(d)) return ""; i = --d["+"]; x = d[i]; delete d[i]; return x}
function deque_pop_front(d,   i, x) {if (deque_is_empty(d)) return ""; i = d["-"]++; x = d[i]; delete d[i]; return x}
function deque_print(d){x="["; for (i=d["-"]; i<d["+"] - 1; i++) x = x d[i]", "; print x d[d["+"] - 1]"]; size: "d["+"] - d["-"] " [" d["-"] ", " d["+"] ")"}
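The deque can be exercised from the shell with a one-off awk program. A minimal sketch, with the relevant function bodies adapted from the gist:

```shell
# Push b, c to the back and a to the front, then pop the front element.
front=$(awk '
function deque_init(d) {d["+"] = d["-"] = 0}
function deque_push_back(d, val) {d[d["+"]++] = val}
function deque_push_front(d, val) {d[--d["-"]] = val}
function deque_pop_front(d,   i, x) {i = d["-"]++; x = d[i]; delete d[i]; return x}
BEGIN {
  deque_init(q)
  deque_push_back(q, "b")
  deque_push_back(q, "c")
  deque_push_front(q, "a")   # deque is now: a, b, c
  print deque_pop_front(q)
}')
echo "$front"   # a
```

Because awk arrays are passed by reference, all functions mutate the same array; the "+" and "-" keys track the back and front indices.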
@pavlov99
pavlov99 / mongo-dump-csv.sh
Last active September 7, 2018 07:23 — forked from mderazon/mongo-dump-csv.sh
Export all MongoDB collections as CSV without the need to specify fields
OIFS=$IFS;
IFS=",";
# fill in your details here
dbname=DBNAME
user=USERNAME
pass=PASSWORD
host=HOSTNAME:PORT
# first get all collections in the database
@pavlov99
pavlov99 / gist:a9af5871a1db578de16e
Created October 23, 2015 02:03
Create a user directory in Hadoop
The /user/ directory is owned by "hdfs" with 755 permissions, so only hdfs can write to it. Unlike Unix/Linux, the HDFS superuser is hdfs, not root. So you would need to do this:
sudo -u hdfs hadoop fs -mkdir /user/<username>
sudo -u hdfs hadoop fs -put myfile.txt /user/<username>/
If you want to create a home directory for root so it can store files there, do:
sudo -u hdfs hadoop fs -mkdir /user/root
sudo -u hdfs hadoop fs -chown root /user/root
@pavlov99
pavlov99 / Graph.scala
Created January 27, 2016 06:30
Graph BFS DFS
class Graph[T] {
  type Vertex = T
  type GraphMap = Map[Vertex, List[Vertex]]
  var g: GraphMap = Map()

  // Breadth-first search: returns vertices grouped by level (distance from start).
  def BFS(start: Vertex): List[List[Vertex]] = {
    def BFS0(elems: List[Vertex], visited: List[List[Vertex]]): List[List[Vertex]] = {
      val newNeighbors = elems.flatMap(g(_)).filterNot(visited.flatten.contains).distinct
      if (newNeighbors.isEmpty) visited
      else BFS0(newNeighbors, newNeighbors :: visited)
    }
    BFS0(List(start), List(List(start))).reverse
  }
}
@pavlov99
pavlov99 / bash-random-lines-test.sh
Created February 20, 2016 13:43
Sample random lines from a file in bash (benchmark)
#!/bin/bash
FILENAME="/tmp/random-lines.$$.tmp"
NUMLINES=10000000
seq -f 'line %.0f' $NUMLINES > $FILENAME;
echo "10 random lines with nl:"
$(which time) -v nl -ba $FILENAME | sort -R | sed 's/.*[0-9]\t//' | head > /dev/null
echo "10 random lines with shuf:"
$(which time) -v shuf $FILENAME -n10 | head > /dev/null
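Where shuf is unavailable (e.g. on stock macOS), the same sampling can be done in one pass with reservoir sampling in awk. A sketch, where k=3 and the seed are illustrative choices:

```shell
# Keep the first k lines, then replace a kept line with probability k/NR.
sample=$(seq 1 100 | awk -v k=3 'BEGIN{srand(42)}
  NR <= k { r[NR] = $0; next }
  { j = int(rand() * NR) + 1; if (j <= k) r[j] = $0 }
  END { for (i = 1; i <= k; i++) print r[i] }')
echo "$sample"
```

Reservoir sampling reads the input exactly once and needs O(k) memory, which matters for the 10M-line file generated above; shuf, by contrast, is typically much faster when it is available.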
@pavlov99
pavlov99 / gist:369492916e44ddb1de06
Created February 22, 2016 09:57
fix-spark-union.scala
// unionAll matches columns by position, not by name, so reorder df2's
// columns into df1's order before taking the union.
df1.unionAll(df2.select(df1.columns.map(df2(_)): _*))
@pavlov99
pavlov99 / 0-apache-spark-presentation.md
Last active May 13, 2016 03:38
Apache Spark in data science presentation

This gist consists of Spark presentation examples.