Saša Zejnilović (Zejnilovic)
Zejnilovic / switch_m2_settings.pl
Created May 3, 2019 05:48
Script to quickly switch between settings files. Written for Maven, but easily adaptable to other tools.
#!/usr/bin/perl
# Perl script to quickly switch between Maven settings. Applicable to other programs as well.
# Expects an original file such as settings.xml and reference files such as settings.xml.alfa1
# and settings.xml.alfa2, all located in the same $m2Folder. When run, it shows the user a list
# of postfixes of the original file to choose from (alfa1, alfa2), and the chosen file then
# overwrites the original settings.xml.
use warnings;
use File::Copy;
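The preview stops right after the imports. As an illustration of the idea only (not the gist's actual Perl code), a minimal Bash sketch of the same switcher could look like this, assuming the reference files sit next to settings.xml in ~/.m2:

#!/bin/bash
# Minimal sketch of the settings switcher described above; paths and prompts are assumptions.
m2_folder="$HOME/.m2"
echo "Available settings postfixes:"
for f in "$m2_folder"/settings.xml.*; do
  echo "  ${f##*.}"
done
read -r -p "Which postfix do you want to activate? " postfix
cp "$m2_folder/settings.xml.$postfix" "$m2_folder/settings.xml"
echo "settings.xml replaced with settings.xml.$postfix"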
Zejnilovic / csv_to_json.rb
Created May 3, 2019 06:08
Convert CSV to JSON
require 'json'
require 'csv'
# Transforms a CSV into JSON and saves it next to the CSV.
# The path to the CSV is accepted as an input argument.
csv_in = ARGV[0]
ARGV.clear
while true
break if !csv_in.nil? && File.exist?(csv_in)
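The preview ends inside the loop that keeps asking until a valid path is supplied. Assuming the script is saved as csv_to_json.rb, the intended invocation would look roughly like this (the file name is illustrative):

# Hypothetical usage; writes data.json next to data.csv.
ruby csv_to_json.rb data.csv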
Zejnilovic / buildHadoopNativeLibraries.sh
Created May 20, 2019 15:32
A script to build and use Hadoop Native libraries on Mac
#!/bin/bash
# !!CHANGE THIS!! to the Hadoop version you need
HVERSION=2.7.5
# !! AND LET IT RUN
brew install gcc autoconf automake libtool cmake snappy gzip bzip2 zlib openssl
cd ~
mkdir -p tmp
cd ~/tmp
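# A guess at how the script probably continues (this part is not in the preview): fetch the
# Hadoop source for $HVERSION, build the native profile, and copy the resulting libraries
# into $HADOOP_HOME. Assumes HADOOP_HOME is set and the usual native-build prerequisites
# (for example protoc 2.5.0 for Hadoop 2.x) are available.
curl -LO "https://archive.apache.org/dist/hadoop/common/hadoop-$HVERSION/hadoop-$HVERSION-src.tar.gz"
tar -xzf "hadoop-$HVERSION-src.tar.gz"
cd "hadoop-$HVERSION-src"
mvn package -Pdist,native -DskipTests -Dtar
cp -R "hadoop-dist/target/hadoop-$HVERSION/lib/native/." "$HADOOP_HOME/lib/native/"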
Zejnilovic / back_up_hdfs.sh
Created May 24, 2019 10:05
Back up HDFS to the local FS
#!/bin/bash
now=$(date +"%m_%d_%Y")
file=hadoop_backup_$now
cd ~
mkdir -p "$file"
hdfs dfs -ls / | grep "^[d-]" | awk '{print $8}' | while read -r line; do hdfs dfs -get "$line" "$file"; done
zip -r "$file.zip" "$file"
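Restoring is not part of the gist, but with an archive produced by the script above it would presumably boil down to something like this (the date in the name is illustrative):

# Hypothetical restore of a backup created by the script above.
unzip hadoop_backup_05_24_2019.zip
hdfs dfs -put hadoop_backup_05_24_2019/* /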
Zejnilovic / ssh_to_any_server.sh
Created July 26, 2019 11:13
Connect to one of the servers
#!/bin/bash
# Switch this for the list of addresses you want to ssh to
ADDRESS=(127.0.0.1 localhost)
# Switch this for the name of the user you are sshing as
NAME=$USER
for i in "${ADDRESS[@]}"
do
echo "Trying to connect as $NAME to $i"
Zejnilovic / decismal_38_18_transformer.js
Last active July 29, 2019 14:10
Smart JSON Editor transformer for Decimal(38,18)
// https://github.com/SmartJSONEditor/PublicDocuments/wiki/ValueTransformers
var ValueTransformer = function () {
  this.displayName = "Decimal(38,18)";
  this.shortDescription = "https://spark.apache.org/docs/2.4.0/api/java/org/apache/spark/sql/types/Decimal.html";
  this.transform = function (inputValue, jsonValue, arrayIndex, parameters, info) {
    var result = '';
    var characters = '0123456789';
    var charactersLength = characters.length;
    var precision = 38;
Zejnilovic / docker-compose.yaml
Last active February 6, 2020 13:35
Docker Compose file for menas
---
version: '3'
services:
  arangodb:
    image: arangodb:3.5.1
    environment:
      ARANGO_NO_AUTH: 1
    volumes:
      - /tmp/arangodb:/var/lib/arangodb3
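The compose file is cut off in the preview, but the visible part already brings up the ArangoDB dependency. The usual Docker Compose workflow applies:

# Start the services in the background and follow the ArangoDB logs.
docker-compose up -d
docker-compose logs -f arangodb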
Zejnilovic / stupid_hdfs_backup.sh
Created April 27, 2020 08:48
A simple HDFS backup for playing around locally
#!/bin/bash
now=$(date +"%Y_%m_%d")
file=hadoop_backup_$now
cd ~
mkdir -p "$file"
hdfs dfs -ls / | grep "^[d-]" | awk '{print $8}' | while read -r line; do hdfs dfs -get "$line" "$file"; done
zip -r "$file.zip" "$file"
Zejnilovic / hadoop_remove_old_tmp.sh
Created August 17, 2020 16:30
Remove old files from HDFS /tmp
#!/bin/bash
today=$(date +'%s')                      # today as a Unix timestamp
files=$(hdfs dfs -ls /tmp | tail -n +2)  # all entries in /tmp, without the summary header
granularity=$(( 24*60*60 ))              # granularity of time, currently set to days
olderThan=7                              # entries older than granularity * olderThan get deleted
echo "$files" | while read -r line; do
  dir_date=$(echo "${line}" | awk '{print $6}')
  # difference=$(( ( ${today} - $(date -j -u -f "%Y-%m-%d" "${dir_date}" +%s) ) / ${granularity} )) # macOS
  difference=$(( ( ${today} - $(date -d "${dir_date}" +%s) ) / ${granularity} ))                    # Linux
  filePath=$(echo "${line}" | awk '{print $8}')
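  # The preview ends here; a likely remainder of the loop (an assumption, not shown in the
  # gist preview) removes anything past the threshold.
  if [ "${difference}" -gt "${olderThan}" ]; then
    hdfs dfs -rm -r -skipTrash "${filePath}"
  fi
done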
Zejnilovic / SparkApp.scala
Created December 7, 2021 11:59
SPNEGO Auth in Scala / Java
import org.apache.spark.sql.SparkSession
import org.slf4j.LoggerFactory
import sun.security.krb5.internal.ktab.KeyTab
import org.springframework.security.kerberos.client.KerberosRestTemplate
object SparkApp {
  def main(args: Array[String]): Unit = {
    // The Spark session is not strictly needed here, but I wanted to test this as a Spark app
    val spark = SparkSession.builder.appName("KerberosTest Spark Job").getOrCreate()
    val logger = LoggerFactory.getLogger(this.getClass)
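The Scala preview stops before the KerberosRestTemplate call. A quick way to sanity-check the same SPNEGO-protected endpoint from a shell, assuming a keytab and principal analogous to the ones the app would use (all names below are placeholders), is:

# Obtain a Kerberos ticket from the keytab, then let curl do the SPNEGO negotiation.
kinit -kt ./user.keytab user@EXAMPLE.COM
curl --negotiate -u : "http://service.example.com:8080/api/ping"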