Sam Bessalah (samklr)
70-ec2-nvme-devices.rules
# Copyright (C) 2006-2016 Amazon.com, Inc. or its affiliates.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License").
# You may not use this file except in compliance with the License.
# A copy of the License is located at
#
# http://aws.amazon.com/apache2.0/
#
# or in the "license" file accompanying this file. This file is
tableau_linux.tf
resource "aws_instance" "int_tableau_linux" {
  key_name                    = "${var.key_name}"
  ami                         = "${data.aws_ami.int_tableau_linux.id}"
  instance_type               = "m5.4xlarge"
  iam_instance_profile        = "${aws_iam_instance_profile.int_tableau.id}"
  vpc_security_group_ids      = ["${aws_security_group.sgrp.id}"]
  associate_public_ip_address = false
  subnet_id                   = "${aws_subnet.subnet.id}"
  private_ip                  = "${var.dq_internal_dashboard_linux_instance_ip}"
}
kinit.sh
kinit -t /etc/security/keytabs/c******-*****.keytab -k ****-info**t@NX.***.somewhere
klist
spark_joins.scala
import spark.implicits._
case class Row(id: Int, value: String)
val r1 = Seq(Row(1, "A1"), Row(2, "A2"), Row(3, "A3"), Row(4, "A4")).toDS()
val r2 = Seq(Row(3, "A3"), Row(4, "A4"), Row(4, "A4_1"), Row(5, "A5"), Row(6, "A6")).toDS()
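The snippet ends before any join is performed; given the file name, a minimal sketch of the joins these two datasets could demonstrate, assuming the active SparkSession implied by `spark.implicits._`:

```scala
// Inner join on id: only ids present on both sides (3 and 4) survive;
// id 4 pairs twice because it matches two right-hand rows.
r1.joinWith(r2, r1("id") === r2("id"), "inner").show()

// Left outer join: every left row, paired with null where r2 has no match.
r1.joinWith(r2, r1("id") === r2("id"), "left_outer").show()

// Left anti join: left rows with no match on the right (ids 1 and 2).
r1.join(r2, Seq("id"), "left_anti").show()
```

`joinWith` keeps both sides as typed pairs rather than flattening columns, which avoids the ambiguous duplicate `value` column a plain `join` would produce here.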
samklr / clean-slate.sh
Created Dec 7, 2018 — forked from noisesmith/clean-slate.sh
clean some things from zk, kafka, mongo
#!/bin/sh
# vars
## EDITOR/VISUAL - what process to use to pick targets interactively
## ZK_WL - regex for zookeeper paths not to remove
## KAFKA_WL - regex for kafka topics not to remove
## MONGO_WL - regex for mongo item ids not to remove
# set -x
keybase.md

Keybase proof

I hereby claim:

  • I am samklr on github.
  • I am samklr_ (https://keybase.io/samklr_) on keybase.
  • I have a public key ASAJAlW3njCb2s4F77DE8jY37PhD4uZVvuKUs6x71A15PAo

To claim this, I am signing this object:

samklr / SparkTaskListener.scala
Created Nov 19, 2018
Listener to collect Spark execution information.
import org.apache.spark.executor.TaskMetrics
import org.apache.spark.scheduler._
import scala.collection.mutable

// Collects per-task metrics and per-stage info as the job runs.
class ValidationListener extends SparkListener {
  private val taskInfoMetrics = mutable.Buffer[(TaskInfo, TaskMetrics)]()
  private val stageMetrics = mutable.Buffer[StageInfo]()

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit =
    taskInfoMetrics += ((taskEnd.taskInfo, taskEnd.taskMetrics))

  // (gist truncated here; a stage-completion hook presumably fills stageMetrics)
}
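A hedged usage sketch: the listener is registered on the SparkContext before the job runs, so its buffers fill as tasks finish (`spark`, an active SparkSession, is an assumption not shown in the gist):

```scala
// Attach the listener before triggering any actions.
val listener = new ValidationListener()
spark.sparkContext.addSparkListener(listener)

// Run the job; the listener accumulates TaskInfo/TaskMetrics pairs
// that can be inspected afterwards for validation or profiling.
```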
samklr / spark-hive.scala
Created Nov 15, 2018
Integrate Spark with Hive
import org.apache.spark.sql.{SaveMode, SparkSession}

case class HelloWorld(message: String)

def main(args: Array[String]): Unit = {
  // Create a SparkSession with Hive support enabled.
  // (`params` is defined in a part of the gist that was cut off.)
  val sparkSession = SparkSession.builder()
    .appName("example-spark-scala-read-and-write-from-hive")
    .config("hive.metastore.warehouse.dir", params.hiveHost + "user/hive/warehouse")
    .enableHiveSupport()
    .getOrCreate()
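The `SaveMode` import is never used in the visible part of the snippet; a hedged sketch of how the session might then write to and read back from Hive (the table name `helloworld` is an assumption, and the session is assumed materialized with `.getOrCreate()`):

```scala
import sparkSession.implicits._

// Write a small Dataset to a Hive-managed table, replacing any prior contents.
val ds = Seq(HelloWorld("hello"), HelloWorld("world")).toDS()
ds.write.mode(SaveMode.Overwrite).saveAsTable("helloworld")

// Read it back through the Hive metastore.
sparkSession.sql("SELECT message FROM helloworld").show()
```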