Piotr Wasiewicz (pwasiewi)
@crnisamuraj
crnisamuraj / picom-extended.conf
Last active December 22, 2023 16:48
Picom config file for KDE Plasma + Kwin + Picom
### Fading
fading = true;
fade-in-step = 0.06;
fade-out-step = 0.06;
fade-delta = 3;
fade-exclude = [
"class_g = 'yakuake'"
]
### Opacity
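As a rough sketch of how these numbers interact (assuming picom's documented semantics, where opacity changes by `fade-in-step` every `fade-delta` milliseconds), the total fade time can be estimated like this:

```python
import math

def fade_duration_ms(step: float, delta_ms: int) -> int:
    """Approximate total fade time: the number of opacity steps
    needed to go from 0.0 to 1.0, times the delay between steps."""
    steps = math.ceil(1.0 / step)
    return steps * delta_ms

print(fade_duration_ms(0.06, 3))  # the values above give a ~51 ms fade
```

So the config above produces a very quick fade; windows matching `class_g = 'yakuake'` skip fading entirely via `fade-exclude`.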
@primaryobjects
primaryobjects / tplink-archer-t4u.md
Last active April 28, 2024 18:40
Steps to Install the TP-Link Archer T4U Plus AC1300 USB WiFi Adapter on Linux Mint
@pwasiewi
pwasiewi / Config.scala
Created November 8, 2020 11:32 — forked from manku-timma/Config.scala
Scala program to print all properties of a hadoop config
import org.apache.hadoop.conf.Configured
import org.apache.hadoop.util.Tool
import org.apache.hadoop.util.ToolRunner
import scala.collection.JavaConversions._

object Config extends Configured with Tool {
  def run(args: Array[String]): Int = {
    // Print every configuration entry as (key, value), sorted by key
    getConf.map(x => (x.getKey, x.getValue)).toList.sorted.foreach(println)
    0
  }

  // Entry point: ToolRunner parses generic Hadoop options, then calls run()
  def main(args: Array[String]): Unit = System.exit(ToolRunner.run(Config, args))
}
@clrcrl
clrcrl / README.md
Last active September 3, 2023 07:42
dbt Workout — Using Jinja + SQL

Question on Slack:

Hello everyone! I have a dbt Jinja question! I have defined a list and I'm looping over it.

{% set my_list = ["$apple",
                  "$avocado",
                  "tomato"] %}

I now want to loop over this list, retrieve the data from each column (each item is a column name), and remove the dollar sign $ from the column name. I was trying to do it with the code below:
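One way to answer this, as a sketch in dbt's Jinja-SQL (the list is the one from the question; `my_table` and the double-quoted identifier style are assumptions that depend on the warehouse's quoting rules):

```sql
{% set my_list = ["$apple", "$avocado", "tomato"] %}

select
{%- for col in my_list %}
    "{{ col }}" as {{ col | replace("$", "") }}{{ "," if not loop.last }}
{%- endfor %}
from {{ ref('my_table') }}
```

The key piece is Jinja's built-in `replace` filter, which strips the `$` from the alias while the quoted original name still selects the source column.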

@TakuroFukamizu
TakuroFukamizu / m5atom_sesame_controller.ino
Created April 3, 2020 00:07
Sesame smartlock controller application run on M5Atom Matrix.
#include "M5Atom.h"
#include <WiFi.h>
#include <WiFiMulti.h>
#include <HTTPClient.h>
#include <ArduinoJson.h>
#define WIFI_SSID "YOUR WIFI SSID"
#define WIFI_PASS "YOUR WIFI PASS"
#define SESAME_TOKEN "YOUR API KEY"
#define SESAME_DEVICE_ID "YOUR SESAME ID"
@semihsezer
semihsezer / nested_dag.py
Last active February 3, 2022 01:06
Creating Dynamic Nested Subdags in Airflow (subdags within subdags)
from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from airflow.operators.subdag_operator import SubDagOperator
from datetime import datetime, timedelta
default_args = {
'owner': 'Airflow',
'start_date': datetime(2020, 3, 10)
}
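A minimal sketch of the pattern the title describes, continuing from the imports and `default_args` above (Airflow 1.x API; names like `create_subdag` and the tasks inside are illustrative, not from the gist):

```python
def create_subdag(parent_dag_name, child_dag_name, args):
    """Build a DAG whose dag_id is namespaced under its parent,
    the convention SubDagOperator expects: '<parent_id>.<child_id>'."""
    subdag = DAG(
        dag_id='{}.{}'.format(parent_dag_name, child_dag_name),
        default_args=args,
        schedule_interval='@daily',
    )
    PythonOperator(task_id='work', python_callable=lambda: print('working'), dag=subdag)
    return subdag

main_dag = DAG('main_dag', default_args=default_args, schedule_interval='@daily')

# Nesting: the level-1 subdag itself contains a SubDagOperator
level1 = create_subdag('main_dag', 'level1', default_args)
SubDagOperator(
    task_id='level2',
    subdag=create_subdag('main_dag.level1', 'level2', default_args),
    dag=level1,
)
SubDagOperator(task_id='level1', subdag=level1, dag=main_dag)
```

The parent and child schedules must match, and each subdag's `dag_id` must be `<parent_dag_id>.<task_id>`, which is why the factory threads the parent name down a level at a time.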
@mrpeardotnet
mrpeardotnet / PVE-HP-ssacli-smart-storage-admin.md
Created November 25, 2019 22:10
HP Smart Storage Admin CLI (ssacli) installation and usage on Proxmox PVE (6.x)


Why use HP Smart Storage Admin CLI?

You can use the ssacli (Smart Storage Administrator command line interface) tool to manage any supported HP Smart Array Controller in your Proxmox host without needing to reboot the server into the BIOS-based Smart Storage Administrator. That means no host downtime when managing your storage.

The CLI is not as convenient as the GUI provided by the BIOS or desktop utilities, but it still lets you fully manage your controller, physical disks, and logical drives on the fly, with no Proxmox host downtime.

ssacli replaces the older hpssacli, but shares the same syntax and adds support for newer servers and controllers.
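For example, a few commonly documented ssacli invocations look like this (the slot number is an example; run the `show status` command first to find yours):

```shell
# Show all detected controllers and their slot numbers
ssacli ctrl all show status

# Show the full configuration (arrays, logical and physical drives) of the controller in slot 0
ssacli ctrl slot=0 show config

# Show detailed status of all physical drives on that controller
ssacli ctrl slot=0 pd all show status
```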

Installation

@bfraiche
bfraiche / random_forest_with_python_and_spark_ml.py
Created April 2, 2019 22:30
This gist contains the complete code for my blog post 'Random Forest with Python and Spark ML'
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import RandomForestRegressor
from pyspark.ml.evaluation import RegressionEvaluator
from pyspark.ml.tuning import ParamGridBuilder, CrossValidator
import matplotlib.pyplot as plt
import numpy as np
# Pull in the data
df = mc.sql("SELECT * FROM kings_county_housing")
@lyshie
lyshie / config-deb-i386.json
Last active May 11, 2024 20:34
Scratch Desktop (Scratch 3.0 Offline Editor) on GNU/Linux
{
  "src": "/tmp/scratch-desktop/",
  "dest": "/tmp/",
  "arch": "i386",
  "icon": "/tmp/scratch-desktop/resources/Icon.png",
  "categories": [
    "Education"
  ]
}
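This looks like a config file for electron-installer-debian (consistent with the gist's goal of packaging Scratch Desktop for GNU/Linux). Assuming that tool, the file would presumably be consumed along these lines; the package name and version come from the app's package.json, not from this file:

```shell
npm install -g electron-installer-debian
electron-installer-debian --config config-deb-i386.json
```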
@lonly197
lonly197 / spark_udfs.scala
Created August 29, 2018 16:19
Some Custom Spark UDF
import scala.collection.mutable.WrappedArray
import scala.collection.JavaConversions._
import scala.collection.JavaConverters._
import breeze.linalg.{DenseVector => BDV, SparseVector => BSV, Vector => BV}
import org.apache.spark.ml.linalg.{DenseVector, Matrices, Matrix, SparseVector, Vector, Vectors}
import org.apache.spark.mllib.linalg.{Vectors => OldVectors}
import org.apache.spark.sql.UDFRegistration
import streaming.common.UnicodeUtils