Casey caseyduquettesc

  • Snap Inc
  • CA
@sj26
sj26 / LICENSE.md
Last active March 8, 2024 18:31
Bash retry function

This is free and unencumbered software released into the public domain.

Anyone is free to copy, modify, publish, use, compile, sell, or distribute this software, either in source code form or as a compiled binary, for any purpose, commercial or non-commercial, and by any means.

In jurisdictions that recognize copyright laws, the author or authors of this software dedicate any and all copyright interest in the software to the public domain. We make this dedication for the benefit of the public at large and to the detriment of our heirs and successors.

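The gist itself is a Bash retry function; the preview above shows only its license text. A minimal sketch of what such a helper might look like (not sj26's actual code; the function name, attempt count, and fixed delay are illustrative assumptions):

#!/usr/bin/env bash
# retry N CMD [ARGS...] — run CMD until it succeeds, at most N times,
# sleeping a fixed delay between attempts (illustrative sketch only).
retry() {
  local attempts=$1; shift
  local delay=5
  local n=1
  until "$@"; do
    if [ "$n" -ge "$attempts" ]; then
      echo "Command failed after $attempts attempts: $*" >&2
      return 1
    fi
    echo "Attempt $n failed, retrying in ${delay}s..." >&2
    sleep "$delay"
    n=$((n + 1))
  done
}
# usage: retry 5 curl -fsS https://example.com
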
@remojansen
remojansen / class_decorator.ts
Last active September 14, 2023 14:54
TypeScript Decorators Examples
function logClass(target: any) {
  // save a reference to the original constructor
  var original = target;

  // a utility function to generate instances of a class
  function construct(constructor, args) {
    var c: any = function () {
      return constructor.apply(this, args);
    }
@sindresorhus
sindresorhus / post-merge
Last active May 2, 2024 03:18
git hook to run a command after `git pull` if a specified file was changed. In this example it's used to run `npm install` if `package.json` changed and `bower install` if `bower.json` changed. Run `chmod +x post-merge` to make it executable, then put it into `.git/hooks/`.
#!/usr/bin/env bash
# MIT © Sindre Sorhus - sindresorhus.com
# git hook to run a command after `git pull` if a specified file was changed
# Run `chmod +x post-merge` to make it executable then put it into `.git/hooks/`.
changed_files="$(git diff-tree -r --name-only --no-commit-id ORIG_HEAD HEAD)"

# run a command if any of the changed files matches the given pattern
check_run() {
	echo "$changed_files" | grep --quiet "$1" && eval "$2"
}

check_run package.json "npm install"
check_run bower.json "bower install"
@shantanuo
shantanuo / mysql_to_big_query.sh
Last active September 14, 2022 07:12
Copy a MySQL table to BigQuery. If you need to copy all tables, use the loop given at the end. Exits with error code 3 if BLOB or TEXT columns are found. The CSV files are first copied to Google Cloud Storage before being imported into BigQuery.
#!/bin/sh
TABLE_SCHEMA=$1
TABLE_NAME=$2

# timestamp and lowercased hostname, used to build unique file and bucket names
mytime=$(date '+%y%m%d%H%M')
hostname=$(hostname | tr 'A-Z' 'a-z')
file_prefix="trimax$TABLE_NAME$mytime$TABLE_SCHEMA"
bucket_name=$file_prefix
splitat="4000000000"
bulkfiles=200
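
The preview cuts off after the variable setup. As a rough illustration of the "loop given at the end" that the description mentions (not the gist's exact code), the following calls the script once per table in a schema; it assumes the script is saved as mysql_to_big_query.sh and that the mysql client can authenticate without prompting:

#!/bin/sh
# Illustrative sketch: run the single-table script for every table in the schema.
TABLE_SCHEMA=$1
for TABLE_NAME in $(mysql -N -B -e "SELECT table_name FROM information_schema.tables WHERE table_schema = '$TABLE_SCHEMA'")
do
    sh mysql_to_big_query.sh "$TABLE_SCHEMA" "$TABLE_NAME" || echo "Failed to copy $TABLE_NAME" >&2
done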