Ruey-Cheng Chen rueycheng

rueycheng /
Last active May 2, 2024 09:28
Google Sheet functions cheatsheet

Google Sheet Functions - Cheatsheet

Based on


| Name | Syntax | Description |
| --- | --- | --- |
| ARRAY_CONSTRAIN | ARRAY_CONSTRAIN(input_range, num_rows, num_cols) | Constrains an array result to a specified size. |
| BYCOL | BYCOL(array_or_range, LAMBDA) | Groups an array by columns by application of a LAMBDA function to each column. |
rueycheng /
Last active October 28, 2023 22:04
Python implementation for bootstrapping sampling distributions of Krippendorff's Alpha
Bootstrapping sampling distributions of Krippendorff's Alpha
A fast Python implementation based on the `fast-krippendorff` GitHub repo and
Krippendorff's method.
import numpy as np
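The preview above is truncated, but the general recipe is a nonparametric bootstrap: resample the rated units (columns of the reliability matrix) with replacement and recompute the statistic on each resample. The sketch below illustrates that skeleton with NumPy only; the `agreement` statistic is a stand-in for illustration — in the gist it would be the fast Krippendorff's alpha computation, and all names here are assumptions, not the gist's actual API.

```python
import numpy as np

def bootstrap_distribution(data, statistic, n_boot=1000, seed=0):
    """Resample units (columns) with replacement and recompute the
    statistic, yielding its bootstrap sampling distribution."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data)
    n_units = data.shape[1]
    samples = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n_units, size=n_units)  # resampled unit indices
        samples[b] = statistic(data[:, idx])
    return samples

# Placeholder statistic: fraction of units on which two raters agree.
# The real gist plugs in Krippendorff's alpha here instead.
def agreement(reliability):
    return float(np.mean(reliability[0] == reliability[1]))

ratings = np.array([[1, 2, 3, 3, 2, 1, 4, 1, 2],
                    [1, 2, 3, 3, 2, 2, 4, 1, 2]])
dist = bootstrap_distribution(ratings, agreement, n_boot=500)
lo, hi = np.percentile(dist, [2.5, 97.5])  # percentile bootstrap CI
```

The percentile interval `[lo, hi]` is then read off the resampled distribution directly, which is the usual way to attach uncertainty to agreement coefficients without a closed-form variance.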
rueycheng /
Last active February 12, 2023 14:47
Fit multiple 35mm x 45mm passport photos into a 4x6-sized printable format
import argparse
import math
from PIL import Image, ImageDraw
def main():
parser = argparse.ArgumentParser()
parser.add_argument('-o', dest='output_file', required=True, help='output file')
parser.add_argument('input_files', metavar='FILE', nargs='+')
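The preview stops before the layout logic, but the underlying arithmetic is simple grid packing: convert the 35×45 mm photo size to pixels at the print resolution and count how many fit on the 4×6 in sheet. The sketch below shows that calculation; the 300 DPI value and landscape orientation are assumptions for illustration, not necessarily the gist's actual parameters.

```python
# Grid arithmetic for tiling 35x45 mm photos on a 4x6 in sheet.
# DPI and landscape orientation are illustrative assumptions.
DPI = 300
MM_PER_INCH = 25.4

def px(mm):
    """Convert millimetres to pixels at the chosen DPI."""
    return round(mm / MM_PER_INCH * DPI)

sheet_w, sheet_h = 6 * DPI, 4 * DPI   # 4x6 in sheet, landscape
photo_w, photo_h = px(35), px(45)     # one passport photo in pixels

cols = sheet_w // photo_w             # photos per row
rows = sheet_h // photo_h             # photos per column
total = cols * rows
```

At 300 DPI a 35×45 mm photo is roughly 413×531 px, so a landscape 1800×1200 px sheet holds a 4×2 grid — eight photos per print, with the leftover margin split as spacing.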
diff --git a/include/LightGBM/dataset.h b/include/LightGBM/dataset.h
index 12dbe6c..ef058af 100644
--- a/include/LightGBM/dataset.h
+++ b/include/LightGBM/dataset.h
@@ -88,6 +88,8 @@ class Metadata {
void SetLabel(const label_t* label, data_size_t len);
+ void SetOrdering(const label_t* ordering, data_size_t len);
rueycheng /
Last active February 12, 2023 14:45
Commonly used Unix recipes
rueycheng /
Last active February 12, 2023 14:44
Some useful git recipes

Clear out Git local/remote repos

git checkout --orphan new_branch
git add -A                        # add all files and commit them
git commit
git branch -D master              # delete the old master branch
git branch -m master              # rename the current branch to master
git push -f origin master         # force-push if the remote should be cleared too
rueycheng /
Last active July 6, 2024 14:14
GNU Make cheatsheet
rueycheng /
Last active February 12, 2023 14:41
extracting tuples from Wikipedia database dumps by directly parsing SQL INSERT statements
This is a tool for extracting tuples directly out of Wikipedia database dumps
by parsing SQL INSERT statements. For now, the output is in (tab-delimited)
CSV format, where each column indicates a field and each row a record. The tool
works with gzip'ed files out of the box.
To use this tool, simply run (taking `pagelinks.sql.gz` as an example):
python enwiki-20150515-pagelinks.sql.gz
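The core of such a tool is splitting each `INSERT INTO ... VALUES (...),(...);` line into its parenthesised tuples and then into fields. A minimal sketch of that parsing step is below — the regexes assume values contain no unescaped parentheses and handle only simple quoting, whereas the actual gist presumably deals with escapes more carefully; function and pattern names here are illustrative. Reading from a gzip'ed dump would just wrap the file in `gzip.open(..., "rt")`.

```python
import re

# One parenthesised tuple in a VALUES list (no nested parentheses).
TUPLE_RE = re.compile(r"\(([^()]*)\)")
# A field: either a single-quoted string (with backslash escapes) or
# a bare token up to the next comma.
FIELD_RE = re.compile(r"'(?:[^'\\]|\\.)*'|[^,]+")

def iter_tuples(line):
    """Yield field lists from a single INSERT INTO ... VALUES ... line."""
    if not line.startswith("INSERT INTO"):
        return
    for m in TUPLE_RE.finditer(line):
        yield [f.strip().strip("'") for f in FIELD_RE.findall(m.group(1))]

line = "INSERT INTO `pagelinks` VALUES (10,0,'Foo'),(12,0,'Bar');"
rows = list(iter_tuples(line))  # each row becomes one tab-delimited record
```

Each yielded list maps one-to-one onto a tab-delimited CSV row, which matches the output format described above.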