
Chen Yu cwhy

import csv
from collections import Counter

def get_table(m):
    # Count how often each segment appears across all patterns;
    # the counts 4, 6 and 9 each identify a unique segment.
    invert_counter = {count: seg for seg, count in Counter(''.join(m)).items()}
    digit_table = {
        'e': invert_counter[4],
        'b': invert_counter[6],
        'f': invert_counter[9],
    }
    return digit_table
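The count-inversion trick above works because, across the ten seven-segment digits, the segments `e`, `b` and `f` are the only ones with unique occurrence counts (4, 6 and 9 respectively). A minimal self-contained sketch using the standard, unscrambled patterns (the `digits` list here is illustrative, not from the gist):

```python
from collections import Counter

# Standard seven-segment patterns for the digits 0-9.
digits = ["abcefg", "cf", "acdeg", "acdfg", "bcdf",
          "abdfg", "abdefg", "acf", "abcdefg", "abcdfg"]

# Map occurrence count -> segment; only counts 4, 6 and 9 are unambiguous.
invert_counter = {count: seg for seg, count in Counter("".join(digits)).items()}

print(invert_counter[4], invert_counter[6], invert_counter[9])  # e b f
```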
@cwhy
cwhy / add.css
Last active September 21, 2020 06:18
Print Hacker News Comment
html {
  font-size: 16px;
  width: 100%;
}
table td, table th { overflow-wrap: anywhere; }
.title > a {
  color: black;
  font-size: 1.5rem;
  font-family: "charter", "Palatino Linotype", "Source Serif Pro", "Georgia", serif;
}
@cwhy
cwhy / pytorch_twisted.py
Last active July 23, 2020 09:32
Pytorch tensor plays
import torch

def cum_softmax(t: torch.Tensor) -> torch.Tensor:
    # t shape: (..., sm_d); sm_d is the dim to reduce over
    # running max, for numerical stability; shape: (..., 1, sm_d)
    tmax = t.cummax(dim=-1)[0].unsqueeze(-2)
    # exp(t_i - cummax_j); shape: (..., sm_d, csm_d)
    numerator = (t.unsqueeze(-1) - tmax).exp()
    # normalize each column so it sums to one
    denominator = numerator.sum(dim=-2, keepdim=True)
    return numerator / denominator
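Subtracting the running maximum before exponentiating is the standard overflow guard for softmax. A pure-Python sketch of the same stabilized per-column computation for a 1-D input (`cum_softmax_1d` is a hypothetical helper for illustration, not part of the gist):

```python
import math

def cum_softmax_1d(t):
    """For each prefix end j, softmax of t shifted by max(t[:j+1])."""
    cols = []
    for j in range(len(t)):
        m = max(t[:j + 1])                   # running max, like cummax
        exps = [math.exp(x - m) for x in t]  # exp(t_i - m_j)
        s = sum(exps)
        cols.append([e / s for e in exps])   # each column sums to 1
    return cols
```

Note that because the same shift `m` appears in both numerator and denominator, every column equals the plain softmax of `t`; the running max only controls the numerical range, not the result.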
@cwhy
cwhy / Style.elm
Last active December 8, 2018 07:39
An abandoned render engine with poor man's flexbox
module Style exposing (..)
type CustomColor
= CItem
| CItemHidden
| CUser
| CUserHidden
| Grey
| White
@cwhy
cwhy / icml2018.md
Created September 19, 2018 05:31
ICML 2018 Abstracts

Spline Filters For End-to-End Deep Learning

We propose to tackle the problem of end-to-end learning for raw waveform signals by introducing learnable continuous time-frequency atoms. The derivation of these filters is achieved by first defining a functional space with a given smoothness order and boundary conditions. From this space, we derive the parametric analytical filters. Their differentiability property allows gradient-based optimization. As such, one can equip any Deep Neural Networks (DNNs) with these filters. This enables us to tackle in a front-end fashion a large-scale bird detection task based on the freefield1010 dataset.

@cwhy
cwhy / nips2018.md
Last active January 11, 2024 13:17
NIPS 2018 Abstract

Unsupervisedly Learned Latent Graphs as Transferable Representations

Modern deep transfer learning approaches have mainly focused on learning *generic* feature vectors from one task that are transferable to other tasks, such as word embeddings in language and pretrained convolutional features in vision. However, these approaches usually transfer unary features and largely ignore more structured graphical representations. This work explores the possibility of learning generic latent graphs that capture dependencies between pairs of data units (e.g., words or pixels) from large-scale unlabeled data and transferring the graphs to downstream tasks. Our

Learning Active Learning from Data

In this paper, we suggest a novel data-driven approach to active learning (AL). The key idea is to train a regressor that predicts the expected error reduction for a candidate sample in a particular learning state. By formulating the query selection procedure as a regression problem we are not restricted to working with existing AL heuristics; instead, we learn strategies based on experience from previous AL outcomes. We show that a strategy can be learnt either from simple synthetic 2D datasets or from a subset of domain-specific data. Our method yields strategies that work well on

@cwhy
cwhy / ubuntuDesktopRoutine.md
Last active March 22, 2018 03:57
Ubuntu desktop routine
  • (Tested on 16.04.3)

System Config

  • Choose closest update server location
  • Disable activity record in Security & Privacy
  • Change Firefox download settings
  • Enable workspace in Appearance/Behavior
  • [Optional] Dual screen setup (remove sticky edges)
  • [Optional] Add printers
@cwhy
cwhy / fontEndStack_Typescript.md
Last active January 2, 2023 21:05
Web frontend stack (yarn + webpack + typescript)

This file shows how to set up a minimal web front-end stack with Yarn, webpack and TypeScript.

Install newest NodeJS

curl -sL https://deb.nodesource.com/setup_8.x | sudo -E bash -
sudo apt-get install -y nodejs


Install yarn