
Jintang Li EdisonLeeeee

@EdisonLeeeee
EdisonLeeeee / graph_tool_jupyter_notebook_inline_draw.ipynb
Created March 1, 2023 07:14 — forked from joshlk/graph_tool_jupyter_notebook_inline_draw.ipynb
Inline graph-tool figures in Jupyter notebook (graph_draw)
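
The notebook itself is not rendered on this page. As a rough sketch of what inline drawing looks like (assuming graph-tool is installed and that graph_draw accepts the inline keyword; this is not the notebook's actual contents):

# Minimal sketch: draw a small graph inline in a Jupyter cell.
# Assumes graph-tool is installed and graph_draw supports inline=True.
from graph_tool.all import Graph, graph_draw

g = Graph(directed=False)
v1, v2, v3 = g.add_vertex(), g.add_vertex(), g.add_vertex()
g.add_edge(v1, v2)
g.add_edge(v2, v3)

# inline=True renders the figure directly in the notebook output cell
graph_draw(g, inline=True, output_size=(300, 300))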
@EdisonLeeeee
EdisonLeeeee / logger.py
Created December 19, 2021 09:25
Python logger
import os
import sys
import functools
import logging
from termcolor import colored
from typing import Optional
__all__ = ["setup_logger", "get_logger"]
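
The preview stops at the imports. Building on them, a minimal sketch of what a setup_logger/get_logger pair could look like; the handler setup and format string are assumptions, not the gist's actual code:

def setup_logger(name: str = "logger", level: int = logging.INFO,
                 output: Optional[str] = None) -> logging.Logger:
    # Sketch only: a colored console logger, optionally mirrored to a file.
    logger = logging.getLogger(name)
    logger.setLevel(level)
    logger.propagate = False

    handler = logging.StreamHandler(stream=sys.stdout)
    fmt = colored("[%(asctime)s %(name)s]", "green") + " %(levelname)s: %(message)s"
    handler.setFormatter(logging.Formatter(fmt, datefmt="%m/%d %H:%M:%S"))
    logger.addHandler(handler)

    if output is not None:
        os.makedirs(os.path.dirname(output) or ".", exist_ok=True)
        file_handler = logging.FileHandler(output)
        file_handler.setFormatter(logging.Formatter("[%(asctime)s] %(levelname)s: %(message)s"))
        logger.addHandler(file_handler)
    return logger


@functools.lru_cache()
def get_logger(name: str = "logger") -> logging.Logger:
    # Cached so repeated calls return the same configured logger.
    return setup_logger(name)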
@EdisonLeeeee
EdisonLeeeee / numpy_topk.py
Created October 31, 2021 06:52
NumPy implementation of torch.topk
import numpy as np
from collections import namedtuple
topk_namedtuple = namedtuple('topk_namedtuple', ['values', 'indices'])
def topk(array: np.ndarray, k: int, largest: bool = True) -> topk_namedtuple:
    """Returns the k largest/smallest elements and corresponding indices
    from an array-like input.
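
The preview cuts off inside the docstring. A self-contained sketch of one way to finish it with np.argpartition, assuming a 1-D input; not necessarily the gist's exact implementation:

import numpy as np
from collections import namedtuple

topk_namedtuple = namedtuple('topk_namedtuple', ['values', 'indices'])

def topk(array: np.ndarray, k: int, largest: bool = True) -> topk_namedtuple:
    """Return the k largest (or smallest) elements and their indices."""
    array = np.asarray(array).ravel()
    if not 1 <= k <= array.size:
        raise ValueError(f"k must be in [1, {array.size}], got {k}")
    if largest:
        # argpartition places the k largest at the end; sort only those k
        idx = np.argpartition(array, -k)[-k:]
        idx = idx[np.argsort(-array[idx])]
    else:
        idx = np.argpartition(array, k - 1)[:k]
        idx = idx[np.argsort(array[idx])]
    return topk_namedtuple(values=array[idx], indices=idx)

# usage: top-3 values of a random vector
print(topk(np.random.rand(10), k=3))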
@EdisonLeeeee
EdisonLeeeee / gist:aa4e0c1f075be972a2054e27f673dc32
Created September 23, 2021 00:38 — forked from rxaviers/gist:7360908
Complete list of GitHub markdown emoji markup

People

:bowtie: :bowtie: 😄 :smile: 😆 :laughing:
😊 :blush: 😃 :smiley: ☺️ :relaxed:
😏 :smirk: 😍 :heart_eyes: 😘 :kissing_heart:
😚 :kissing_closed_eyes: 😳 :flushed: 😌 :relieved:
😆 :satisfied: 😁 :grin: 😉 :wink:
😜 :stuck_out_tongue_winking_eye: 😝 :stuck_out_tongue_closed_eyes: 😀 :grinning:
😗 :kissing: 😙 :kissing_smiling_eyes: 😛 :stuck_out_tongue:
@EdisonLeeeee
EdisonLeeeee / clip_by_norm.py
Created June 11, 2021 01:47
TensorFlow clip_by_norm in PyTorch and NumPy
import numpy as np
import torch
import tensorflow as tf
# Numpy
np.random.seed(42)
x = np.random.randn(10, 10)
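
The preview stops after setting up the NumPy input. A self-contained sketch of the comparison the title implies, assuming tf.clip_by_norm's usual formula t * clip_norm / max(||t||, clip_norm); variable names are illustrative:

import numpy as np
import torch
import tensorflow as tf

np.random.seed(42)
x = np.random.randn(10, 10)
clip_norm = 5.0

# NumPy
l2 = np.linalg.norm(x)
clipped_np = x * clip_norm / max(l2, clip_norm)

# PyTorch
x_th = torch.from_numpy(x)
clipped_th = x_th * clip_norm / torch.clamp(torch.norm(x_th, p=2), min=clip_norm)

# TensorFlow reference
clipped_tf = tf.clip_by_norm(tf.constant(x), clip_norm)

# all three should agree up to floating-point error
print(np.allclose(clipped_np, clipped_th.numpy()), np.allclose(clipped_np, clipped_tf.numpy()))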
@EdisonLeeeee
EdisonLeeeee / f1_score.py
Created May 10, 2021 12:19 — forked from SuperShinyEyes/f1_score.py
F1 score in PyTorch
import torch

def f1_loss(y_true: torch.Tensor, y_pred: torch.Tensor, is_training=False) -> torch.Tensor:
    '''Calculate F1 score. Can work with GPU tensors.

    The original implementation is written by Michal Haltuf on Kaggle.

    Returns
    -------
    torch.Tensor
        `ndim` == 1. 0 <= val <= 1
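
The forked preview ends inside the docstring. A self-contained illustrative version computed from confusion counts (the function name binary_f1 and the epsilon handling are assumptions, not the fork's exact code):

import torch

def binary_f1(y_true: torch.Tensor, y_pred: torch.Tensor, epsilon: float = 1e-7) -> torch.Tensor:
    # Sketch only: binary F1 from 0/1 label and prediction tensors of equal shape.
    y_true = y_true.float()
    y_pred = y_pred.float()

    tp = (y_true * y_pred).sum()
    fp = ((1 - y_true) * y_pred).sum()
    fn = (y_true * (1 - y_pred)).sum()

    precision = tp / (tp + fp + epsilon)
    recall = tp / (tp + fn + epsilon)
    return 2 * precision * recall / (precision + recall + epsilon)

# usage
y_true = torch.tensor([1, 0, 1, 1, 0])
y_pred = torch.tensor([1, 0, 0, 1, 1])
print(binary_f1(y_true, y_pred))  # ≈ 0.67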
@EdisonLeeeee
EdisonLeeeee / translator.py
Created May 10, 2021 02:17
Youdao Translator with Python
import json
import requests
def translate(word):
    url = 'http://fanyi.youdao.com/translate?smartresult=dict&smartresult=rule&smartresult=ugc&sessionFrom=null'
    key = {
        'type': "AUTO",
        'i': word,
        "doctype": "json",
        "version": "2.1",
@EdisonLeeeee
EdisonLeeeee / pytorch_l2_normalize.py
Last active August 22, 2022 10:53
PyTorch equivalence for tf.nn.l2_normalize
import torch
import tensorflow as tf
########## PyTorch Version 1 ################
x = torch.randn(5, 6)
norm_th = x/torch.norm(x, p=2, dim=1, keepdim=True)
norm_th[torch.isnan(norm_th)] = 0 # to avoid nan
########## PyTorch Version 2 ################
norm_th = torch.nn.functional.normalize(x, p=2, dim=1)
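
For reference, a quick cross-check against tf.nn.l2_normalize on the same kind of input (a sketch of the comparison the title implies):

import numpy as np
import tensorflow as tf
import torch

x = torch.randn(5, 6)
norm_th = torch.nn.functional.normalize(x, p=2, dim=1)
norm_tf = tf.nn.l2_normalize(tf.constant(x.numpy()), axis=1)
print(np.allclose(norm_th.numpy(), norm_tf.numpy(), atol=1e-6))  # expected: True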
@EdisonLeeeee
EdisonLeeeee / pytorch_softmax_cross_entropy_with_logits.py
Last active May 8, 2021 07:47
PyTorch equivalence for softmax_cross_entropy_with_logits
import torch
import torch.nn.functional as F
import tensorflow as tf

def softmax_cross_entropy_with_logits(labels, logits, dim=-1):
    return (-labels * F.log_softmax(logits, dim=dim)).sum(dim=dim)
logits = [[4.0, 2.0, 1.0], [0.0, 5.0, 1.0]]
labels = [[1.0, 0.0, 0.0], [0.0, 0.8, 0.2]]
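
A quick check of the equivalence on the sample logits/labels above (a sketch; the tensor conversions are added here for illustration):

torch_loss = softmax_cross_entropy_with_logits(torch.tensor(labels), torch.tensor(logits))
tf_loss = tf.nn.softmax_cross_entropy_with_logits(labels=tf.constant(labels), logits=tf.constant(logits))
print(torch_loss)       # ≈ tensor([0.17, 0.82])
print(tf_loss.numpy())  # should match the PyTorch values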