Gists by ciela (@ciela)
@ciela
ciela / sample_gist.ipynb
Created October 25, 2021 01:45
sample gist for medium
@ciela
ciela / executor_test.py
Created August 11, 2020 02:31
Trial of ProcessPoolExecutor
import time
from concurrent.futures import ProcessPoolExecutor, as_completed
import random

import jsonlines as jl


class MyExecutor:

    def write(self, text: str):
        # Simulate a slow write with a random 0-2 second sleep.
        time.sleep(random.randint(0, 2))
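The preview above is truncated. As a rough, self-contained sketch of the same idea (my own example, not the rest of the gist), the following submits a few slow write-like tasks to a ProcessPoolExecutor and collects the results with as_completed:

import random
import time
from concurrent.futures import ProcessPoolExecutor, as_completed


def write(text: str) -> str:
    # Simulate a slow, I/O-bound write with a random 0-2 second sleep.
    time.sleep(random.randint(0, 2))
    return text


if __name__ == "__main__":
    texts = [f"line-{i}" for i in range(5)]
    with ProcessPoolExecutor(max_workers=3) as executor:
        futures = [executor.submit(write, t) for t in texts]
        # as_completed yields futures in completion order, not submission order.
        for future in as_completed(futures):
            print(future.result())

Note that worker functions must be defined at module level so they can be pickled and sent to the child processes.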
@ciela
ciela / grl.py
Last active January 28, 2020 03:20
Getting familiar with Gradient Reversal Layer.
from typing import Tuple
import torch
import torch.nn as nn


class GradientReversalFunction(torch.autograd.Function):

    @staticmethod
    def forward(ctx, input_forward: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
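The preview stops at the forward signature. For reference, the usual gradient reversal pattern looks roughly like the sketch below (my reconstruction under standard assumptions, not necessarily the rest of this gist): forward is the identity, backward negates and scales the incoming gradient, and a small nn.Module wraps the Function.

from typing import Tuple

import torch
import torch.nn as nn


class GradientReversalFunction(torch.autograd.Function):

    @staticmethod
    def forward(ctx, input_forward: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
        ctx.save_for_backward(scale)
        return input_forward  # identity in the forward pass

    @staticmethod
    def backward(ctx, grad_backward: torch.Tensor) -> Tuple[torch.Tensor, None]:
        scale, = ctx.saved_tensors
        # Flip the sign of the gradient and scale it; no gradient w.r.t. scale.
        return -scale * grad_backward, None


class GradientReversal(nn.Module):

    def __init__(self, scale: float = 1.0):
        super().__init__()
        self.scale = torch.tensor(scale)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return GradientReversalFunction.apply(x, self.scale)

In domain-adversarial training, features typically pass through such a layer before the domain classifier, so minimizing the domain loss pushes the feature extractor toward domain-invariant features.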
@ciela
ciela / count_labels.py
Last active December 21, 2018 12:19
Count labels using the collections package
import collections
import logging

log = logging.getLogger(__name__)

# Count how often each label occurs in the training loader.
log.debug(len(train_loader))
labels = []
for i, data in enumerate(train_loader):
    _, label = data
    labels.append(label.item())
labels_counter = collections.Counter(labels)
log.debug(labels_counter)

# Reset and repeat for the validation loader.
labels = []
log.debug(len(val_loader))
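For a variant that runs on its own (the toy TensorDataset and label range are my own choices, not the original training data):

import collections

import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset: 100 samples with labels drawn from {0, 1, 2}.
features = torch.randn(100, 8)
targets = torch.randint(0, 3, (100,))
train_loader = DataLoader(TensorDataset(features, targets), batch_size=1)

labels = []
for _, label in train_loader:
    labels.append(label.item())  # batch_size=1, so .item() is safe
print(collections.Counter(labels))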
@ciela
ciela / bce_losses_and_comparison.py
Created December 7, 2017 03:10
Comparison of losses for multi-label classification, part 2 (they all give the same result)
import torch
from torch import nn as nn
from torch import autograd as ag

# Random logits and random (soft) targets of shape (100, 100).
data = ag.Variable(torch.randn(100, 100))
labels = ag.Variable(torch.randn(100, 100))

multi_label_soft_margin = nn.MultiLabelSoftMarginLoss()
print(multi_label_soft_margin(data, labels))
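A self-contained check of the claim in the description (the seed, shapes, and multi-hot targets here are my own choices): MultiLabelSoftMarginLoss, BCEWithLogitsLoss, and a manual sigmoid followed by BCELoss should all return the same value.

import torch
from torch import nn

torch.manual_seed(0)
logits = torch.randn(100, 100)
targets = torch.randint(0, 2, (100, 100)).float()  # multi-hot targets

loss_a = nn.MultiLabelSoftMarginLoss()(logits, targets)
loss_b = nn.BCEWithLogitsLoss()(logits, targets)
loss_c = nn.BCELoss()(torch.sigmoid(logits), targets)

print(loss_a, loss_b, loss_c)
print(torch.allclose(loss_a, loss_b), torch.allclose(loss_b, loss_c, atol=1e-6))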
@ciela
ciela / multi-label_classification_losses.py
Last active December 6, 2017 10:31
Losses that are likely to be used for multi-label classification in PyTorch, and how to use them
import torch
from torch import nn
from torch.autograd import Variable

# MultiLabelSoftMarginLoss only
ml_criterion = nn.MultiLabelSoftMarginLoss()

## torch.randn
data, labels = Variable(torch.randn([1, 5])), Variable(torch.randn([1, 5]))
print(data.data, labels.data)
print(ml_criterion(data, labels))

## fixed FloatTensor
data, labels = Variable(torch.FloatTensor([1, 50, 100, 50, 1])), Variable(torch.FloatTensor([0, 0, 1, 0, 0]))
print(data.data, labels.data)
print(ml_criterion(data, labels))
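In practice the targets are usually multi-hot vectors rather than random numbers. A small sketch (the class count and index lists are hypothetical) of building such targets and passing them to the same criterion:

import torch
from torch import nn

# 3 samples, 5 classes; each sample is tagged with a list of class indices.
index_labels = [[0, 2], [1], [2, 3, 4]]
targets = torch.zeros(3, 5)
for row, indices in enumerate(index_labels):
    targets[row, indices] = 1.0  # multi-hot encoding

logits = torch.randn(3, 5)
criterion = nn.MultiLabelSoftMarginLoss()
print(criterion(logits, targets))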
@ciela
ciela / pca.py
Created July 2, 2017 15:12
Simulation of principal component analysis
#!/usr/bin/env python3
# coding: utf-8
# NumPy implementation of Example 8.2 (p. 280) of プログラミングのための確率統計
import numpy as np

# Data points
x1 = np.array([0, 5])
x2 = np.array([0, -5])
x3 = np.array([4, 3])
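The preview ends after the data definition. One standard way to carry the analysis through in NumPy (my sketch, using only the three points shown above) is an eigendecomposition of the sample covariance matrix:

import numpy as np

# The same three points, stacked into a (3, 2) matrix.
X = np.array([[0, 5], [0, -5], [4, 3]], dtype=float)

centered = X - X.mean(axis=0)
cov = np.cov(centered, rowvar=False)    # 2x2 sample covariance
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order

order = np.argsort(eigvals)[::-1]       # largest variance first
print("explained variance:", eigvals[order])
print("principal axes (columns):", eigvecs[:, order])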
@ciela
ciela / portfolio.py
Last active March 20, 2017 14:16
Portfolio simulation
import sys
import numpy as np
import numpy.random as rnd
import matplotlib.pyplot as plt
A = 0.7
SEED = 10000
TRIAL = 100
@ciela
ciela / gale_shapley.py
Last active December 25, 2016 10:26
A simulation that solves the stable marriage problem using an n x n preference matrix
#!/usr/bin/env python3
# coding: utf-8
import sys
import numpy as np


def make_data(n):
    """
    Generate an n x n matrix whose preference ranking is shuffled within each row.
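The preview cuts off inside the docstring of make_data. For reference, a compact end-to-end sketch of the idea (my own reconstruction; only the name make_data comes from the gist, gale_shapley and the seeds are hypothetical):

import numpy as np


def make_data(n: int, seed: int = 0) -> np.ndarray:
    # Each row is one participant's preference ranking over the n candidates
    # on the other side, shuffled independently.
    rng = np.random.default_rng(seed)
    return np.array([rng.permutation(n) for _ in range(n)])


def gale_shapley(men_prefs: np.ndarray, women_prefs: np.ndarray) -> dict:
    n = len(men_prefs)
    # rank[w, m] = how highly woman w ranks man m (lower is better).
    rank = np.empty((n, n), dtype=int)
    for w in range(n):
        rank[w, women_prefs[w]] = np.arange(n)

    free_men = list(range(n))
    next_choice = [0] * n  # index into each man's preference list
    engaged = {}           # woman -> man

    while free_men:
        m = free_men.pop(0)
        w = men_prefs[m][next_choice[m]]
        next_choice[m] += 1
        if w not in engaged:
            engaged[w] = m
        elif rank[w, m] < rank[w, engaged[w]]:
            free_men.append(engaged[w])  # the displaced partner becomes free again
            engaged[w] = m
        else:
            free_men.append(m)           # w rejects m; he stays free
    return engaged


if __name__ == "__main__":
    n = 5
    matching = gale_shapley(make_data(n, seed=1), make_data(n, seed=2))
    print(matching)  # stable matching, woman -> man

The proposer-side version shown here always terminates with a stable matching.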
@ciela
ciela / montyhall.py
Last active December 18, 2016 07:39
Monty Hall problem simulation
#!/usr/bin/env python3
# coding: utf-8
import numpy as np

NUM = 100000  # number of trials
DOORS = np.array([1, 2, 3])  # the doors
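The preview stops after the constants. A compact way to finish the simulation with NumPy (my sketch, not necessarily the gist's approach) uses the fact that switching wins exactly when the first pick was wrong:

import numpy as np

NUM = 100000                   # number of trials
DOORS = np.array([1, 2, 3])    # door labels

rng = np.random.default_rng(0)
prizes = rng.choice(DOORS, size=NUM)  # door hiding the car
picks = rng.choice(DOORS, size=NUM)   # contestant's first pick

# The host always opens a losing, unpicked door, so staying wins only when
# the first pick was already correct, and switching wins otherwise.
stay_wins = np.mean(prizes == picks)
switch_wins = np.mean(prizes != picks)

print("stay:  ", round(stay_wins, 3))    # about 1/3
print("switch:", round(switch_wins, 3))  # about 2/3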