Muhammed Kocabas mkocabas

@jph00
jph00 / py.md
Last active May 31, 2022 06:16
Organized and hyperlinked index to every module, function, and class in the Python standard library

All of the Python 3.9 standard library

For a version without the collapsible details sections (so you can search the whole thing in your browser), click here.

@alecjacobson
alecjacobson / denoise.sh
Last active July 31, 2023 10:05
Remove background audio noise from a video clip via the command line (using ffmpeg and sox)
#!/bin/bash
if [ -z "$2" ]; then
  echo 'USAGE:
denoise input.mov output.mov
OR
denoise input.mov output.mov [ambient-noise-start-time] [ambient-noise-duration] [sox-noisered-amount] [sox-norm-param]'
  exit 1
fi
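The script drives ffmpeg and sox from the shell. For orientation, here is a minimal Python sketch of the same extract / noise-profile / denoise / remux workflow; the intermediate file names, the noise-sample window, and the noisered amount are illustrative assumptions, not the gist's exact parameters.

# Sketch of the ffmpeg + sox denoise workflow (file names and parameters are assumptions).
import subprocess

def denoise(video_in, video_out, noise_start="00:00:00", noise_len="1", amount="0.21"):
    # 1. Extract the audio track as WAV.
    subprocess.run(["ffmpeg", "-y", "-i", video_in, "-vn", "audio.wav"], check=True)
    # 2. Cut a slice of ambient noise and build a sox noise profile from it.
    subprocess.run(["ffmpeg", "-y", "-i", "audio.wav", "-ss", noise_start, "-t", noise_len,
                    "noise.wav"], check=True)
    subprocess.run(["sox", "noise.wav", "-n", "noiseprof", "noise.prof"], check=True)
    # 3. Apply noise reduction to the full audio track.
    subprocess.run(["sox", "audio.wav", "clean.wav", "noisered", "noise.prof", amount], check=True)
    # 4. Remux the cleaned audio with the untouched video stream.
    subprocess.run(["ffmpeg", "-y", "-i", video_in, "-i", "clean.wav",
                    "-map", "0:v", "-map", "1:a", "-c:v", "copy", video_out], check=True)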
@edgarriba
edgarriba / pytorch_train_template.py
Last active October 9, 2019 02:44
PyTorch training template
# Copyright (c) 2019 Edgar Riba.
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, version 3.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
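The preview shows only the license header. For reference, below is a minimal sketch of the kind of training loop such a template usually wraps; the model, data, optimizer choice, and hyperparameters are placeholders, not the contents of the gist.

# Minimal PyTorch training-loop sketch (placeholder model and data, not the gist's template).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

def train(model, loader, epochs=10, lr=1e-3,
          device="cuda" if torch.cuda.is_available() else "cpu"):
    model = model.to(device)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()
    for epoch in range(epochs):
        model.train()
        running_loss = 0.0
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()
            optimizer.step()
            running_loss += loss.item()
        print(f"epoch {epoch}: loss {running_loss / len(loader):.4f}")

# Example usage with dummy data.
if __name__ == "__main__":
    data = TensorDataset(torch.randn(256, 10), torch.randint(0, 2, (256,)))
    train(nn.Linear(10, 2), DataLoader(data, batch_size=32))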
@sergeyprokudin
sergeyprokudin / gauss_neg_loglikelihood_keras.py
Last active January 18, 2024 00:58
Multivariate Gaussian Negative LogLikelihood Loss Keras
import keras.backend as K
import numpy as np
def gaussian_nll(ytrue, ypreds):
"""Keras implementation of the multivariate Gaussian negative log-likelihood loss function.
This implementation assumes a diagonal covariance matrix.
Parameters
----------
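The preview stops inside the docstring. For reference, here is a minimal sketch of a diagonal-covariance Gaussian negative log-likelihood in Keras-backend style; the assumed convention is that ypreds packs the predicted means and log standard deviations side by side, which may differ from the gist's actual layout.

# Sketch of a diagonal-covariance Gaussian NLL (assumed ypreds = [mu, log_sigma] layout).
import numpy as np
import keras.backend as K

def gaussian_nll_sketch(ytrue, ypreds):
    n_dims = int(ypreds.shape[1]) // 2
    mu = ypreds[:, :n_dims]          # predicted means
    logsigma = ypreds[:, n_dims:]    # predicted log standard deviations
    # NLL of y under N(mu, diag(sigma^2)), summed over dimensions
    mse = 0.5 * K.sum(K.square((ytrue - mu) / K.exp(logsigma)), axis=1)
    sigma_trace = K.sum(logsigma, axis=1)
    log2pi = 0.5 * n_dims * np.log(2 * np.pi)
    return K.mean(mse + sigma_trace + log2pi)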
@Tushar-N
Tushar-N / hook_activations.py
Created August 3, 2018 00:06
PyTorch code to save activations for specific layers over an entire dataset
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as tmodels
from functools import partial
import collections
# dummy data: 10 batches of images with batch size 16
dataset = [torch.rand(16,3,224,224).cuda() for _ in range(10)]
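The preview ends at the dummy data. Below is a minimal sketch of the hook-based activation capture the description refers to; the choice of AlexNet, the Conv2d-only filter, and keeping everything on the CPU are illustrative assumptions rather than the gist's exact code.

# Sketch of saving layer activations with forward hooks (model and layer filter are assumptions).
import collections
from functools import partial
import torch
import torch.nn as nn
import torchvision.models as tmodels

activations = collections.defaultdict(list)

def save_activation(name, module, inputs, output):
    # detach so the stored tensors do not keep the autograd graph alive
    activations[name].append(output.detach().cpu())

model = tmodels.alexnet().eval()
# register a hook on every convolutional layer
for name, module in model.named_modules():
    if isinstance(module, nn.Conv2d):
        module.register_forward_hook(partial(save_activation, name))

with torch.no_grad():
    for batch in [torch.rand(16, 3, 224, 224) for _ in range(10)]:
        model(batch)

# concatenate per-layer activations over the whole dataset
activations = {name: torch.cat(outs, dim=0) for name, outs in activations.items()}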
@tejaskhot
tejaskhot / shapenet_synset_list
Created June 24, 2018 00:44
List of category names and their id in the ShapeNet dataset
04379243 table
03593526 jar
04225987 skateboard
02958343 car
02876657 bottle
04460130 tower
03001627 chair
02871439 bookshelf
02942699 camera
02691156 airplane
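Each line pairs a WordNet synset offset with a human-readable category name. A small sketch of loading such a file into lookup tables is shown below; the two-column whitespace-separated format and the file name shapenet_synset_list are assumptions based on the preview above.

# Sketch: parse the "synset_id name" list into both lookup directions.
def load_synsets(path="shapenet_synset_list"):
    id_to_name, name_to_id = {}, {}
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) < 2:
                continue
            synset_id, name = parts[0], " ".join(parts[1:])
            id_to_name[synset_id] = name
            name_to_id[name] = synset_id
    return id_to_name, name_to_id

# Example: id_to_name["03001627"] -> "chair"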
@jeasinema
jeasinema / spatial_softmax.py
Last active March 10, 2023 08:11
Spatial (Arg)Softmax for PyTorch
import torch
import torch.nn.functional as F
from torch.nn.parameter import Parameter
import numpy as np
class SpatialSoftmax(torch.nn.Module):
def __init__(self, height, width, channel, temperature=None, data_format='NCHW'):
super(SpatialSoftmax, self).__init__()
self.data_format = data_format
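The preview stops inside __init__. For orientation, here is a minimal sketch of a spatial softmax forward pass: a softmax over the H*W locations of each channel, followed by the expected x/y coordinates. The NCHW layout and the [-1, 1] coordinate range are assumptions and may not match the gist.

# Sketch of a spatial softmax forward pass (NCHW input, [-1, 1] coordinate grid assumed).
import torch
import torch.nn.functional as F

def spatial_softmax(feature, temperature=1.0):
    n, c, h, w = feature.shape
    # softmax over the h*w spatial locations of every channel
    flat = feature.view(n, c, h * w)
    attention = F.softmax(flat / temperature, dim=-1)        # (N, C, H*W)
    # normalized coordinate grid in [-1, 1]
    pos_y, pos_x = torch.meshgrid(
        torch.linspace(-1.0, 1.0, h), torch.linspace(-1.0, 1.0, w), indexing="ij")
    pos_x = pos_x.reshape(1, 1, h * w).to(feature)
    pos_y = pos_y.reshape(1, 1, h * w).to(feature)
    # expected (x, y) keypoint location per channel
    expected_x = (attention * pos_x).sum(dim=-1)             # (N, C)
    expected_y = (attention * pos_y).sum(dim=-1)             # (N, C)
    return torch.stack([expected_x, expected_y], dim=-1)     # (N, C, 2)

# Example: spatial_softmax(torch.rand(2, 16, 32, 32)).shape -> torch.Size([2, 16, 2])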
@heiswayi
heiswayi / repo-reset.md
Created February 5, 2017 01:32
GitHub - Delete commit history with git commands

First Method

Deleting the .git folder can cause problems in your repository. If you want to delete your entire commit history but keep the code in its current state, try this:

# Check out a temporary orphan branch:
git checkout --orphan TEMP_BRANCH

# Add all the files:
git add -A
@gocarlos
gocarlos / Eigen Cheat sheet
Last active February 11, 2024 14:07
Cheat sheet for the linear algebra library Eigen: http://eigen.tuxfamily.org/
// A simple quickref for Eigen. Add anything that's missing.
// Main author: Keir Mierle
#include <Eigen/Dense>
Matrix<double, 3, 3> A; // Fixed rows and cols. Same as Matrix3d.
Matrix<double, 3, Dynamic> B; // Fixed rows, dynamic cols.
Matrix<double, Dynamic, Dynamic> C; // Full dynamic. Same as MatrixXd.
Matrix<double, 3, 3, RowMajor> E; // Row major; default is column-major.
Matrix3f P, Q, R; // 3x3 float matrix.
@russelldb
russelldb / tmux.md
Last active October 18, 2023 14:48 — forked from andreyvit/tmux.md
tmux cheatsheet

tmux cheat sheet

(C-x means ctrl+x, M-x means alt+x)

Prefix key

The default prefix is C-b. If you (or your muscle memory) prefer C-a, you need to add this to ~/.tmux.conf:

# remap prefix to Control + a