Tamer Tahamoqa (tamerthamoqa)

@koaning
koaning / config-mega-basic.yml
Last active November 1, 2022 11:39
DIET Benchmarks
language: en
pipeline:
- name: WhitespaceTokenizer
- name: CountVectorsFeaturizer
- name: EmbeddingIntentClassifier
policies:
- name: EmbeddingPolicy
max_history: 10
epochs: 100
batch_size:
@jamal919
jamal919 / install_wrf41.sh
Last active March 13, 2024 15:38
WRF 4.1.2 installation with netCDF4 and HDF5 on Ubuntu 18.04 LTS
#!/bin/bash
## WRF installation with parallel processing.
# Downloads and installs the required libraries and data files for WRF.
# License: LGPL
# Jamal Khan <jamal.khan@legos.obs-mip.fr>
# Tested on Ubuntu 18.04 LTS
# Basic package management
sudo apt update
sudo apt upgrade
@tgsmith61591
tgsmith61591 / ranking.py
Last active March 21, 2024 06:36
Ranking metrics for recommender systems
# -*- coding: utf-8 -*-
#
# Author: Taylor G Smith
#
# Recommender system ranking metrics derived from Spark source for use with
# Python-based recommender libraries (i.e., implicit,
# http://github.com/benfred/implicit/). These metrics are derived from the
# original Spark Scala source code for recommender metrics.
# https://github.com/apache/spark/blob/master/mllib/src/main/scala/org/apache/spark/mllib/evaluation/RankingMetrics.scala
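# Since the header above only describes the metrics, here is a minimal hedged
# sketch of one of them, precision at k. The function name and signature are
# illustrative and not taken from the gist or from Spark.
def precision_at_k(recommended, relevant, k=10):
    """Fraction of the top-k recommended items that appear in the relevant set."""
    if k <= 0:
        raise ValueError("k must be positive")
    relevant_set = set(relevant)
    hits = sum(1 for item in recommended[:k] if item in relevant_set)
    return hits / k

# Example: 2 of the top-3 recommendations are relevant -> 0.666...
print(precision_at_k(["a", "b", "c", "d"], ["a", "c", "x"], k=3))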
@Stonesjtu
Stonesjtu / mem_report.py
Last active March 7, 2023 16:58
A simple PyTorch memory usage profiler
import gc
import torch
## MEM utils ##
def mem_report():
'''Report the memory usage of tensor.storage objects in PyTorch.
Memory on both CPUs and GPUs is reported.'''
def _mem_report(tensors, mem_type):
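# The preview above stops before the reporting logic. As a rough, self-contained
# sketch (not the gist's implementation) of how a profiler can enumerate live
# tensors, the helper below walks gc-tracked objects and sums their storage sizes
# using only standard gc and torch calls; the function name is an assumption.
def total_tensor_bytes(device_type="cpu"):
    """Return the total bytes held by gc-tracked tensors on the given device type."""
    total = 0
    for obj in gc.get_objects():
        try:
            if torch.is_tensor(obj) and obj.device.type == device_type:
                total += obj.element_size() * obj.nelement()
        except Exception:
            # Some gc-tracked objects raise when inspected; skip them.
            continue
    return total

print("CPU tensors: %.2f MiB" % (total_tensor_bytes("cpu") / 1024 ** 2))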
@paduvi
paduvi / FlatCnnLayer.py
Last active May 13, 2023 20:30
Hierarchical Softmax CNN Classification
import torch
import torch.nn as nn
import torch.nn.init as init
dropout_prob = 0.5
class FlatCnnLayer(nn.Module):
def __init__(self, embedding_size, sequence_length, filter_sizes=[3, 4, 5], out_channels=128):
super(FlatCnnLayer, self).__init__()
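# For context on the FlatCnnLayer excerpt above, here is a hedged, generic sketch
# of a TextCNN-style module with parallel convolution branches over an embedded
# sequence. The class name, shapes, and defaults are assumptions, not the gist's
# hierarchical-softmax implementation.
import torch.nn.functional as F

class SimpleTextCnn(nn.Module):
    def __init__(self, embedding_size, filter_sizes=(3, 4, 5), out_channels=128):
        super().__init__()
        # One 2-D convolution per filter size, each spanning the full embedding width.
        self.convs = nn.ModuleList(
            nn.Conv2d(1, out_channels, (fs, embedding_size)) for fs in filter_sizes
        )

    def forward(self, x):
        # x: (batch, sequence_length, embedding_size) -> add a channel dimension.
        x = x.unsqueeze(1)
        pooled = []
        for conv in self.convs:
            feature_map = F.relu(conv(x)).squeeze(3)  # (batch, out_channels, L)
            pooled.append(F.max_pool1d(feature_map, feature_map.size(2)).squeeze(2))
        return torch.cat(pooled, dim=1)  # (batch, out_channels * len(filter_sizes))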
import numpy as np
from keras import backend as K
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation, Flatten
from keras.layers.convolutional import Convolution2D, MaxPooling2D
from keras.preprocessing.image import ImageDataGenerator
from sklearn.metrics import classification_report, confusion_matrix
# Start
train_data_path = 'F://data//Train'
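# The Keras excerpt above stops right after the imports and the training path. As
# a hedged sketch of how such an image directory is commonly fed to a Keras model
# with ImageDataGenerator, the snippet below reuses the path from the excerpt; the
# directory layout, image size, and batch size are assumptions, not the gist's values.
train_datagen = ImageDataGenerator(rescale=1.0 / 255)
train_generator = train_datagen.flow_from_directory(
    train_data_path,            # one sub-directory per class
    target_size=(64, 64),
    batch_size=32,
    class_mode='categorical',
)
# model.fit_generator(train_generator, ...) would then consume these batches.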
@yrevar
yrevar / imagenet1000_clsidx_to_labels.txt
Last active April 30, 2024 12:39
ImageNet 1000 class index to human-readable labels (Fox, E., & Guestrin, C. (n.d.). Coursera Machine Learning Specialization.)
{0: 'tench, Tinca tinca',
1: 'goldfish, Carassius auratus',
2: 'great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias',
3: 'tiger shark, Galeocerdo cuvieri',
4: 'hammerhead, hammerhead shark',
5: 'electric ray, crampfish, numbfish, torpedo',
6: 'stingray',
7: 'cock',
8: 'hen',
9: 'ostrich, Struthio camelus',
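# Because the file is literally a Python dict, a common way to use it is to parse
# the text with ast.literal_eval. The sketch below assumes the file has been saved
# locally as imagenet1000_clsidx_to_labels.txt; the variable names are illustrative.
import ast

with open('imagenet1000_clsidx_to_labels.txt') as f:
    idx_to_label = ast.literal_eval(f.read())

print(idx_to_label[1])  # 'goldfish, Carassius auratus'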
@karpathy
karpathy / min-char-rnn.py
Last active May 6, 2024 16:42
Minimal character-level language model with a Vanilla Recurrent Neural Network, in Python/numpy
"""
Minimal character-level Vanilla RNN model. Written by Andrej Karpathy (@karpathy)
BSD License
"""
import numpy as np
# data I/O
data = open('input.txt', 'r').read() # should be simple plain text file
chars = list(set(data))
data_size, vocab_size = len(data), len(chars)
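# To make the data I/O excerpt concrete, here is a hedged sketch of the
# character-index bookkeeping and one-hot encoding that a character-level RNN
# needs; the variable names and the toy string are illustrative, not quoted from
# the gist's continuation.
chars = list(set("hello world"))
char_to_ix = {ch: i for i, ch in enumerate(chars)}
ix_to_char = {i: ch for i, ch in enumerate(chars)}

# One-hot encode a single character as a column vector, one RNN input step.
x = np.zeros((len(chars), 1))
x[char_to_ix['h']] = 1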