Tuatini Godard (EKami)

🏠 Working from home
Date,Time,"Virtual Memory Committed [MB]","Virtual Memory Available [MB]","Virtual Memory Load [%]","Physical Memory Used [MB]","Physical Memory Available [MB]","Physical Memory Load [%]","Page File Usage [%]","Core VIDs (avg) [V]","Core 0 VID [V]","Core 1 VID [V]","Core 2 VID [V]","Core 3 VID [V]","Core 4 VID [V]","Core 5 VID [V]","Core 6 VID [V]","Core 7 VID [V]","Core 8 VID [V]","Core 9 VID [V]","Core 10 VID [V]","Core 11 VID [V]","Core 12 VID [V]","Core 13 VID [V]","Core 14 VID [V]","Core 15 VID [V]","Core Clocks (avg) [MHz]","Core 0 Clock (perf #1/2) [MHz]","Core 1 Clock (perf #1/1) [MHz]","Core 2 Clock (perf #5/6) [MHz]","Core 3 Clock (perf #6/7) [MHz]","Core 4 Clock (perf #4/5) [MHz]","Core 5 Clock (perf #7/8) [MHz]","Core 6 Clock (perf #2/3) [MHz]","Core 7 Clock (perf #3/4) [MHz]","Core 8 Clock (perf #13/14) [MHz]","Core 9 Clock (perf #14/15) [MHz]","Core 10 Clock (perf #15/16) [MHz]","Core 11 Clock (perf #12/13) [MHz]","Core 12 Clock (perf #10/11) [MHz]","Core 13 Clock (perf #11/12) [MHz]","Core 14 C
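As an aside, a minimal sketch (not part of the gist) of how a HWiNFO CSV log with these columns could be loaded for inspection; the file name, the latin-1 encoding and the skipped trailer rows are assumptions:

# Hedged sketch: inspect a HWiNFO sensor log with pandas.
import pandas as pd

df = pd.read_csv("hwinfo_log.csv", encoding="latin-1", on_bad_lines="skip")
# HWiNFO writes Date and Time as separate columns; merge them into one timestamp
df["Timestamp"] = pd.to_datetime(df["Date"] + " " + df["Time"], errors="coerce")
# Quick look at memory load and average core voltage over time
print(df[["Timestamp", "Physical Memory Load [%]", "Core VIDs (avg) [V]"]].head())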
#!/bin/bash
set -e
# Abort early unless AWS credentials were supplied through the environment
if [ -z "$AWS_ACCESS_KEY_ID" ] || [ -z "$AWS_SECRET_ACCESS_KEY" ]; then
    echo "Please provide AWS credentials"
    exit 1
fi
# The IAM role to assume must be provided as well
if [ -z "$IAM_ROLE_ARN" ]; then
@EKami
EKami / launch_focus_min_wayland.sh
Last active October 19, 2022 07:18
Open and close Terminator on GNOME 3 Wayland
#!/bin/bash
# https://superuser.com/questions/142945/bash-command-to-focus-a-specific-window/1627429#1627429
# To use, set a shortcut in Gnome like this: bash -c '/home/user/launch_focus_min_wayland.sh terminator'
# To check the available function names, do ALT + F2 and type "lg"
app=$1
if [[ $app == terminator ]]; then
    process_name=/usr/bin/terminator
else
    process_name=$app
-----BEGIN PGP PUBLIC KEY BLOCK-----
mQENBF6eYPcBCADEhBpvhJ9/C6psD031FwUK9/32AHCTMd83B0SIzke55tKKK7JM
6NcSzjux8qw9d1zB3NJ7ziMok3pMSVHEX6QgJOb1O2/PZq7oI60Oao0+O9uTWOaO
Gu9B8rv69tdnu66jdM9copw0w6eokGmQId4pgM0xNSgLgEvh3NV09M0VP49C39XC
D7AF+/VfDjKbaVI5U7yp/KYz6wVnZo9LjEkD/lOga9DEiIdOAJnbUhDVj9S2ZXhW
QE7IO8rUOcUc8AvJXAPaqKQmB2EW6CwUyW3ucm2PTwFVXnH5cu3YhHBbrqUS4lzX
j54Z5MHgX7IjOftElQM0bKTZ8HwPpVUr5OTJABEBAAG0UkdPREFSRCBUdWF0aW5p
IChUaGUgR1BHIGtleSBmb3IgbXkgd2Vic2l0ZSBhbmQgZW1haWwgc2lnbmF0dXJl
KSA8ZW1haWxAdHVhdGluaS5tZT6JAU4EEwEIADgWIQSW3rJ7WQ+eYBAPigy80Dgv
@EKami
EKami / amazon_forest_notebook.ipynb
Created July 9, 2018 15:32
Planet: Understanding the Amazon deforestation from Space challenge
mask_img = np.array(Image.open(mask.file, mode='r').convert('1'))
# Absolute WSI coordinates
dx = max(patch_x, mask.x)
dx2 = min(patch_x + self.crop_size, mask.x + mask.width)
dy = max(patch_y, mask.y)
dy2 = min(patch_y + self.crop_size, mask.y + mask.height)
# Make the coordinates relative to the mask position
dx = dx - mask.x
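The preview cuts off here; for clarity, the same patch/mask intersection arithmetic as a self-contained sketch (not the gist's actual code; crop_size and the mask fields are taken from the snippet above, everything else is assumed):

# Hedged sketch: overlap between a square patch and a rectangular mask,
# first in absolute WSI coordinates, then relative to the mask position.
from collections import namedtuple

Mask = namedtuple("Mask", ["x", "y", "width", "height"])

def patch_mask_overlap(patch_x, patch_y, crop_size, mask):
    # Intersection rectangle in absolute whole-slide-image coordinates
    dx = max(patch_x, mask.x)
    dx2 = min(patch_x + crop_size, mask.x + mask.width)
    dy = max(patch_y, mask.y)
    dy2 = min(patch_y + crop_size, mask.y + mask.height)
    # Shift so the coordinates are relative to the mask's top-left corner
    return dx - mask.x, dx2 - mask.x, dy - mask.y, dy2 - mask.y

# Example: a 512 px patch at (1000, 2000) against a 300x300 mask at (900, 1900)
print(patch_mask_overlap(1000, 2000, 512, Mask(900, 1900, 300, 300)))  # (100, 300, 100, 300)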
Traceback (most recent call last):
  File "/Users/Ekami/Programs/anaconda/envs/dl/lib/python3.6/site-packages/django/core/handlers/exception.py", line 35, in inner
    response = get_response(request)
  File "/Users/Ekami/Programs/anaconda/envs/dl/lib/python3.6/site-packages/django/core/handlers/base.py", line 128, in _get_response
    response = self.process_exception_by_middleware(e, request)
  File "/Users/Ekami/Programs/anaconda/envs/dl/lib/python3.6/site-packages/django/core/handlers/base.py", line 126, in _get_response
    response = wrapped_callback(request, *callback_args, **callback_kwargs)
  File "/Users/Ekami/Programs/anaconda/envs/dl/lib/python3.6/site-packages/django/views/generic/base.py", line 69, in view
    return self.dispatch(request, *args, **kwargs)
  File "/Users/Ekami/Programs/anaconda/envs/dl/lib/python3.6/site-packages/django/views/generic/base.py", line 89, in dispatch
Error: Failed to start algorithm - Traceback (most recent call last):
  File "/opt/algorithm/bin/pipe.py", line 14, in <module>
    algorithm = __import__('src.'+config['algoname'], fromlist=["apply"])
  File "/opt/algorithm/src/srgan_algorithmia.py", line 6, in <module>
    from torchlite.eval import eval
  File "/opt/algorithm/dependencies/torchlite/eval/eval.py", line 5, in <module>
    from torchlite.data.datasets.srgan import EvalDataset
  File "/opt/algorithm/dependencies/torchlite/data/datasets/srgan.py", line 3, in <module>
    import torchlite.nn.transforms as ttransforms
  File "/opt/algorithm/dependencies/torchlite/nn/transforms.py", line 20, in <module>
import os
import requests
from io import BytesIO
import Algorithmia
from Algorithmia.acl import ReadAcl
from torchlite.eval import eval
from pathlib import Path
from PIL import Image
import uuid
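The preview stops at the imports; a hedged sketch of how they are typically combined in such a handler (the URL input, the /tmp working directory and the helper name are assumptions, and the Algorithmia client and torchlite.eval calls are left out because their usage isn't shown):

# Hedged sketch only, not the gist's actual code.
import uuid
from io import BytesIO
from pathlib import Path

import requests
from PIL import Image


def fetch_input_image(url, work_dir="/tmp"):
    # Download the image referenced by the input URL into memory
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    img = Image.open(BytesIO(response.content)).convert("RGB")
    # Persist it under a unique name so concurrent calls don't collide
    local_path = Path(work_dir) / f"{uuid.uuid4()}.png"
    img.save(local_path)
    return img, local_path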