Malcolm Greaves (malcolmgreaves)

deploy_ide_via_notebook_api.sh
export CONTAINER_URI="gcr.io/deeplearning-platform-release/experimental.theia.1-7"
export INSTANCE_NAME=...
export PROJECT_NAME=...
export IMAGE_PROJECT="deeplearning-platform-release"
export IMAGE_FAMILY="theia-container-experimental"
export MACHINE_TYPE=... #"n1-standard-4"
export ZONE=... #"us-central1-a"
gcloud notebooks instances create "${INSTANCE_NAME}" \
  --project="${PROJECT_NAME}" \
  --location="${ZONE}" \
start_ide.sh
export CONTAINER_URI="gcr.io/deeplearning-platform-release/experimental.theia.1-7"
export INSTANCE_NAME=...
export PROJECT_NAME=...
export IMAGE_PROJECT="deeplearning-platform-release"
export IMAGE_FAMILY="theia-container-experimental"
export MACHINE_TYPE=... #"n1-standard-4"
export ZONE=... #"us-central1-a"
gcloud compute instances create "${INSTANCE_NAME}" \
  --project="${PROJECT_NAME}" \
  --zone="${ZONE}" \
@stefan-it
stefan-it / run_ner.py
Last active Apr 2, 2022
NER fine-tuning with PyTorch-Transformers (heavily based on https://github.com/kamalkraj/BERT-NER)
from __future__ import absolute_import, division, print_function
import argparse
import glob
import logging
import os
import random
import numpy as np
import torch
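The preview shows only the script's imports. As a hedged sketch of the pytorch-transformers API that BERT token-classification fine-tuning builds on (the model name and num_labels value here are illustrative, not taken from the gist):

from pytorch_transformers import BertForTokenClassification, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-cased")
model = BertForTokenClassification.from_pretrained("bert-base-cased", num_labels=9)

tokens = tokenizer.tokenize("[CLS] John lives in Berlin . [SEP]")
input_ids = torch.tensor([tokenizer.convert_tokens_to_ids(tokens)])
logits = model(input_ids)[0]  # (1, seq_len, num_labels); argmax gives per-token tags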
gist:7f876c6ad4e4adcd36caea98b159b6f6
import torch
from torch_geometric.data import InMemoryDataset
class MyOwnDataset(InMemoryDataset):
    def __init__(self, root, transform=None, pre_transform=None):
        super(MyOwnDataset, self).__init__(root, transform, pre_transform)
        self.data, self.slices = torch.load(self.processed_paths[0])

    @property
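    # The preview ends at the first @property. The standard InMemoryDataset
    # skeleton (per the PyTorch Geometric docs, not the gist itself) continues
    # roughly as follows; the file names are hypothetical:
    def raw_file_names(self):
        return ['my_raw_file.csv']  # hypothetical raw input

    @property
    def processed_file_names(self):
        return ['data.pt']

    def download(self):
        pass  # fetch raw files into self.raw_dir if needed

    def process(self):
        data_list = []  # build a list of torch_geometric.data.Data objects here
        data, slices = self.collate(data_list)
        torch.save((data, slices), self.processed_paths[0])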
@mollymerp
mollymerp / sshfs-gcp-instance-osx.md
Last active Jun 24, 2022
How to mount a GCP compute instance filesystem locally using `sshfs` on MacOS

How to mount a GCP compute instance filesystem locally using sshfs

This guide assumes that:

  • you already have an instance set up on GCP that you want to mount locally
  • the GCP CLI (gcloud) is installed on your local machine
  • you have authenticated locally to your Google account (gcloud auth login)
  1. Make sure your gcloud config is correct for the instance you're trying to access (a sketch of the usual commands follows):
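A sketch of the usual flow, assuming the host alias and mount point below are illustrative (gcloud compute config-ssh writes per-instance Host entries into ~/.ssh/config):

  gcloud config list                      # confirm the active account and project
  gcloud compute config-ssh               # generate SSH host aliases for your instances
  mkdir -p ~/mnt/my-instance
  sshfs my-instance.us-central1-a.my-project: ~/mnt/my-instance   # mounts the remote home directory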
@stevenhao
stevenhao / mode-presto-linter.js
Last active Mar 28, 2020
Lint PrestoDB SQL in Mode Analytics' web editor
// ==UserScript==
// @name PrestoDB Linter v0.1.3
// @namespace http://tampermonkey.net/
// @version 0.1
// @description try to take over the world!
// @author Steven Hao
// @match https://modeanalytics.com/editor/*
// @grant none
// ==/UserScript==
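// The preview ends with the metadata block; a userscript's body follows it,
// conventionally wrapped in an IIFE (the actual linter logic is not shown):
(function() {
    'use strict';
    // linter logic runs here on pages matching the @match pattern above
})();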
@HarshTrivedi
HarshTrivedi / pad_packed_demo.py
Last active Jun 26, 2022 — forked from Tushar-N/pad_packed_demo.py
Minimal tutorial on packing (pack_padded_sequence) and unpacking (pad_packed_sequence) sequences in pytorch.
import torch
from torch import LongTensor
from torch.nn import Embedding, LSTM
from torch.autograd import Variable
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence
## We want to run LSTM on a batch of 3 character sequences ['long_str', 'tiny', 'medium']
#
# Step 1: Construct Vocabulary
# Step 2: Load indexed data (list of instances, where each instance is list of character indices)
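# A sketch of the two steps above (closely following the well-known original
# demo; the rest of the gist is not shown in this preview):
seqs = ['long_str', 'tiny', 'medium']
vocab = ['<pad>'] + sorted(set(''.join(seqs)))                         # Step 1
vectorized_seqs = [[vocab.index(tok) for tok in seq] for seq in seqs]  # Step 2
seq_lengths = LongTensor([len(seq) for seq in vectorized_seqs])
seq_tensor = torch.zeros(len(vectorized_seqs), seq_lengths.max().item()).long()
for idx, (seq, seqlen) in enumerate(zip(vectorized_seqs, seq_lengths)):
    seq_tensor[idx, :seqlen] = LongTensor(seq)
# pack_padded_sequence expects sequences sorted by decreasing length (in older PyTorch)
seq_lengths, perm_idx = seq_lengths.sort(0, descending=True)
seq_tensor = seq_tensor[perm_idx]
packed = pack_padded_sequence(seq_tensor, seq_lengths.tolist(), batch_first=True)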
@malcolmgreaves
malcolmgreaves / download.sh
Created Apr 24, 2018
Bash function for a better CLI remote file download experience.
#!/bin/bash
# Bash function to download a file with wget: shows a progress bar and supports
# resuming if the download is interrupted. The filename can be determined
# automatically from the supplied URL or overridden on the command line.
# First argument is the URL.
# Second, optional argument is the filename.
download () {
  local URL="$1"
  local FI="$2"
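  # The gist preview truncates here; a plausible completion (not the verbatim
  # gist body): derive the filename from the URL when none is given, and let
  # wget resume partial downloads.
  if [[ -z "${FI}" ]]; then
    FI="$(basename "${URL}")"
  fi
  wget --continue --progress=bar -O "${FI}" "${URL}"
}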
@cbaziotis
cbaziotis / SelfAttention.py
Created Apr 21, 2018
SelfAttention implementation in PyTorch
import torch
from torch import nn
from torch.nn import Parameter

class SelfAttention(nn.Module):
    def __init__(self, attention_size, batch_first=False, non_linearity="tanh"):
        super(SelfAttention, self).__init__()

        self.batch_first = batch_first
        self.attention_weights = Parameter(torch.FloatTensor(attention_size))
        self.softmax = nn.Softmax(dim=-1)

        if non_linearity == "relu":
            self.non_linearity = nn.ReLU()
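        else:
            # The preview cuts off above; what follows is a minimal sketch of
            # the rest of the module, assuming the "tanh" default, not the
            # gist's verbatim code.
            self.non_linearity = nn.Tanh()

    def forward(self, inputs):
        # assumes batch_first=True, i.e. inputs is (batch, seq_len, attention_size)
        scores = self.non_linearity(inputs.matmul(self.attention_weights))
        weights = self.softmax(scores)                         # (batch, seq_len)
        representations = (inputs * weights.unsqueeze(-1)).sum(dim=1)
        return representations, weights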
pandas_cheatsheet.ipynb (notebook preview unavailable)