Andrei Anton (NeuronQ)

🎯
Focusing
View GitHub Profile
NeuronQ / ssh-agent-autostart.sh
Created June 20, 2016 04:30
ssh-agent autostart (bash) - add this to .bashrc
SSH_ENV="$HOME/.ssh/environment"
function start_agent {
    echo "Initialising new SSH agent..."
    # save the agent's env vars to a file, commenting out its "echo Agent pid ..." line
    /usr/bin/ssh-agent | sed 's/^echo/#echo/' > "${SSH_ENV}"
    echo succeeded
    chmod 600 "${SSH_ENV}"
    . "${SSH_ENV}" > /dev/null
    /usr/bin/ssh-add;
}
# reuse the running agent if its environment file exists, otherwise start a new one
if [ -f "${SSH_ENV}" ]; then
    . "${SSH_ENV}" > /dev/null
    ps -p "${SSH_AGENT_PID}" > /dev/null 2>&1 || start_agent
else
    start_agent
fi
NeuronQ / python-shell-like-script.py
Created April 24, 2017 21:05
Boilerplate for writing "shell-like" Python scripts
#!/usr/bin/env python
# compatible with both Python 2 and 3
from __future__ import absolute_import, division, print_function, unicode_literals
try:
    input = raw_input  # Python 2: make input() behave like Python 3's input()
except NameError:
    pass  # Python 3: raw_input doesn't exist, input() already does the right thing
import sys
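The preview cuts off after the imports; below is a minimal sketch, assumed for illustration and not part of the original gist, of the kind of shell-out helper such boilerplate typically leads up to (the sh helper name and the ls example command are invented):

# shell_like_example.py -- hypothetical continuation, for illustration only
import subprocess
import sys

def sh(cmd):
    # run a shell command, echo it first, and return its stdout as text
    print("$ " + cmd)
    return subprocess.check_output(cmd, shell=True).decode("utf-8")

if __name__ == "__main__":
    target = sys.argv[1] if len(sys.argv) > 1 else "."
    print(sh("ls -la " + target))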
NeuronQ / screen-cheatsheet.sh
Last active August 17, 2017 13:46
UNIX Screen cheatsheet
#!/bin/cat
# Intro to using UNIX Screen

## Basics
screen -list          # LIST current screen sessions
screen -S myscreen1   # CREATE a new session named *myscreen1*
screen -r myscreen1   # REATTACH to the detached session named *myscreen1*
# From inside a session, press Ctrl-a d to DETACH and leave it running
NeuronQ / widows-remap-capslock2ctrl.ps
Created January 13, 2018 20:51
Run in PowerShell as Admin; changes take effect after reboot
REGEDIT4
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Keyboard Layout]
"Scancode Map"=hex:00,00,00,00,00,00,00,00,02,00,00,00,1d,00,3a,00,00,00,00,00
NeuronQ / windows-remap-capslock2ctrl.reg
Last active January 13, 2018 20:52
Save in a file with a .reg extension. Importing requires Admin privileges (use an Admin user or run Regedit as Admin)
REGEDIT4
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Keyboard Layout]
"Scancode Map"=hex:00,00,00,00,00,00,00,00,02,00,00,00,1d,00,3a,00,00,00,00,00
# sync_scrape.py (needs Python 3.7+)
import time, re, requests

def fetch_url(url):
    t = time.perf_counter()
    html = requests.get(url).text
    print(f"time of fetch_url({url}): {time.perf_counter() - t:.2f}s")
    return html

def scrape_data(html):
    # assumed body for illustration (the original is truncated here): pull out the page <title>
    m = re.search(r"<title[^>]*>(.*?)</title>", html, re.S | re.I)
    return m.group(1).strip() if m else ""
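A minimal sketch of how these synchronous helpers would typically be driven; the URL list, module import, and overall timing below are assumptions added for illustration:

# sync driver sketch (not part of the gist); assumes it sits next to sync_scrape.py
import time
from sync_scrape import fetch_url, scrape_data

urls = [f"https://example.com/page/{i}" for i in range(1, 4)]  # made-up URLs

t = time.perf_counter()
results = [scrape_data(fetch_url(url)) for url in urls]  # one request after another
print(f"total: {time.perf_counter() - t:.2f}s")  # roughly the sum of the per-fetch times
print(results)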
// sync_scrape.js (tested with node 11.3)
const request = require("sync-request");

const fetchUrl = url => {
  console.time(`fetchUrl(${url})`);
  const html = request("GET", url).getBody();
  console.timeEnd(`fetchUrl(${url})`);
  return html;
};
// callbacks_async_scrape.js (tested with node 11.3)
const http = require('http');
const https = require('https');

const fetchUrl = (url, onSuccess, onError) => {
  console.time(`fetchUrl(${url})`);
  (url.indexOf('https') === 0 ? https : http).get(url, resp => {
    let html = '';
    resp.on('data', chunk => html += chunk);
    resp.on('end', () => {
      // assumed completion (the preview is truncated): stop the timer, hand back the page
      console.timeEnd(`fetchUrl(${url})`);
      onSuccess(html);
    });
  }).on('error', onError);
};
# async_scrape.py (requires Python 3.7+)
import asyncio, random, time

async def fetch_url(url):
    print(f"~ executing fetch_url({url})")
    t = time.perf_counter()
    await asyncio.sleep(random.randint(1, 5))  # simulate network I/O with a random delay
    print(f"time of fetch_url({url}): {time.perf_counter() - t:.2f}s")
    return f"<em>fake</em> page html for {url}"
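A minimal sketch of how these coroutines would be run concurrently; the main() wrapper and URL list are assumptions added for illustration:

# async driver sketch (not part of the gist); assumes it sits next to async_scrape.py
import asyncio, time
from async_scrape import fetch_url

async def main():
    urls = [f"https://example.com/page/{i}" for i in range(1, 4)]  # made-up URLs
    t = time.perf_counter()
    # schedule all fetches at once; total time is close to the slowest fetch, not the sum
    pages = await asyncio.gather(*(fetch_url(url) for url in urls))
    print(f"total: {time.perf_counter() - t:.2f}s")
    print(pages)

asyncio.run(main())  # the Python 3.7+ entry point, matching the gist's version note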
// async_scrape.js (tested with node 11.3)
const sleep = ts => new Promise(resolve => setTimeout(resolve, ts * 1000));

async function fetchUrl(url) {
  console.log(`~ executing fetchUrl(${url})`);
  console.time(`fetchUrl(${url})`);
  await sleep(1 + Math.random() * 4);
  console.timeEnd(`fetchUrl(${url})`);
  return `<em>fake</em> page html for ${url}`;
}