@eykanal
eykanal / autoscroll.user.js
Last active April 1, 2024 14:34
Automatically scroll to the bottom of the page on Twitter and Facebook
// ==UserScript==
// @name AutoScroll
// @namespace https://gist.github.com/eykanal/a0b30e035d8c15995deeffec6ab21866
// @version 0.1
// @description Automatically scroll to the bottom of the page
// @author Eliezer Kanal
// @match https://www.facebook.com/search/str/*
// @match https://twitter.com/search*
// @grant none
// ==/UserScript==
eykanal / solidity_bugs.md
Last active November 29, 2017 21:18
Bugs in Solidity

unchecked send

if (gameHasEnded && !( prizePaidOut ) ) {
  winner.send(1000); // send a prize to the winner; BUG: send() returns false on failure, and that return value is never checked
  prizePaidOut = true;
}
eykanal / confirmation_bias.md
Last active August 22, 2017 13:28
Selection of text from http://www.hpmor.com/chapter/8 describing the confirmation bias.

"Now the way this game works," said the boy, "is that you give me a triplet of three numbers, and I'll tell you 'Yes' if the three numbers are an instance of the rule, and 'No' if they're not. I am Nature, the rule is one of my laws, and you are investigating me. You already know that 2-4-6 gets a 'Yes'. When you've performed all the further experimental tests you want - asked me as many triplets as you feel necessary - you stop and guess the rule, and then you can unfold the sheet of paper and see how you did. Do you understand the game?"

"Of course I do," said Hermione.

"Go."

"4-6-8," said Hermione.

"Yes," said the boy.

eykanal / gist:4683189
Created January 31, 2013 14:26
Script used to track progress on thesis work. Relies on svn and ploticus.
#!/bin/sh
# remember where we started, then move to the thesis working copy
currDir=`pwd`
cd /Users/eliezerk/Documents/Research-grad/thesis/thesis-tex-svn/
# get current revision
revisions=`svnversion | sed 's/^[A-Z0-9]*://'`
isModified=`echo $revisions | sed -e 's/^[0-9A-Z]*://' -e 's/[0-9]//g'`
if [ "$isModified" = "M" ] ; then
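The preview cuts off at that `if`; the two `sed` calls above it do the real work of splitting `svnversion` output into a revision number and a "locally modified" flag. A runnable sketch of that parsing against a hypothetical `svnversion` string (`1234:1250M` denotes a mixed-revision checkout with local modifications):

```shell
#!/bin/sh
sample="1234:1250M"  # hypothetical svnversion output; the real script calls svnversion
revision=`echo $sample | sed 's/^[A-Z0-9]*://'`  # strip the low revision, leaving 1250M
isModified=`echo $revision | sed 's/[0-9]//g'`   # keep only the flag letters, leaving M
if [ "$isModified" = "M" ] ; then
  echo "working copy has uncommitted changes"
fi
```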
// ==UserScript==
// @name Toodledo Add Goal Count
// @namespace
// @include http://www.toodledo.com/organize/goals.php
// @include https://www.toodledo.com/organize/goals.php
// ==/UserScript==
// for each higher-order goal:
// count how many lower-order goals have that stated as a higher-order goal
// display that number next to the higher-order goal
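The counting step those comments describe is independent of the Toodledo page scraping. As an illustration of that tally only (the goal names here are invented, and the real gist works on the page's DOM in JavaScript rather than on text lines):

```shell
#!/bin/sh
# Given "subgoal parent" pairs, count how many subgoals name each
# higher-order goal as their parent, then print "parent count".
printf '%s\n' "run-5k health" "eat-well health" "read-paper career" |
  awk '{ count[$2]++ } END { for (g in count) print g, count[g] }' |
  sort
```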
eykanal / run_tSSS.sh
Created February 14, 2012 02:31
Analyze MEG data and email when complete
#!/bin/sh
#
# The script should be run with the directory containing files to be analyzed.
# For each file in that directory, the script will run maxfilter with the
# parameters defined below, and save it to the output directory defined in
# SAVE_DIR. It will then email the address specified at the bottom
# (you@example.com) with the analysis log, containing the results of the
# analysis.
#
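Only the header comment survives in this preview. A sketch of the loop it describes, with placeholder directories and placeholder maxfilter arguments (`-f`/`-o` for input/output, as I recall MaxFilter's usage; the gist's actual parameter list is not shown here), and the mail step left commented out:

```shell
#!/bin/sh
# Hypothetical paths; the real script defines SAVE_DIR and the maxfilter
# parameters above this point, and takes the input directory as an argument.
DATA_DIR=/data/meg_raw
SAVE_DIR=/data/maxfilter_out
LOG=/tmp/maxfilter_$$.log
: > "$LOG"
for f in "$DATA_DIR"/*.fif ; do
  [ -e "$f" ] || continue  # skip if the glob matched nothing
  out="$SAVE_DIR/`basename "$f" .fif`_tsss.fif"
  maxfilter -f "$f" -o "$out" >> "$LOG" 2>&1
done
# mailx -s "maxfilter run complete" you@example.com < "$LOG"
```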
eykanal / setup_gitconfig.rb
Created October 31, 2011 21:32 — forked from manveru/setup_gitconfig.rb
Create basic .gitconfig file
#!/usr/bin/ruby
gitconfig = File.expand_path("~/.gitconfig")
if File.file?(gitconfig)
puts "#{gitconfig} already exists."
exit 0
end
puts('##',
eykanal / setup_gitconfig.rb
Created October 31, 2011 21:14
Create basic .gitconfig file
#!/usr/bin/ruby
if File.exist?(File.expand_path("~/.gitconfig"))
puts "~/.gitconfig file already exists, exiting."
exit false
end
puts '##'
puts '## Git stores each entry with your name, email, and a unique identifier.'
puts '## The following will set up Git with this information, as well as some '
eykanal / mailx_disk_monitor.sh
Created June 27, 2011 14:35
Monitor disk space, send email when usage reaches a threshold
#!/bin/sh
# How full should the disk be allowed to get (percentage)?
warn_at=90
# get disk space info for all disks
df | grep "/dev/disk" | awk '{ print $1,$3,$4,$5,$6 }' > /tmp/space
# for each disk, determine if full
while read line
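Each line written to `/tmp/space` carries the fields device, used, available, capacity, and mount point (per the `awk` print above). A sketch of what the loop body presumably does with one such line, using an invented sample value; at 84% against a 90% threshold the alert does not fire:

```shell
#!/bin/sh
warn_at=90
line="/dev/disk1 1000 200 84% /"  # sample line in the /tmp/space format
pct=`echo $line | awk '{ print $4 }' | tr -d '%'`
mount=`echo $line | awk '{ print $5 }'`
if [ "$pct" -ge "$warn_at" ] ; then
  echo "disk at $mount is ${pct}% full" | mailx -s "disk space warning" you@example.com
fi
```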
eykanal / mailx_site_monitor.sh
Created June 27, 2011 14:27
Simple tool using curl, mailx, and a crontab to monitor whether a site is up
#!/bin/sh
# How many seconds does it usually take your site to respond?
sec=2
# Try to download the site to a file
curl -s -m $sec example.org > /tmp/dltime
# Get size of file
a=`wc -m /tmp/dltime | awk '{print $1}'`
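The remainder of the script presumably compares that size to zero: an empty file means curl gave up within the time limit. A sketch of that check, with the failed download simulated so the sketch runs without network access, and the mailx call left commented out:

```shell
#!/bin/sh
sec=2
# The gist leaves the download in /tmp/dltime; an empty file means curl
# hit the -m timeout (or the host refused), i.e. the site is down.
printf '' > /tmp/dltime  # simulate a failed download
a=`wc -m /tmp/dltime | awk '{print $1}'`
if [ "$a" -eq 0 ] ; then
  echo "site down: no response within ${sec}s"
  # in the real script, something like: ... | mailx -s "site down" you@example.com
fi
```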