Nigel nigelheap

@nigelheap
nigelheap / breadcrum_hell.php
Created June 8, 2011 05:08 — forked from nicksantamaria/gist:1013816
Pseudo-code for getting the parent page's title when using URLs as hierarchy
<?php
// Get the requested URL alias and explode it into an array.
$q = trim($_SERVER['REQUEST_URI'], '/');
$arg = explode('/', $q);
// Shorten the URL by one argument to get the parent path.
$newArg = array_slice($arg, 0, count($arg) - 1);
$parentPath = implode('/', $newArg);
// Get the menu item for the new URL (in Drupal 7, for example: $item = menu_get_item($parentPath);)

Share Counts

I have always struggled to get the various share buttons from Facebook, Twitter, Google Plus, Pinterest, etc. to align correctly and not look like a tacky explosion of buttons. After seeing a number of sites roll their own share buttons with counts (for example, The Next Web), I decided to look into the various APIs to see how to simply return the share count.

If you want to roll all of these up into a single jQuery plugin, check out Sharrre.

Many of these API calls and methods are undocumented, so anticipate that they will change in the future. Also, if you are planning on rolling these out across a site, I would recommend creating a simple endpoint that periodically caches results from all of the APIs so that you are not overloading the services with requests.
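As a minimal sketch of that caching idea (the `fetch_count` stub, cache path, and count value here are placeholders, not any documented share-count API):

```shell
#!/bin/sh
# Sketch of a small file cache in front of a share-count lookup.
# fetch_count is a stub standing in for the real API call (e.g. a curl
# against whichever service you query); the count value is made up.
TTL=300                               # seconds before the cache is stale
CACHE="${CACHE:-/tmp/share-count.json}"

fetch_count() {
    # Placeholder: a real version would curl the share-count service here.
    echo '{"url":"http://example.com","count":42}'
}

cached_count() {
    now=$(date +%s)
    # GNU stat uses -c %Y, BSD stat uses -f %m; fall back to 0 (always stale).
    mtime=$(stat -c %Y "$CACHE" 2>/dev/null || stat -f %m "$CACHE" 2>/dev/null || echo 0)
    if [ ! -f "$CACHE" ] || [ $(( now - mtime )) -ge "$TTL" ]; then
        fetch_count > "$CACHE"        # missing or stale: refresh the cache
    fi
    cat "$CACHE"
}

cached_count
```

Running this behind a single endpoint means each upstream service is hit at most once per TTL window, no matter how many page views you serve.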

Twitter

@nigelheap
nigelheap / git-log2json.sh
Created April 13, 2016 19:15 — forked from textarcana/git-log2json.sh
Convert Git logs to JSON. The first script (git-log2json.sh) is all you need, the other two files contain only optional bonus features :)
#!/usr/bin/env bash
# Use this one-liner to produce a JSON literal from the Git log:
git log \
--pretty=format:'{%n "commit": "%H",%n "author": "%an <%ae>",%n "date": "%ad",%n "message": "%f"%n},' \
"$@" | \
perl -pe 'BEGIN{print "["}; END{print "]\n"}' | \
perl -pe 's/},]/}]/'
@nigelheap
nigelheap / Robots Environment .htaccess
Last active June 30, 2017 22:08 — forked from chadclark/Robots Environment .htaccess
Robots.txt for Staging and Production. By adding these rewrite rules to your .htaccess file, robots_dev.txt will be served as robots.txt on any non-production server.
RewriteEngine On
# RewriteConds are ANDed by default; [OR] is required so that matching any one
# of the host patterns is enough to trigger the rule.
RewriteCond %{HTTP_HOST} \.dev$ [NC,OR]
RewriteCond %{HTTP_HOST} ^dev\. [NC,OR]
RewriteCond %{HTTP_HOST} \.uat$ [NC,OR]
RewriteCond %{HTTP_HOST} ^uat\. [NC]
RewriteRule ^robots\.txt$ robots_dev.txt [L]
@nigelheap
nigelheap / drush-s3cmd.sh
Last active July 5, 2017 23:44 — forked from chrisfree/drush-s3cmd.sh
Backup a Drupal site to Amazon S3 using Drush
# arg1 : bucket
# arg2 : project path
# arg3 : backup path
# arg4 : max s3
# Switch to the docroot (quoted in case the path contains spaces).
cd "$2"
# Backup the database.