Using Python's built-in defaultdict we can easily define a tree data structure:
from collections import defaultdict
def tree(): return defaultdict(tree)
That's it!
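A quick usage sketch of the one-liner above (the nested keys `alice` and `role` are arbitrary illustrative examples):

```python
from collections import defaultdict

def tree():
    return defaultdict(tree)

# Nested keys spring into existence on first access.
users = tree()
users['alice']['role'] = 'admin'

# Convert back to plain dicts for readable printing.
def dicts(t):
    return {k: dicts(v) if isinstance(v, defaultdict) else v
            for k, v in t.items()}

print(dicts(users))
```

Because each missing key produces another `tree`, arbitrarily deep paths can be assigned without pre-creating intermediate levels.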
#include <CoreFoundation/CoreFoundation.h>
AXUIElementRef app = AXUIElementCreateApplication(pid);
const lines = [
  'I am a string',
  'I am not a string',
  'Lies',
  'You got me',
];

function printLines(callback: () => void) {
  lines.forEach((line, i) => console.log(`Line ${i}: ${line}`));
  callback();
}
df['x'].str.lower()
def __iter__(self):
    for attr, value in self.__dict__.items():
        yield (attr, value)
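A minimal self-contained sketch of what such an `__iter__` buys you: an instance becomes iterable as (attribute, value) pairs, so `dict()` can consume it directly. The `Point` class here is a made-up example:

```python
class Point:
    def __init__(self, x, y):
        self.x = x
        self.y = y

    def __iter__(self):
        # Yield (attribute, value) pairs from the instance dict.
        for attr, value in self.__dict__.items():
            yield (attr, value)

p = Point(1, 2)
print(dict(p))  # attribute names mapped to their values
```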
groups = dict(list(gb))

   c1   c2
0   3   10
4   2  100
1   2   30
3   2   15
2   1   20
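A self-contained sketch of the pattern above: `dict(list(gb))` materializes a pandas groupby into a dict mapping each group key to its sub-DataFrame. The column values below are illustrative, chosen to match the sample output:

```python
import pandas as pd

df = pd.DataFrame({'c1': [3, 2, 1, 2, 2],
                   'c2': [10, 30, 20, 15, 100]})

gb = df.groupby('c1')

# dict(list(gb)) turns the iterator of (key, sub-DataFrame)
# pairs into a dict keyed by the group value.
groups = dict(list(gb))

print(groups[2])  # all rows where c1 == 2
```

Each value in `groups` keeps the original row index, which is why the sample output above shows a shuffled index.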
from xml.dom.minidom import parse
import urllib2
# Note - I convert it back into XML to pretty-print it.
print parse(urllib2.urlopen("http://search.twitter.com/search.atom?&q=python")).toprettyxml(encoding="utf-8")
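The snippet above is Python 2 (`urllib2`, print statement), and the Twitter Atom endpoint it queries has since been retired. The same pretty-printing idea in Python 3, sketched here against a literal XML string rather than a live URL (the feed content is a made-up stand-in):

```python
from xml.dom.minidom import parseString

xml_text = '<feed><entry><title>python</title></entry></feed>'

# Parse, then serialize back out with indentation to pretty-print.
pretty = parseString(xml_text).toprettyxml(indent='  ')
print(pretty)
```

For a live URL you would pass `urllib.request.urlopen(url)` to `parse()` instead of using `parseString`.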
This is a guide for installing the Torch machine learning ecosystem onto a GPU EC2 instance running Ubuntu 14.04.
Note: I have created and made available a Community EC2 AMI following these steps, with the name torch-ubuntu-14.04-cuda-7.0-28 and AMI ID ami-c79b7eac. Simply search for ami-c79b7eac in Community AMIs when creating an instance to get up and running quickly.
Preliminary steps:
Launch a g2.2xlarge or g2.8xlarge instance with the Ubuntu Server 14.04 LTS (HVM), SSD Volume Type (ami-d05e75b8) base AMI.

fprintf(stdout, "%s\n", [str UTF8String]);
fflush(stdout);