@mdeous
mdeous / asynbot.py
Created January 6, 2011 17:13
Asynchronous IRC library
# -*- coding: utf-8 -*-
#TODO: implement commands: oper, mode, topic, names, list, invite
#TODO: '' '' : kick, who, whois, whowas, kill, away
#TODO: '' '' : rehash, restart, wallops,
import logging
from asynchat import async_chat
from datetime import datetime
from re import compile as re_compile
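The preview above stops at the imports; as a rough illustration of the asynchat approach the library builds on, here is a minimal, hedged sketch of an asynchat-based IRC connection (class name, server, and handlers are illustrative, not the gist's actual API; asynchat itself was removed in Python 3.12):

import asynchat
import asyncore
import socket

class IRCConnectionSketch(asynchat.async_chat):
    """Hedged sketch: connect, register a nick, answer PINGs, print lines."""

    def __init__(self, host, port, nick):
        asynchat.async_chat.__init__(self)
        self.nick = nick
        self.buffer = []
        self.set_terminator(b"\r\n")  # IRC messages are CRLF-terminated
        self.create_socket(socket.AF_INET, socket.SOCK_STREAM)
        self.connect((host, port))

    def handle_connect(self):
        # minimal registration: NICK then USER
        self.push(("NICK %s\r\n" % self.nick).encode())
        self.push(("USER %s 0 * :%s\r\n" % (self.nick, self.nick)).encode())

    def collect_incoming_data(self, data):
        self.buffer.append(data)

    def found_terminator(self):
        line = b"".join(self.buffer).decode("utf-8", "replace")
        self.buffer = []
        if line.startswith("PING"):
            # answer server keepalives so the connection stays open
            self.push(("PONG" + line[4:] + "\r\n").encode())
        else:
            print(line)

if __name__ == "__main__":
    IRCConnectionSketch("irc.example.org", 6667, "asyndemo")  # example server, not from the gist
    asyncore.loop()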
@mdeous
mdeous / conventions.py
Created January 7, 2011 13:22
python code conventions
# put builtin libs first (and direct imports before 'from' imports)
import os
import sys
from threading import Thread
# then put 3rd party imports
import feedparser
from BeautifulSoup import BeautifulSoup
# and then project internal imports
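The preview cuts off before the third group; a hedged example of how a full import block following this convention might look, with blank lines between the groups (the project-internal names are hypothetical):

import os
import sys
from threading import Thread

import feedparser
from BeautifulSoup import BeautifulSoup

from myproject import settings          # hypothetical project-internal imports
from myproject.utils import slugify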
@mdeous
mdeous / gist:798756
Created January 27, 2011 16:38
perl 2 py
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import sys
from base64 import b64decode
from struct import unpack
if len(sys.argv) != 2:
    print("Usage: %s HASH" % sys.argv[0])
    exit(1)
@mdeous
mdeous / gist:799109
Created January 27, 2011 19:56
Perl SSHA
#!/usr/bin/perl
use strict;
use warnings;
use MIME::Base64;
# ssha uses sha1($pass.$salt)
my $line = "{SSHA}EOmijWjrWh9KVWSDWb6hq4Hd3UFMcWIyOVRCUw==";
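Following the "sha1($pass.$salt)" note above, a minimal Python sketch of checking a password against such an {SSHA} hash, in the spirit of the "perl 2 py" gist (the candidate password below is hypothetical):

from base64 import b64decode
from hashlib import sha1

def check_ssha(password, ssha_hash):
    # strip the {SSHA} prefix, then split the base64 payload into digest + salt
    raw = b64decode(ssha_hash[len("{SSHA}"):])
    digest, salt = raw[:20], raw[20:]  # SHA-1 digests are 20 bytes
    return sha1(password + salt).digest() == digest

print(check_ssha(b"secret", "{SSHA}EOmijWjrWh9KVWSDWb6hq4Hd3UFMcWIyOVRCUw=="))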
@mdeous
mdeous / gist:823821
Created February 12, 2011 15:27
example: parsing html with lxml and xpath
import lxml.html
html = lxml.html.parse("http://pypi.python.org/pypi") # can take a URL, a filename, or an object with a .read() method
packages = html.xpath('//tr/td/a/text()') # get the text inside all "<tr><td><a ...>text</a></td></tr>"
print packages
Out[7]:
[u'celery\xa02.2.3',
u'django-celery\xa02.2.3',
u'kombu\xa01.0.3',
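The \xa0 in each entry is a non-breaking space separating the package name from its version; a short hedged follow-up splitting the two:

for entry in packages:
    name, _, version = entry.partition(u'\xa0')  # e.g. 'celery', '2.2.3'
    print(name, version)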
@mdeous
mdeous / gist:849512
Created March 1, 2011 17:25
Sublime - Wrong Way
Annie's twelve years old in two more she'll be a whore
Nobody ever told her it's the wrong way
Don't be afraid with the quickness you get laid
For your family get paid, it's the wrong way
I gave her all that I had to give
I'm gonna make it hard to live
Salty tears running down to her chin
And it ruins up her make up, I never wanted
@mdeous
mdeous / gist:866667
Created March 11, 2011 22:09
threads synchronization
from async import async
from time import sleep
class ThreadSyncer(object):
    def __init__(self, threads_amount):
        self.amount = threads_amount
        self.finished_count = 0
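The preview stops after the counters; a hedged sketch (not the gist's actual code) of how such a synchronizer can be finished with standard threading primitives:

import threading

class ThreadSyncerSketch(object):
    """Hedged sketch: let workers report completion, block until all are done."""

    def __init__(self, threads_amount):
        self.amount = threads_amount
        self.finished_count = 0
        self._lock = threading.Lock()
        self._all_done = threading.Event()

    def thread_finished(self):
        # each worker calls this when its job is over
        with self._lock:
            self.finished_count += 1
            if self.finished_count >= self.amount:
                self._all_done.set()

    def wait(self, timeout=None):
        # the coordinating thread blocks here until every worker has checked in
        return self._all_done.wait(timeout)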
# container is just a very basic class i use to store objects used in many parts of the code
container.scheduler.add_jobstore(ShelveJobStore(settings.CRONJOBS_DB_FILE), 'crondb')
# this is the part of code used to add jobs
if name == 'reindexer':
    target_func = reindex
    kwargs = {}
else:
    target_func = container.rpc.run_spider
    kwargs = {
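The dict is cut off above; assuming the APScheduler 2.x API this snippet appears to target, scheduling the chosen job against the persistent 'crondb' store would look roughly like this (the cron fields are illustrative):

container.scheduler.add_cron_job(
    target_func,
    kwargs=kwargs,
    hour='*', minute=0,   # illustrative schedule: top of every hour
    jobstore='crondb',    # persist into the ShelveJobStore added above
)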
<ikarios> bon j'ai réussi à lance jtr
<grimmjowbo> bah maintenant cherche
<ikarios> mais je ne sais pas l'utiliser alors si qqun a un bon tuto sur le soft ce ne serais pas de refus
<ikarios> merci grimmjowbo j'avais compris lol
<Shiney> ikarios j'ai pas suivi tu veux cracker quelque chose?
<MatToufoutu> tu veux faire quoi en fait?
<ikarios> oui je veux craker lme nsa
<ikarios> la nsa
<ikarios> nan je rigole
<Shiney> ha ba la faut que je te file un tool
@mdeous
mdeous / gist:909853
Created April 8, 2011 13:40
sitemap spider
from scrapy.contrib.spiders import XMLFeedSpider
class SitemapSpider(XMLFeedSpider):
    name = "sitemap"
    namespaces = [
        # ('', 'http://www.sitemaps.org/schemas/sitemap/0.9'),
        ('video', 'http://www.sitemaps.org/schemas/sitemap-video/1.1'),
    ]
    start_urls = ["http://mattoufoutu.rafale.org/sample_sitemap.xml"]
    itertag = 'url'
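The preview ends before the callback; a hedged sketch of the parse_node method such a spider needs, written against current Scrapy (the gist used the old scrapy.contrib import path) and following the documented pattern for namespaced sitemaps; the extracted fields are illustrative:

from scrapy.spiders import XMLFeedSpider

class SitemapSpiderSketch(XMLFeedSpider):
    name = "sitemap-sketch"
    iterator = 'xml'  # namespace-aware iterator, needed for namespaced sitemaps
    namespaces = [
        ('sm', 'http://www.sitemaps.org/schemas/sitemap/0.9'),
        ('video', 'http://www.sitemaps.org/schemas/sitemap-video/1.1'),
    ]
    start_urls = ["http://mattoufoutu.rafale.org/sample_sitemap.xml"]
    itertag = 'sm:url'

    def parse_node(self, response, node):
        # called once per <url> element; which fields exist depends on the sitemap
        return {
            'loc': node.xpath('sm:loc/text()').get(),
            'video_title': node.xpath('video:video/video:title/text()').get(),
        }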