Benoît Allard (benallard)
🤹 Juggling
@benallard
benallard / .block
Last active February 1, 2022 12:37
Darts
license: gpl-3.0
height: 800
scrolling: no
border: no
@benallard
benallard / incidence.csv
Last active May 19, 2022 07:00
Statistic extraction from PDF Data
day,Alfhausen,Ankum,Bad Essen,Bad Iburg,Bad Laer,Bad Rothenfelde,Badbergen,Belm,Berge,Bersenbrück,Bippen,Bissendorf,Bohmte,Bramsche,Dissen a.T.W.,Eggermühlen,Fürstenau,Gehrde,Georgsmarienhütte,Glandorf,Hagen a.T.W.,Hasbergen,Hilter a.T.W.,Kettenkamp,Landkreis Osnabrück,Melle,Menslage,Merzen,Neuenkirchen,Nortrup,Ostercappeln,Quakenbrück,Rieste,Stadt Osnabrück,Voltlage,Wallenhorst
2020-11-06,393.1847968545216,498.8913525498891,482.4477877229734,95.49440055560379,205.44982698961937,163.24309617739084,132.89036544850498,318.7020136172679,214.88047273704004,188.4185403843738,197.82393669634024,146.5354825204103,237.28545440164518,160.50333846944017,117.89924973204715,623.2294617563739,173.2572360375051,40.20908725371934,164.1425872588188,248.17518248175185,112.73957158962796,44.77478284230321,68.47974955977304,176.47058823529412,186.93524149066863,83.12698974177573,237.52969121140143,221.67487684729065,87.56567425569177,168.18028927009755,325.2544329031581,398.9985917696761,184.27518427518427,152.96088571636682,1
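For a quick look at the table, the rows load straight into pandas. A minimal sketch (not part of the gist), assuming the data is saved locally as incidence.csv:

import pandas as pd

# Parse the "day" column as dates and use it as the index.
df = pd.read_csv("incidence.csv", parse_dates=["day"], index_col="day")

# Column name taken from the header above.
print(df["Stadt Osnabrück"].tail())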
@benallard
benallard / input
Last active March 29, 2020 14:50
Travelling Salesman based on historic data
+ABCDE
+CBDF
+ACDF
+BEACD
+ABDE
ABCDEF?
-ABCDE
BCDEF?
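A minimal reader for this input format, as a sketch only: the meaning of each line type is an assumption based on the visible prefixes (a leading '+' adds a historic route, a leading '-' removes one, and a trailing '?' marks a query).

import sys

def read_input(lines):
    routes = set()   # historic routes currently known (assumption)
    queries = []     # city sequences to answer, e.g. "ABCDEF" (assumption)
    for line in lines:
        line = line.strip()
        if line.startswith('+'):
            routes.add(line[1:])
        elif line.startswith('-'):
            routes.discard(line[1:])
        elif line.endswith('?'):
            queries.append(line[:-1])
    return routes, queries

if __name__ == "__main__":
    print(read_input(sys.stdin))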
@benallard
benallard / README.md
Created October 28, 2019 19:44
D3 scaleTime with proper DST

So I found a mistake on the wunderground site:

https://twitter.com/benoit__allard/status/1188425266160394240

And I looked around to see if I could fix it.

  • The data is correct (all points have a correct UTC timestamp).
  • d3 is being used.
  • The data is always shown in the "proper" timezone (the one of the station, not the local one), as sketched below.
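A minimal sketch of that last point, in Python rather than D3 (which is JavaScript), just to illustrate the idea: keep the stored timestamps in UTC and convert them to the station's timezone for display, so the DST transition falls where the station observes it. The station timezone here is a made-up example.

from datetime import datetime, timezone
from zoneinfo import ZoneInfo

station_tz = ZoneInfo("Europe/Berlin")  # assumed station timezone, for illustration only

# Two points around the 2019-10-27 DST change, stored as correct UTC timestamps.
before = datetime(2019, 10, 27, 0, 30, tzinfo=timezone.utc)
after = datetime(2019, 10, 27, 1, 30, tzinfo=timezone.utc)

print(before.astimezone(station_tz))  # 2019-10-27 02:30:00+02:00 (still CEST)
print(after.astimezone(station_tz))   # 2019-10-27 02:30:00+01:00 (already CET)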

Keybase proof

I hereby claim:

  • I am benallard on github.
  • I am benallard (https://keybase.io/benallard) on keybase.
  • I have a public key whose fingerprint is B146 7B59 28F2 5BF8 F90C 14C1 66E9 DC81 4FCE 1E3B

To claim this, I am signing this object:

include "alldifferent.mzn";
var 1..9: a;
var 1..9: b;
var 1..9: c;
var 1..9: d;
var 1..9: e;
var 1..9: f;
var 1..9: g;
var 1..9: h;
This patch displays the background.png image as the background of the drawing area.
If no image can be loaded, the previous behavior is kept.
diff -Npru delaunay 2/DelaunayAp.java delaunay/DelaunayAp.java
--- delaunay 2/DelaunayAp.java 2007-12-14 14:51:28.000000000 +0100
+++ delaunay/DelaunayAp.java 2015-02-14 16:03:21.000000000 +0100
@@ -22,6 +22,7 @@ package delaunay;
import java.awt.*;
@benallard
benallard / matrixbuilder.py
Created November 20, 2014 17:09
A buildbot utility class to create a bunch of Builders for a list of slaves.
class MatrixBuilders(object):
    """ Create a bunch of builders, one per slave, and the
    corresponding schedulers """

    def __init__(self, baseName, factory, slaveNames):
        """ Create a MatrixBuilder, with the factory for each slave """
        self._baseName = baseName
        self._factory = factory
        self._slaves = slaveNames

    def builderName(self, slaveName):
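The preview stops here. A hypothetical sketch of how the rest could continue; the naming scheme and the getBuilders helper are assumptions, only BuilderConfig is buildbot's real builder description (buildbot 0.8 API).

from buildbot.config import BuilderConfig

class MatrixBuildersSketch(object):
    """ Hypothetical continuation sketch, not the original gist code. """

    def __init__(self, baseName, factory, slaveNames):
        self._baseName = baseName
        self._factory = factory
        self._slaves = slaveNames

    def builderName(self, slaveName):
        # One builder per slave, e.g. "tests-slave1", "tests-slave2", ...
        return "%s-%s" % (self._baseName, slaveName)

    def getBuilders(self):
        # One BuilderConfig per slave, all sharing the same build factory.
        return [BuilderConfig(name=self.builderName(s),
                              slavenames=[s],
                              factory=self._factory)
                for s in self._slaves]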
@benallard
benallard / xml_split.py
Last active March 5, 2024 05:28
Small Python script to split huge XML files into parts. It takes one or two parameters: the first is always the huge XML file, the second the desired chunk size in kB (default 1 MB; 0 means split wherever possible). The generated files are named like the original one, with an index inserted between the filename and the extension, like: bigxml.…
#!/usr/bin/env python
import os
import xml.parsers.expat
from xml.sax.saxutils import escape
from optparse import OptionParser
from math import log10
# How much data we process at a time
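The preview cuts off here. A minimal sketch of the same idea, not the original gist: stream the file with expat and start a new output chunk between top-level children once the size threshold is passed. The file naming (index between base name and extension) and the 1 MB default follow the description above; everything else is an assumption (for instance, the original uses optparse while this sketch reads sys.argv directly).

import os
import sys
import xml.parsers.expat
from xml.sax.saxutils import escape, quoteattr

def split(path, chunk_kb=1024):
    base, ext = os.path.splitext(path)
    state = {"depth": 0, "index": 0, "out": None, "root": None}

    def open_chunk():
        state["index"] += 1
        state["out"] = open("%s.%d%s" % (base, state["index"], ext), "w")
        state["out"].write("<%s>" % state["root"])

    def close_chunk():
        state["out"].write("</%s>" % state["root"])
        state["out"].close()

    def start(name, attrs):
        if state["depth"] == 0:
            state["root"] = name
            open_chunk()
        else:
            text = "".join(" %s=%s" % (k, quoteattr(v)) for k, v in attrs.items())
            state["out"].write("<%s%s>" % (name, text))
        state["depth"] += 1

    def end(name):
        state["depth"] -= 1
        if state["depth"] == 0:
            close_chunk()
            return
        state["out"].write("</%s>" % name)
        # Roll over between top-level children once the chunk is big enough
        # (chunk_kb == 0 splits after every top-level element).
        if state["depth"] == 1 and state["out"].tell() > chunk_kb * 1024:
            close_chunk()
            open_chunk()

    def chars(data):
        if state["depth"] > 1:
            state["out"].write(escape(data))

    parser = xml.parsers.expat.ParserCreate()
    parser.StartElementHandler = start
    parser.EndElementHandler = end
    parser.CharacterDataHandler = chars
    with open(path, "rb") as f:
        parser.ParseFile(f)

if __name__ == "__main__":
    split(sys.argv[1], int(sys.argv[2]) if len(sys.argv) > 2 else 1024)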
@benallard
benallard / analysedump.py
Created October 25, 2013 23:16
Script to parse MegaDump logs from galileod (fitbit), and try to extract interesting information from them.
#!/usr/bin/env python
import sys
import time

def readdata():
    """ input is from stdin in the format of lines of long strings starting
    with a tab ('\t') representing the hexadecimal representation of the data
    (megadump) """
    d = []
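The preview ends here. A hypothetical completion of readdata(), an assumption since the gist is truncated at this point: collect the tab-indented hex lines from stdin and return the decoded bytes.

import binascii
import sys

def readdata():
    # Hypothetical completion sketch, not the original gist code.
    d = []
    for line in sys.stdin:
        if line.startswith('\t'):
            d.append(binascii.unhexlify(line.strip()))
    return b''.join(d)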