Mathias Gatti mathigatti

@mathigatti
mathigatti / number2pattern.py
Created June 7, 2019 16:28
Convert an input number into a pattern (The number is interpreted as the rules for a 2D Grid Cellular automata with Von Neumann neighbourhood)
import numpy as np
import cv2
ON = 255
OFF = 0
N = 200
def rule(neighbours, rules):
    if neighbours in rules:
        return ON
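The preview above cuts off before the grid update itself. A minimal sketch of one update step for a 2D automaton with a Von Neumann neighbourhood (the `step` helper is hypothetical, not part of the gist; it assumes wrap-around edges):

```python
import numpy as np

ON, OFF = 255, 0

def step(grid, rules):
    # One update of a 2D cellular automaton with a Von Neumann
    # neighbourhood: a cell turns ON when the count of its four
    # orthogonal ON neighbours is in `rules`.
    n, m = grid.shape
    new = np.full_like(grid, OFF)
    for i in range(n):
        for j in range(m):
            neighbours = sum(
                grid[(i + di) % n, (j + dj) % m] == ON
                for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)]
            )
            if neighbours in rules:
                new[i, j] = ON
    return new
```

Applying `step` repeatedly to a grid seeded from the input number's digits would produce the pattern the description mentions.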
@mathigatti
mathigatti / normalized_google_distance.py
Created June 9, 2019 12:16
Normalized Google Distance in Python
import requests
from bs4 import BeautifulSoup
import math
import sys
def number_of_results(text):
    headers = {'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36'}
    r = requests.get("https://www.google.com/search?q=" + text.replace(" ", "+"), params={"gl": "us"}, headers=headers)
    soup = BeautifulSoup(r.text, "lxml")
    res = soup.find("div", {"id": "resultStats"})
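The gist scrapes hit counts from Google; the distance itself combines the counts f(x), f(y), f(x, y) and the total page count N as NGD(x, y) = (max(log f(x), log f(y)) - log f(x, y)) / (log N - min(log f(x), log f(y))). A self-contained sketch of that final computation (the `ngd` helper and its arguments are hypothetical; in the gist the counts would come from `number_of_results`):

```python
import math

def ngd(fx, fy, fxy, total_pages):
    # Normalized Google Distance from raw hit counts:
    # 0 means the terms always co-occur; larger values mean less related.
    lfx, lfy, lfxy = math.log(fx), math.log(fy), math.log(fxy)
    return (max(lfx, lfy) - lfxy) / (math.log(total_pages) - min(lfx, lfy))
```

Note the measure is symmetric in its first two arguments, since only max, min and the joint count appear.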
@mathigatti
mathigatti / foxdot.py
Last active November 21, 2019 15:04
FoxDot
# Access this file at: https://bit.ly/2OyyWr2
# (Mind upper and lower case)
# Tempo and scale
print(help(Scale))
Scale.default = Scale.minor
print(Clock.bpm)
# What instruments are there

An attempt to describe how some commands work in TidalCycles (and elsewhere too)

Index

  1. Euclidean Algorithm

  2. Mixer Ritchse / Lil' Data / Kindhom / Others?

  3. The Double Boolean ft. Martín Karadagian
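On the first item: TidalCycles' `euclid(k, n)` spreads k onsets as evenly as possible over n steps. A rough Python sketch of that spacing (a simple rounding approximation, not Bjorklund's original algorithm, and equivalent to TidalCycles' output only up to rotation):

```python
def euclid(k, n):
    # Mark k onsets spread as evenly as possible over n steps.
    pattern = [False] * n
    for i in range(k):
        pattern[(i * n) // k] = True
    return pattern
```

For example, `euclid(3, 8)` places onsets at steps 0, 2 and 5, a rotation of the familiar tresillo pattern.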

### Keybase proof
I hereby claim:
* I am mathigatti on github.
* I am mathigatti (https://keybase.io/mathigatti) on keybase.
* I have a public key ASC6UTYH0FvHsHHVGVrZYm9uCr8gWfWencCBnDnCZuok-Qo
To claim this, I am signing this object:
@mathigatti
mathigatti / palabras_comunes.py
Created June 2, 2020 20:56
Sort the words in a text file by relevance.
import math
from es_lemmatizer import lemmatize
import spacy
from nltk.tokenize import word_tokenize
from nltk import ngrams, FreqDist
from tqdm import tqdm
from collections import defaultdict
import unidecode
import sys
import os
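The imports suggest lemmatised frequency counts over the input text. One simple way to score "relevance" is to weight a word's count in the text against its frequency in a background corpus (a tf-idf-style sketch; `rank_words` and `background` are hypothetical illustrations, not the gist's actual scoring):

```python
import math
from collections import Counter

def rank_words(text, background):
    # Score each word as: count in text * log(background_total / background_count),
    # so words frequent here but rare in general rank highest.
    counts = Counter(text.lower().split())
    total_bg = sum(background.values()) or 1

    def score(word):
        bg = background.get(word, 0) + 1  # add-one smoothing for unseen words
        return counts[word] * math.log(total_bg / bg)

    return sorted(counts, key=score, reverse=True)
```

With a background where "the" is very common, a repeated rare word like "cat" outranks it even at the same in-text count.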
@mathigatti
mathigatti / scrape_pagina12.py
Last active June 19, 2020 12:42
Scraper to download news articles from pagina12
from bs4 import BeautifulSoup
import requests
import os
def descargar_noticia(url):
    page = requests.get(url)
    soup = BeautifulSoup(page.content, 'html.parser')
    noticia = ""
import os
import sys
def unir_txts(carpeta_con_los_txts):
    lista_de_txts = os.listdir(carpeta_con_los_txts)  # Build the list of every file in that directory/folder
    texto_final = ""  # Variable where the text of all the txts gets accumulated
    for nombre_del_txt in lista_de_txts:  # go through the files one by one
        # join the folder name with the file name to build the exact path where the txt lives
        ruta = os.path.join(carpeta_con_los_txts, nombre_del_txt)
        with open(ruta) as archivo:
            texto_final += archivo.read()
    return texto_final