
Andrés Aquino (andresaquino)

andresaquino /
Created May 4, 2020 — forked from leesmith/
Simple Git Workflow For Continuous Delivery


Workflow guidelines:

  • The master branch is always production-ready: deployable, with a 100% green test suite
  • New development happens on feature branches, with frequent rebasing onto master
  • Keep a clean commit history by preferring rebase over merge (git pull is configured to rebase automatically)

rebase workflow
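The guidelines above can be exercised end to end in a throwaway repository. Branch names, file names, and the committer identity below are illustrative, not from the original gist:

```shell
# Rebase workflow demo in a throwaway repo (needs git >= 2.28 for `init -b`).
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q -b master repo && cd repo
git config user.email dev@example.com && git config user.name Dev

echo base > app.txt && git add app.txt && git commit -qm "initial"

# new development happens on a feature branch
git checkout -q -b feature
echo feature >> app.txt && git commit -qam "feature work"

# meanwhile, master moves on
git checkout -q master
echo hotfix > fix.txt && git add fix.txt && git commit -qm "hotfix"

# rebase the feature branch onto master instead of merging
git checkout -q feature
git rebase -q master

# fast-forward master: history stays linear, no merge commit
git checkout -q master
git merge -q --ff-only feature
```

After the fast-forward, `git log --oneline` shows three linear commits and `git log --merges` is empty, which is the clean history the guidelines aim for.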


andresaquino /
Created Jan 8, 2020
Rename multiple CFDI XML files, using each invoice's UUID as the new file name.
for oldname in Comprobantes*.xml; do newname=$(xmllint --format "$oldname" | grep -o 'UUID="[^"]*' | sed -e 's/UUID="//'); mv "$oldname" "${newname^^}.xml"; done
andresaquino /
Created Nov 20, 2019 — forked from mattratleph/
vimdiff cheat sheet


## git mergetool

In the middle file (future merged file), you can navigate between conflicts with ]c and [c.

Choose which version you want to keep with :diffget //2 or :diffget //3 (the //2 and //3 are unique identifiers for the target/master copy and the merge/branch copy file names).

:diffupdate (to remove leftover spacing issues)
:only (once you’re done reviewing all conflicts, this shows only the middle/merged file)
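The notes above assume vimdiff is wired up as git's merge tool. A minimal configuration sketch, shown in a throwaway repo so the values can be inspected (`--global` would apply it to your real setup):

```shell
# Configure vimdiff as the git merge tool.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q demo && cd demo

git config merge.tool vimdiff
git config merge.conflictstyle diff3   # also show the common ancestor in conflict hunks
git config mergetool.prompt false      # skip the "hit return" prompt

git config merge.tool                  # prints: vimdiff
```

With this in place, running `git mergetool` after a conflicted merge opens the multi-pane vimdiff view that the `:diffget //2` / `:diffget //3` commands above operate on.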
gist:b16b2cc1f2ac23f42997f14366c6386c
# basic pfctl control
# ==
# Related:
# Last update: Tue Dec 28, 2004
# ==
# Note:
# this document is only provided as a basic overview
# for some common pfctl commands and is by no means
# a replacement for the pfctl and pf manual pages.
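The gist's body is cut off after this header. For orientation, these are some of the common pfctl invocations such an overview typically covers (drawn from the pfctl manual page, not recovered from the original gist):

```
pfctl -e                  # enable pf
pfctl -d                  # disable pf
pfctl -nf /etc/pf.conf    # parse the ruleset without loading it
pfctl -f /etc/pf.conf     # load the ruleset
pfctl -s rules            # show the loaded filter rules
pfctl -s nat              # show the loaded NAT rules
pfctl -s state            # show the state table
pfctl -F all              # flush rules, states, and tables
```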
andresaquino /
Created Oct 16, 2019 — forked from Chaser324/
GitHub Standard Fork & Pull Request Workflow

Whether you're trying to give back to the open source community or collaborating on your own projects, knowing how to properly fork and generate pull requests is essential. Unfortunately, it's quite easy to make mistakes or not know what you should do when you're initially learning the process. I know that I certainly had considerable initial trouble with it, and I found a lot of the information on GitHub and around the internet to be rather piecemeal and incomplete - part of the process described here, another there, common hangups in a different place, and so on.

In an attempt to collate this information for myself and others, this short tutorial is what I've found to be fairly standard procedure for creating a fork, doing your work, issuing a pull request, and merging that pull request back into the original project.

Creating a Fork

Just head over to the GitHub page and click the "Fork" button. It's just that simple. Once you've done that, you can use your favorite git client to clone your repo.
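The remote layout that results from a fork can be sketched locally. Here a bare repository stands in for the upstream GitHub project, and all paths are illustrative:

```shell
# Post-fork remote setup, with a local bare repo standing in for upstream.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q --bare upstream.git

# 1. clone your fork (in real life this is your GitHub fork's URL)
git clone -q "$tmp/upstream.git" work
cd work

# 2. add the original project as the 'upstream' remote
git remote add upstream "$tmp/upstream.git"

# 3. later, keep the fork current with: git fetch upstream && git rebase upstream/master
git fetch -q upstream
git remote    # lists: origin, upstream
```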

andresaquino /
Last active Jan 27, 2019
Resize icon using imagemagick or ffmpeg
#!/usr/bin/env bash
# Install:
#   cp iconresize ~/bin/iconresize
#   chmod 0700 ~/bin/iconresize
# How to use:
#   iconresize <filename:icon.png> <directory:android> <tool:im|ff>
Image editing
# convert white color to transparency
convert <image.png> -fuzz 5% -transparent white <new-image.png>
# resize image to 24, 48, 64, 96 & 128 px wide, preserving aspect ratio
for isize in 24 48 64 96 128; do
  ffmpeg -i <original-img.png> -vf scale=${isize}:-1 <image_${isize}.png>
done

Database size


-- total size of all databases, in MB
SELECT SUM(data_length + index_length) / (1024 * 1024) AS "Size in MB"
FROM information_schema.TABLES;

-- size per database, in MB
SELECT table_schema AS "Database Name",
       SUM(data_length + index_length) / (1024 * 1024) AS "Size in MB"
FROM information_schema.TABLES
GROUP BY table_schema;
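A variant of the same information_schema query (not in the original gist; 'mydb' is a placeholder) lists the largest tables in one schema:

```
-- ten largest tables in the 'mydb' schema, in MB
SELECT table_name,
       (data_length + index_length) / (1024 * 1024) AS "Size in MB"
FROM information_schema.TABLES
WHERE table_schema = 'mydb'
ORDER BY (data_length + index_length) DESC
LIMIT 10;
```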
  • In the end, I realize there isn't much to tune on a dev workstation, as long as I don't leave the machine running all the time.
  • Note that most variables must go under [mysqld] to be taken into account.
  • MySQL needs to have been running and in use for a long time before the recommendations become relevant.
  • That said, a few improvements suggested by mysqltuner:
# sudo vim /etc/mysql/conf.d/perso.cnf
key_buffer_size = 128M
query_cache_limit = 2048M
andresaquino /
Created Dec 20, 2018 — forked from weblogix/
[MySQL Cheatsheet] #mysql #mariadb

Exporting a Compressed MySQL Dump

mysqldump -u {user} -p {database} | gzip > {database}.sql.gz

Importing a Compressed MySQL Dump

gzip -dc < {database}.sql.gz | mysql -u {user} -p {database}
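The gzip plumbing in both commands can be sanity-checked without a MySQL server by swapping `cat` in for mysqldump and mysql (a toy round-trip; the file contents are made up):

```shell
# Round-trip the compress/decompress pipe pattern with cat as a stand-in.
set -e
tmp=$(mktemp -d) && cd "$tmp"
printf 'CREATE TABLE t (id INT);\n' > dump.sql

# export side: producer | gzip > file.gz
cat dump.sql | gzip > dump.sql.gz

# import side: gzip -dc < file.gz | consumer
gzip -dc < dump.sql.gz > restored.sql

cmp dump.sql restored.sql && echo "round-trip OK"
```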