Renan renanpalmeira

@renanpalmeira
renanpalmeira / settings.json
Created May 23, 2024 03:14 — forked from diego3g/settings.json
VSCode Settings (Updated)
{
  "workbench.startupEditor": "newUntitledFile",
  "editor.fontSize": 14,
  "editor.lineHeight": 1.8,
  "javascript.suggest.autoImports": true,
  "javascript.updateImportsOnFileMove.enabled": "always",
  "editor.rulers": [80, 120],
  "extensions.ignoreRecommendations": true,
  "typescript.tsserver.log": "off",
  "files.associations": {
@renanpalmeira
renanpalmeira / README.md
Created November 28, 2017 14:37 — forked from andineck/README.md
Authentication and Authorization Concepts for MicroServices

auth with microservices

Authorization and authentication are hard. When you only have to implement them once (as you do within a monolith) instead of over and over again, it makes developers happy :-), and probably leads to fewer implementation failures.

When you have a bunch of microservices, this is something that has to be considered.

Do you implement it once, in every microservice, or something in between? One middle ground is sketched below.
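A common middle-ground pattern is to implement token validation once in a small shared library that every service imports, so the logic is written in one place but enforced everywhere. The following is a minimal Python sketch of that idea (Flask + PyJWT), not part of the original gist; the shared secret, the scope names, and the orders service are hypothetical.

# shared_auth.py — hypothetical helper each microservice depends on,
# so token validation is implemented once rather than per service.
import functools
import jwt                      # PyJWT
from flask import request, jsonify

SECRET = "change-me"            # assumption: one HS256 secret shared via env/vault

def require_auth(*scopes):
    """Verify the bearer token, then check that it grants the given scopes."""
    def decorator(view):
        @functools.wraps(view)
        def wrapper(*args, **kwargs):
            header = request.headers.get("Authorization", "")
            if not header.startswith("Bearer "):
                return jsonify(error="missing bearer token"), 401
            try:
                claims = jwt.decode(header[7:], SECRET, algorithms=["HS256"])
            except jwt.InvalidTokenError:
                return jsonify(error="invalid token"), 401
            if not set(scopes) <= set(claims.get("scopes", [])):
                return jsonify(error="insufficient scope"), 403
            return view(*args, claims=claims, **kwargs)  # hand claims to the view
        return wrapper
    return decorator

# orders_service.py — one of many services reusing the same helper
from flask import Flask
app = Flask(__name__)

@app.route("/orders")
@require_auth("orders:read")
def list_orders(claims):
    return {"user": claims["sub"], "orders": []}

With asymmetric signing (e.g. RS256) only the auth service would hold the private key and the shared helper would verify with the public key, which removes the shared-secret assumption above.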

@renanpalmeira
renanpalmeira / poedit_install.sh
Created December 9, 2016 12:00 — forked from bancek/poedit_install.sh
Install Poedit 1.7.5 from source on Ubuntu 14.10
# Download and unpack the Poedit 1.7.5 source tarball
wget https://github.com/vslavik/poedit/releases/download/v1.7.5-oss/poedit-1.7.5.tar.gz
tar xf poedit-1.7.5.tar.gz
cd poedit-1.7.5
# Install build dependencies (run as root, or prefix with sudo)
apt-get install -y build-essential libwxgtk3.0-dev libicu-dev libgtkspell-dev libdb5.3++-dev liblucene++-dev libboost1.54-dev libboost-regex1.54-dev libboost-system1.54-dev
# Work around a wrong version number in the liblucene++ pkg-config file
sed -i 's/Version: 3.0.3.4/Version: 3.0.5/' /usr/lib/x86_64-linux-gnu/pkgconfig/liblucene++.pc
# Build and install
./configure
make
make install
@renanpalmeira
renanpalmeira / httpd.conf_spiders
Created September 25, 2016 04:07 — forked from gplv2/httpd.conf_spiders
Apache bot control system: throttle good and bad crawlers/web spiders (e.g. Googlebot, Bingbot) when they hit your server hard, and block all of them from the places your robots.txt marks as off-limits (which they sometimes visit anyway).
# To relieve servers
## Imagine a robots.txt file like this (Google understands this format):
#User-agent: *
#Disallow: /detailed
#Disallow: /?action=detailed
#Disallow: /*/detailed
#Crawl-delay: 20
##