Arthur Alvim (arthuralvim)
# List unique values in a DataFrame column
# h/t @makmanalp for the updated syntax!
df['Column Name'].unique()
# Convert Series datatype to numeric (will error if column has non-numeric values)
# h/t @makmanalp
pd.to_numeric(df['Column Name'])
# Convert Series datatype to numeric, changing non-numeric values to NaN
# h/t @makmanalp for the updated syntax!
pd.to_numeric(df['Column Name'], errors='coerce')
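
A quick, self-contained sketch of the snippets above (the DataFrame and column name are illustrative only); errors='coerce' is what turns unparseable values into NaN:

import pandas as pd

# Illustrative DataFrame with a mixed-type column.
df = pd.DataFrame({'Column Name': ['1', '2', 'oops', '4.5']})

print(df['Column Name'].unique())  # ['1' '2' 'oops' '4.5']

# Non-numeric values become NaN instead of raising an error.
numeric = pd.to_numeric(df['Column Name'], errors='coerce')
print(numeric.tolist())  # [1.0, 2.0, nan, 4.5]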
@arthuralvim
arthuralvim / bootstrap_homeshick.sh
Created June 29, 2018 17:43 — forked from andsens/bootstrap_homeshick.sh
Script that can set up an entire user account with homeshick automatically
#!/bin/bash -ex
# Paste this into ssh
# curl -sL https://gist.github.com/andsens/2913223/raw/bootstrap_homeshick.sh | tar -xzO | /bin/bash -ex
# When forking, you can get the URL from the raw (<>) button.
### Set some command variables depending on whether we are root or not ###
# This assumes you use a Debian derivative; replace apt-get with yum, pacman, etc.
aptget='sudo apt-get'
chsh='sudo chsh'
@arthuralvim
arthuralvim / boto3_iam_access_key_rotation.py
Created June 4, 2018 20:37 — forked from andymotta/boto3_iam_access_key_rotation.py
Rotate AWS IAM access keys for every Boto profile on host (Compliance)
## Meant to be scheduled on a cron/timer every 90 days (CIS Benchmark)
## The target keys need permissions to rotate themselves
import boto3
from botocore.exceptions import ClientError
import os
from datetime import datetime
import shutil
from ConfigParser import SafeConfigParser  # Python 2; on Python 3 use configparser.ConfigParser
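
The preview above stops at the imports. A hedged sketch of the core rotation step with boto3 (not the gist's own code; the user name is an assumption, and AWS caps each user at two access keys):

import boto3

iam = boto3.client('iam')
user = 'example-user'  # assumption: the IAM user whose keys are rotated

# Create a fresh key, then deactivate and delete every other key for the user.
# Note: create_access_key fails if the user already has two keys (the AWS per-user limit).
new_key = iam.create_access_key(UserName=user)['AccessKey']
for key in iam.list_access_keys(UserName=user)['AccessKeyMetadata']:
    if key['AccessKeyId'] != new_key['AccessKeyId']:
        iam.update_access_key(UserName=user, AccessKeyId=key['AccessKeyId'], Status='Inactive')
        iam.delete_access_key(UserName=user, AccessKeyId=key['AccessKeyId'])
print('New access key id:', new_key['AccessKeyId'])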
@arthuralvim
arthuralvim / NLTK and Named Entity Recognition (Person).ipynb
Created October 19, 2017 19:11
NLTK and Named Entity Recognition (Person) in Brazilian Portuguese
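
The notebook preview is not available in this listing. A minimal sketch of person-name extraction with NLTK's stock chunker (trained on English; the notebook presumably adapts the idea to Brazilian Portuguese, and the sentence below is illustrative only):

import nltk

# One-time model downloads for the tokenizer, POS tagger, and NE chunker.
for pkg in ('punkt', 'averaged_perceptron_tagger', 'maxent_ne_chunker', 'words'):
    nltk.download(pkg, quiet=True)

text = "Maria Silva met John Smith in Recife."
tree = nltk.ne_chunk(nltk.pos_tag(nltk.word_tokenize(text)))

# Collect subtrees labelled PERSON and join their leaves into full names.
people = [' '.join(token for token, tag in subtree.leaves())
          for subtree in tree.subtrees()
          if subtree.label() == 'PERSON']
print(people)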
@arthuralvim
arthuralvim / pep20_by_example.py
Created October 19, 2017 19:11 — forked from evandrix/pep20_by_example.py
PEP 20 (The Zen of Python) by example
#!/usr/bin/env python
"""
=====================================
PEP 20 (The Zen of Python) by example
=====================================
Usage: %prog
:Author: Hunter Blanks, hblanks@artifex.org / hblanks@monetate.com
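
The gist works through the aphorisms one by one. A tiny illustrative sketch (not taken from the gist) of "Explicit is better than implicit":

import this  # printing the Zen of Python itself (PEP 20)

# Implicit: relies on the truthiness of whatever gets passed in.
def load(path=None):
    return path or 'default.cfg'  # surprises when path is '' or 0

# Explicit: states exactly which case falls back to the default.
def load_explicit(path=None):
    return 'default.cfg' if path is None else path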
@arthuralvim
arthuralvim / collection_size.py
Created October 4, 2017 14:44 — forked from luizbraga/collection_size.py
List collections with their sizes and document counts in MongoDB
from pymongo import MongoClient
MONGO_URI = ''
DATABASE_NAME = ''
client = MongoClient(MONGO_URI)
db = client[DATABASE_NAME]
collections = db.collection_names()
def readable_size(file_size):
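
The preview cuts off at the helper definition. A hedged sketch of the rest of the idea, reporting each collection's size and document count via the collStats command (the URI and database name are placeholders):

from pymongo import MongoClient

client = MongoClient('mongodb://localhost:27017')  # placeholder URI
db = client['example_db']                          # placeholder database name

def readable_size(num_bytes):
    # Convert a byte count into a human-readable string.
    for unit in ('B', 'KB', 'MB', 'GB', 'TB'):
        if num_bytes < 1024.0:
            return '%.1f %s' % (num_bytes, unit)
        num_bytes /= 1024.0
    return '%.1f PB' % num_bytes

for name in db.list_collection_names():
    stats = db.command('collstats', name)
    print(name, readable_size(stats['size']), stats['count'])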
@arthuralvim
arthuralvim / jekyll-and-liquid.md
Created August 29, 2017 20:53 — forked from magicznyleszek/jekyll-and-liquid.md
Jekyll & Liquid Cheatsheet


A list of the most common features in Jekyll (Liquid). You can use Jekyll with GitHub Pages; just make sure you are using the proper version.

Running

Running a local server for testing purposes:

@arthuralvim
arthuralvim / awscli.md
Last active October 30, 2018 19:03 — forked from victorfsf/awscli.md
Downloading from and uploading to an S3 bucket using the AWS CLI

Installing & Configuring

$ sudo pip install awscli (or: sudo apt-get install awscli)
$ aws configure

You'll need to fill in the following settings:
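
The rest of the settings list is cut off in this preview. The same upload/download round trip from Python with boto3 (bucket, key, and file names below are placeholders) is roughly:

import boto3

s3 = boto3.client('s3')    # credentials come from `aws configure`, as above
bucket = 'example-bucket'  # placeholder bucket name

# Upload a local file to the bucket, then download it back.
s3.upload_file('report.csv', bucket, 'backups/report.csv')
s3.download_file(bucket, 'backups/report.csv', 'report_copy.csv')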

@arthuralvim
arthuralvim / mongodb-s3-backup.sh
Created June 9, 2017 11:40 — forked from eladnava/mongodb-s3-backup.sh
Automatically back up a MongoDB database to S3 using mongodump, tar, and awscli (Ubuntu 14.04 LTS)
#!/bin/sh
# Make sure to:
# 1) Name this file `backup.sh` and place it in /home/ubuntu
# 2) Run sudo apt-get install awscli to install the AWS CLI
# 3) Run aws configure (enter s3-authorized IAM user and specify region)
# 4) Fill in DB host + name
# 5) Create S3 bucket for the backups and fill it in below (set a lifecycle rule to expire files older than X days in the bucket)
# 6) Run chmod +x backup.sh
# 7) Test it out via ./backup.sh
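
The shell script drives mongodump, tar, and the AWS CLI; an equivalent hedged sketch in Python (database name, bucket, and paths are placeholders) could look like:

import subprocess
import tarfile
from datetime import datetime

import boto3

db_name = 'example_db'        # placeholder database name
bucket = 'example-backups'    # placeholder S3 bucket
stamp = datetime.utcnow().strftime('%Y%m%d-%H%M%S')
dump_dir = '/tmp/mongodump-' + stamp
archive = dump_dir + '.tar.gz'

# 1) Dump the database, 2) compress the dump, 3) upload the archive to S3.
subprocess.run(['mongodump', '--db', db_name, '--out', dump_dir], check=True)
with tarfile.open(archive, 'w:gz') as tar:
    tar.add(dump_dir, arcname='mongodump-' + stamp)
boto3.client('s3').upload_file(archive, bucket, 'mongodb/' + archive.split('/')[-1])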