Steve Seidenthal (sseidenthal) · Luxembourg, Germany
alexellis / k8s-pi.md
Last active April 11, 2024 14:17
K8s on Raspbian
sniper7kills / ELK-install.sh
Last active February 7, 2020 02:13
ELK-Install-Ubuntu-16.04
#!/bin/bash
# Ask for basic configuration
echo -n "Enter ELK Server IP or FQDN: "
read -r eip
echo -n "Enter Admin Web Password: "
read -r adpwd
# Update the system
sudo apt-get update
sudo apt-get upgrade -y
sseidenthal / vhost.conf
Created September 7, 2016 15:02
NGINX, Symfony 2 with SimpleSamlPhp as Alias
server {
    server_name default;
    root /var/www/symfony/web;

    client_max_body_size 100M;
    fastcgi_read_timeout 1800;

    location / {
        # Standard Symfony 2 front controller: route everything through app.php
        try_files $uri /app.php$is_args$args;
    }
}
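The gist title mentions serving SimpleSamlPhp under an alias, but the snippet is cut off before that part. A minimal sketch of how such a location block typically looks, assuming SimpleSAMLphp is installed at /var/www/simplesamlphp and PHP-FPM listens on a local socket (both paths are assumptions, not from the original gist):

# Sketch only: install path and FPM socket are assumptions.
location ^~ /simplesaml {
    alias /var/www/simplesamlphp/www;

    # With alias, $document_root resolves to the alias path, so
    # $document_root$phpfile points at the aliased .php file.
    location ~ ^(?<prefix>/simplesaml)(?<phpfile>.+?\.php)(?<pathinfo>/.*)?$ {
        include fastcgi_params;
        fastcgi_pass unix:/var/run/php5-fpm.sock;
        fastcgi_param SCRIPT_FILENAME $document_root$phpfile;
        fastcgi_param SCRIPT_NAME /simplesaml$phpfile;
        fastcgi_param PATH_INFO $pathinfo if_not_empty;
    }
}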
Graceas / FormErrorsSerializer.php
Created September 10, 2013 06:26
Symfony 2 Form Error Serializer. May be used for AJAX form validation. Supports both tree and flat array styles for errors.
class FormErrorsSerializer
{
    public function serializeFormErrors(\Symfony\Component\Form\Form $form, $flat_array = false, $add_form_name = false, $glue_keys = '_')
    {
        $errors = array();
        $errors['global'] = array();
        $errors['fields'] = array();

        // Collect form-level (global) errors
        foreach ($form->getErrors() as $error) {
            $errors['global'][] = $error->getMessage();
        }
        // The gist continues with field-level error collection (truncated here)
    }
}
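A hedged usage sketch for the AJAX validation case the description mentions, assuming a Symfony 2 controller; MyFormType and the route are hypothetical, and the Request/JsonResponse imports noted in the comments are assumed:

// Hypothetical controller action; MyFormType is an assumption for illustration.
// Assumes: use Symfony\Component\HttpFoundation\Request;
//          use Symfony\Component\HttpFoundation\JsonResponse;
public function submitAction(Request $request)
{
    $form = $this->createForm(new MyFormType());
    $form->handleRequest($request);

    if (!$form->isValid()) {
        $serializer = new FormErrorsSerializer();
        // Flat array with form name prefixed and keys glued by '_',
        // which is convenient for JavaScript consumption
        return new JsonResponse(array(
            'errors' => $serializer->serializeFormErrors($form, true, true),
        ), 400);
    }

    return new JsonResponse(array('status' => 'ok'));
}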
cdown / gist:1163649
Last active April 9, 2024 01:10
Bash urlencode and urldecode
urlencode() {
    # urlencode <string>
    local old_lc_collate=$LC_COLLATE
    LC_COLLATE=C
    local length="${#1}"
    for (( i = 0; i < length; i++ )); do
        local c="${1:$i:1}"
        case $c in
            # Unreserved characters pass through unchanged
            [a-zA-Z0-9.~_-]) printf '%s' "$c" ;;
            # Everything else becomes %XX (byte value in hex)
            *) printf '%%%02X' "'$c" ;;
        esac
    done
    LC_COLLATE=$old_lc_collate
}
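The gist title also promises urldecode, but the snippet is cut off before it. A minimal companion sketch using the common trick of translating %XX escapes into \xXX and letting printf's %b expand them:

urldecode() {
    # urldecode <string>
    local url_encoded="${1//+/ }"       # '+' decodes to a space
    printf '%b' "${url_encoded//%/\\x}" # %XX becomes \xXX, expanded by %b
}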
jesstess / wget_spider_https
Created November 27, 2010 20:02
Use wget to spider a site as a logged-in user.
http://addictivecode.org/FrequentlyAskedQuestions
To spider a site as a logged-in user:
1. Post the form data (_every_ input with a name in the form, even if it doesn't have a value) required to log in (--post-data).
2. Save the cookies that get generated (--save-cookies), including session cookies (--keep-session-cookies), which are not saved when --save-cookies alone is specified.
3. Load the cookies, continue saving the session cookies, and recursively (-r) spider (--spider) the site, ignoring (-R) /logout.
# log in and save the cookies
wget --post-data='username=my_username&password=my_password&next=' --save-cookies=cookies.txt --keep-session-cookies https://foobar.com/login
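The gist shows the command for steps 1 and 2; a sketch of the spidering command step 3 describes, reusing the placeholder host foobar.com from the login command (the exact reject pattern is an assumption):

# load the saved cookies, keep updating session cookies, and spider the site,
# rejecting anything matching "logout" so the crawl doesn't end the session
wget --load-cookies=cookies.txt --save-cookies=cookies.txt --keep-session-cookies \
     --spider -r -R 'logout*' https://foobar.com/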