hortonew / update-home-assistant-docker.sh
Last active Feb 21, 2021
Home Assistant: Docker container update script
#!/bin/bash
# Note: This config renews letsencrypt certs for "mydomain.com". Modify this for your own domain or remove those steps.
set -e
# Update image
docker pull homeassistant/home-assistant:latest
d=$(date +%Y-%m-%d-%H)
# Identify and stop old instance of home assistant
olddockerid=$(docker ps -a | grep home-assistant | awk '{ print $1 }')
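The gist preview is truncated here, but the ID-extraction pipeline can be illustrated against canned `docker ps -a` output (the container ID and row below are made up for the example):

```shell
#!/bin/sh
# Simulated `docker ps -a` row (hypothetical ID, image, and names)
sample='ab12cd34ef56   homeassistant/home-assistant:latest   "python -m homeassistant"   2 weeks ago   Up 2 weeks   home-assistant'

# Same pipeline as the script: match the row, print the first column (the container ID)
olddockerid=$(printf '%s\n' "$sample" | grep home-assistant | awk '{ print $1 }')
echo "$olddockerid"
```

In the full script this ID would presumably feed `docker stop` / `docker rm` before the freshly pulled image is started.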
hortonew / gist:ff778ff0dd10ef4df808373c3c3c41a0
Created Sep 26, 2018
Example krb5.conf to authenticate via Kerberos from Unix to Windows
[libdefaults]
    default_realm = CORPORATE.COMPANY.COM
    krb4_config = /etc/krb.conf
    krb4_realms = /etc/krb.realms
    kdc_timesync = 1
    ccache_type = 4
    forwardable = true
    proxiable = true
    v4_instance_resolve = false
    v4_name_convert = {
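The preview cuts off inside `v4_name_convert`; a working file also typically needs `[realms]` and `[domain_realm]` stanzas pointing at the Windows domain controllers. A minimal sketch, assuming the realm from above (the KDC hostnames here are hypothetical):

```ini
[realms]
    CORPORATE.COMPANY.COM = {
        kdc = dc01.corporate.company.com
        admin_server = dc01.corporate.company.com
    }

[domain_realm]
    .corporate.company.com = CORPORATE.COMPANY.COM
    corporate.company.com = CORPORATE.COMPANY.COM
```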
hortonew / Ansible vault with AWS Dynamic Inventory
Created Sep 17, 2018
Allows you to set the environment variables needed before the dynamic inventory script runs, so the AWS credentials can come from Ansible Vault
---
- name: Set env variables
  hosts: localhost
  connection: local
  environment:
    AWS_ACCESS_KEY_ID: "{{ AWS_ACCESS_KEY }}"
    AWS_SECRET_ACCESS_KEY: "{{ AWS_SECRET_KEY }}"
  tasks:
    - name: Build AWS Inventory
      command: "{{ ec2_py_location }} --list --refresh-cache"
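The play expects `AWS_ACCESS_KEY`, `AWS_SECRET_KEY`, and `ec2_py_location` to be defined somewhere; a sketch of a vault-encrypted vars file that could supply them (the file path and placeholder values are assumptions, not from the original gist):

```yaml
# group_vars/all/vault.yml -- encrypt with: ansible-vault encrypt group_vars/all/vault.yml
AWS_ACCESS_KEY: "AKIA-EXAMPLE-KEY-ID"
AWS_SECRET_KEY: "example-secret-key"
ec2_py_location: /etc/ansible/ec2.py
```

Run the playbook with `--ask-vault-pass` (or `--vault-password-file`) so these decrypt before the inventory script is invoked.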
Example SHCluster Config
See the docs for up-to-date configuration details: http://docs.splunk.com/Documentation/Splunk/latest/DistSearch/SHCdeploymentoverview
Recommended: share splunk.secret between the deployer and every search head in the cluster, so you can maintain configs from the deployer.
Splunk admin user: admin
Splunk admin password: my-splunk-admin-password
3 Search heads in cluster: searchhead-01, searchhead-02, searchhead-03
Search cluster secret: my-shc-secret
Search cluster label: my-shc-cluster-label
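With the values above, member initialization might look like the following (the replication port and deployer hostname are assumptions; verify the flags against the linked doc before use):

```
# On each of searchhead-01..03 (replication port and deployer host assumed)
splunk init shcluster-config -auth admin:my-splunk-admin-password \
    -mgmt_uri https://searchhead-01:8089 -replication_port 9887 \
    -conf_deploy_fetch_url https://deployer:8089 \
    -secret my-shc-secret -shcluster_label my-shc-cluster-label
splunk restart

# On one member only, bootstrap the captain
splunk bootstrap shcluster-captain \
    -servers_list "https://searchhead-01:8089,https://searchhead-02:8089,https://searchhead-03:8089" \
    -auth admin:my-splunk-admin-password
```

Each member's `-mgmt_uri` should name that member's own host.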
Ayehu Log Parsing - Splunk
TODO
1. Pull date from source file name (fileName_23052018.log)
2. (DONE) Pull time from line entry
3. (DONE) Send to null queue -> "First line of every file"
4. (DONE) Send to null queue -> "Lines with -------------------------------"
props.conf
[Ayehu:EyeShare:Executor]
TRUNCATE=0
TIME_FORMAT=%X
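TODO item 4 (null-queuing the dashed separator lines) is normally done with a `TRANSFORMS` entry in props.conf plus a transforms.conf stanza. A sketch, assuming stanza names of my own invention:

```ini
# props.conf
[Ayehu:EyeShare:Executor]
TRANSFORMS-null = ayehu_drop_dashes

# transforms.conf
[ayehu_drop_dashes]
REGEX = ^-{10,}
DEST_KEY = queue
FORMAT = nullQueue
```

The "first line of every file" rule would need its own regex matching whatever that header line looks like.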
Example indexes.conf
[default]
homePath.maxDataSizeMB = 300000
coldPath.maxDataSizeMB = 200000
repFactor = auto
[volume:primary]
path = /opt/splunk/indexes/hot
maxVolumeDataSizeMB = 1500000
[volume:secondary]
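The preview cuts off at `[volume:secondary]`; a sketch of how it might continue, with an index stanza referencing both volumes (the cold path, sizes, and index name are assumptions):

```ini
[volume:secondary]
path = /opt/splunk/indexes/cold
maxVolumeDataSizeMB = 3000000

[main]
homePath = volume:primary/main/db
coldPath = volume:secondary/main/colddb
thawedPath = $SPLUNK_DB/main/thaweddb
```

Note that `thawedPath` cannot use volume references, so it stays a literal path.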
gist:cd54a30d597d0297a4130f5e9a798267
Variables
AnsibleServer = current servername
RemoteServerIP = server I'm trying to ping
MyUsername = username with same password across local, proxy, and remote server
MyProxyHostIP = proxy host IP
[ansible@AnsibleServer ~]$ ansible -m ping -i teams/monitoring/ RemoteServerIP -u MyUsername -k -vvvv
Using /home/ansible/ansible.cfg as config file
SSH password:
Loading callback plugin minimal of type stdout, v2.0 from /usr/lib/python2.7/site-packages/ansible/plugins/callback/__init__.pyc
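One common way to route that ping through MyProxyHostIP is an SSH ProxyCommand entry, which Ansible picks up automatically. A sketch for ~/.ssh/config (this setup is an assumption; it is not shown in the original output):

```
Host RemoteServerIP
    ProxyCommand ssh -W %h:%p MyUsername@MyProxyHostIP
```

The `-k` flag above prompts for a password; since the same password is used across the proxy and remote server, a single prompt covers both hops only if key auth handles the proxy leg.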
Example-Splunk-Outputs.conf
[tcpout]
defaultGroup = mySplunkIndexers
maxQueueSize = 7MB
[tcpout:mySplunkIndexers]
server = 10.10.10.10:9997, 10.10.10.20:9997
autoLB = true
useACK = true
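For `useACK` to work, each indexer in `mySplunkIndexers` must be listening on the matching port; the receiving side is configured in inputs.conf on each indexer. A minimal sketch:

```ini
# inputs.conf on 10.10.10.10 and 10.10.10.20
[splunktcp://9997]
disabled = 0
```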
hortonew / Splunk-one-way-diff.txt
Created Mar 21, 2017
Splunk one-way diff search
| set diff
[| makeresults count=1 | eval users="user1 user2 user6" | makemv users | mvexpand users | table users | append [search sourcetype=linux_secure logged_in_user!="" | table users] | stats values(users) as users | mvexpand users | table users]
[search sourcetype=linux_secure logged_in_user!="" | stats values(logged_in_user) as users | mvexpand users | table users]
GetSplunkFrozenDates.sh
#!/bin/bash
# Author: Erik Horton
# usage1: ./GetSplunkFrozenDates.sh /path/to/frozen/directory/with/buckets/
# usage2: ./GetSplunkFrozenDates.sh /path/to/frozen/directory/with/buckets/ 2016-08-04
# output: db_bucket_name <start date> <end date>
#
# Use Case: Output date range for all Splunk buckets in a directory. Can specify a date, and it'll only output buckets that contain that date.
FILES="$1"*
DATE="$2"
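The preview stops after the argument parsing, but the per-bucket date extraction it describes can be sketched as follows. Splunk frozen bucket directories are named `db_<newestTimeEpoch>_<oldestTimeEpoch>_<localId>`; the bucket name below is made up, and `date -d @epoch` is GNU-specific:

```shell
#!/bin/sh
# Hypothetical frozen bucket directory name
bucket="db_1470268800_1470182400_7"

# Fields 2 and 3 are the newest and oldest event times (epoch seconds)
newest=$(printf '%s' "$bucket" | awk -F_ '{ print $2 }')
oldest=$(printf '%s' "$bucket" | awk -F_ '{ print $3 }')

# Convert epochs to calendar dates (GNU date)
start=$(date -u -d "@$oldest" +%Y-%m-%d)
end=$(date -u -d "@$newest" +%Y-%m-%d)
echo "$bucket $start $end"
```

With the optional second argument, the full script would presumably print a bucket only when the given date falls between `$start` and `$end`.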