Harisfazillah Jamel linuxmalaysia

linuxmalaysia / logstash-bind9.txt
Created March 25, 2019 23:31
Logstash Grok Pattern for Bind9 input using Filebeat
### Logstash Grok Pattern for Bind9 input using Filebeat
%{BIND9_TIMESTAMP:timestamp}%{SPACE}%{LOGLEVEL:loglevel}:%{SPACE}client%{SPACE}%{IP:clientip}#%{POSINT:clientport}%{SPACE}\(%{DATA:query}\):%{SPACE}view%{SPACE}internal:%{SPACE}query:%{SPACE}%{DATA:query2} %{DATA:queryclass} %{DATA:querytype} %{DATA:queryflag} \(%{IP:dnsip}\)
####
grok {
  match => {
    "message" => [ "%{BIND9_TIMESTAMP:timestamp}%{SPACE}%{LOGLEVEL:loglevel}:%{SPACE}client%{SPACE}%{IP:clientip}#%{POSINT:clientport}%{SPACE}\(%{DATA:query}\):%{SPACE}view%{SPACE}internal:%{SPACE}query:%{SPACE}%{DATA:query2} %{DATA:queryclass} %{DATA:querytype} %{DATA:queryflag} \(%{IP:dnsip}\)" ]
  }
}
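As a quick sanity check outside Logstash, the `%{IP:clientip}` capture can be approximated with grep on a sample line (the log line and all values below are hypothetical, assuming the default Bind9 query-log format with a `view internal` clause):

```shell
# Hypothetical Bind9 query-log line of the shape the grok pattern expects:
line='25-Mar-2019 23:31:00.123 info: client 192.0.2.10#53123 (example.com): view internal: query: example.com IN A +E (192.0.2.53)'

# Approximate the %{IP:clientip} capture with grep -oE:
clientip=$(printf '%s\n' "$line" | grep -oE 'client [0-9.]+' | cut -d' ' -f2)
echo "$clientip"   # 192.0.2.10
```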
linuxmalaysia / convert-to-yaml-logstash.sh
Last active March 29, 2019 12:12
Blacklists from Steven Black and others, converted to YAML for use with the Logstash translate filter; the source field must contain a URI / URL
#!/bin/sh
# origin https://gist.github.com/erlepereira/c11f4f7a3f60cd2071e79018e895fc8a
# logstash yaml "www.google.com": "known search engine"
# Choose from here https://github.com/StevenBlack/hosts
#HOSTS_RAW=https://raw.githubusercontent.com/StevenBlack/hosts/master/hosts
##### https://raw.githubusercontent.com/StevenBlack/hosts/master/data/malwaredomainlist.com/hosts
### the first list must be written with > and subsequent ones appended with >>
##### first file
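The conversion the script performs can be sketched like this (filenames and the "malware" label are assumptions; the StevenBlack/hosts lists use `0.0.0.0 domain` lines):

```shell
# Sample hosts-format input, as in the StevenBlack/hosts lists:
printf '0.0.0.0 bad.example\n0.0.0.0 worse.example\n' > hosts.sample

# Emit one Logstash translate-dictionary line per entry; the first list is
# written with > and any later list would append with >>, per the comments above.
awk '/^0\.0\.0\.0 /{printf "\"%s\": \"malware\"\n", $2}' hosts.sample > malware.sample.yml
cat malware.sample.yml
```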
linuxmalaysia / logstash-filebeat-dns.conf
Last active April 10, 2022 16:13
This Logstash config is only for DNS Bind9. Filebeat on the Bind9 server reads the query log files and sends them to this port.
# This Logstash input is only for DNS Bind9. Filebeat on the Bind9 server reads the query log files and sends them to this port.
#### https://www.redpill-linpro.com/sysadvent/2015/12/15/rpz-malware-detection.html
#### For Bind9 RPZ log
#### TODO Need to put tags: [filebeat1] in filebeat and remove tags => in logstash input.
#### 29032019
input {
  beats {
    id => "server-filebeat-input"
    port => 5044   # standard Beats port, as used in the haproxy.cfg gist below
  }
}
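The TODO above, sketched from the Filebeat side (a minimal filebeat.yml fragment; the log path and Logstash host are assumptions):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/named/queries.log   # assumed Bind9 query-log path
    tags: ["filebeat1"]              # lets the Logstash input drop its tags => setting

output.logstash:
  hosts: ["logstash.example:5044"]   # hypothetical host
```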
linuxmalaysia / logstash-yml-shallalist.sh
Last active April 20, 2019 23:31
Convert shallalist to logstash yml
#!/bin/bash
# convert shallalist.tar.gz into logstash yml
# http://www.shallalist.de/categories.html
# Harisfazillah Jamel 30032019
# wget -c http://www.shallalist.de/Downloads/shallalist.tar.gz
# make sure it is uncompressed in the same directory as this script.
echo "localhost: localhost" > /etc/logstash/malware2.yml
find BL/ -name 'domains' -print0 |
while IFS= read -r -d $'\0' line; do
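A hedged sketch of what the truncated loop above presumably does with each `domains` file (directory layout assumed: `BL/<category>/domains`, one domain per line, with the category label taken from the parent directory name):

```shell
# Build a tiny stand-in for the shallalist layout:
mkdir -p BL/adv
printf 'ads.example\ntracker.example\n' > BL/adv/domains

# For each domains file, emit '"domain": "category"' YAML lines:
find BL/ -name 'domains' -print0 |
while IFS= read -r -d $'\0' line; do
  category=$(basename "$(dirname "$line")")
  awk -v c="$category" 'NF {printf "\"%s\": \"%s\"\n", $1, c}' "$line"
done > malware2.sample.yml
cat malware2.sample.yml
```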
linuxmalaysia / ansible-hardening.txt
Last active April 17, 2019 01:20
Ansible script for hardening, in two files: play.yml and requirements.yml
These Ansible steps harden a MariaDB server:
1) ansible-galaxy install -r requirements.yml
2) ansible-playbook play.yml
#### start of requirements.yml
#### ansible-galaxy install -r requirements.yml
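One plausible shape for the two files (the dev-sec MySQL/MariaDB hardening role is an assumption; substitute whatever Galaxy role the gist actually pins):

```yaml
# requirements.yml (role name assumed)
- src: dev-sec.mysql-hardening

# play.yml would then apply it, e.g.:
# ---
# - hosts: mariadb_servers
#   become: yes
#   roles:
#     - dev-sec.mysql-hardening
```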
linuxmalaysia / haproxy.cfg
Last active May 12, 2019 23:37
haproxy.cfg Haproxy For Elastic beats And Logstash
# Haproxy For Elastic beats And Logstash
# Date: 13 May 2019
# 1) Example haproxy.cfg listen section for Filebeat or other Beats on port 5044/tcp,
# load balancing across 2 servers.
# https://www.haproxy.com/blog/introduction-to-haproxy-logging/
# Please read the article above for the syslog configuration to listen on port 514
# Or change config log to
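The listen section described in point 1 could look like this (section name, server names, and addresses are hypothetical):

```
listen beats-5044
    bind *:5044
    mode tcp
    balance roundrobin
    server logstash1 192.0.2.11:5044 check
    server logstash2 192.0.2.12:5044 check
```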
linuxmalaysia / elasticsearch-handlers-main.yml
Created June 20, 2019 21:24 — forked from labrown/elasticsearch-handlers-main.yml
Ansible rolling restart of Elasticsearch Cluster
---
###
# Elasticsearch Rolling restart using Ansible
###
##
## Why is this needed?
##
#
# Even if you use a serial setting to limit the number of nodes processed at one
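The core of such a rolling restart, sketched as Ansible tasks using the `uri` module (task names, URL, and service name assumed): shard allocation is disabled before each node restarts and re-enabled once it rejoins, so the cluster does not rebalance mid-restart.

```yaml
- name: Disable shard allocation before restart
  uri:
    url: http://localhost:9200/_cluster/settings
    method: PUT
    body_format: json
    body:
      transient:
        cluster.routing.allocation.enable: "none"

- name: Restart Elasticsearch on this node
  service:
    name: elasticsearch
    state: restarted

- name: Re-enable shard allocation after the node rejoins
  uri:
    url: http://localhost:9200/_cluster/settings
    method: PUT
    body_format: json
    body:
      transient:
        cluster.routing.allocation.enable: "all"
```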
linuxmalaysia / process-blacklist-csv.sh
Created July 19, 2019 06:21
# convert shallalist.tar.gz into csv # http://www.shallalist.de/categories.html # Harisfazillah Jamel 30032019 # wget -c http://www.shallalist.de/Downloads/shallalist.tar.gz # https://www.squidblacklist.org/downloads/dg-malicious.acl (place it in BL/malware and rename the file to domains)
#!/bin/bash
# convert shallalist.tar.gz into csv
# http://www.shallalist.de/categories.html
# Harisfazillah Jamel 30032019
# wget -c http://www.shallalist.de/Downloads/shallalist.tar.gz
# https://www.squidblacklist.org/downloads/dg-malicious.acl (place it in BL/malware and rename the file to domains)
echo "\"localhost\",localhost" > malware3.tmp
find BL/ -name 'domains' -print0 |
while IFS= read -r -d $'\0' line; do
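The csv variant of the loop body, sketched under the same assumed layout (`BL/<category>/domains`, one domain per line, output line format taken from the echo above):

```shell
# Stand-in input file:
mkdir -p BL/malware
printf 'evil.example\n' > BL/malware/domains

# Header line first with >, then the loop appends '"domain",category' lines:
echo "\"localhost\",localhost" > malware3.sample.tmp
find BL/ -name 'domains' -print0 |
while IFS= read -r -d $'\0' line; do
  category=$(basename "$(dirname "$line")")
  awk -v c="$category" 'NF {printf "\"%s\",%s\n", $1, c}' "$line"
done >> malware3.sample.tmp
cat malware3.sample.tmp
```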
linuxmalaysia / logstash-config-example-searching-filebeat-index.txt
Created October 13, 2019 02:33
Example Logstash configuration file. Tested using Logstash 7.4.0, with Filebeat as input and Elasticsearch. Thanks to Amir Haris Ahmad, Localhost Sdn Bhd # for letting me use their test servers at Digital Ocean # and to his team for sharing their experience and views. # For me to test brute-force attack logs sent to…
### Thanks to Amir Haris Ahmad, Localhost Sdn Bhd
### for letting me use their test servers at Digital Ocean
### and to his team for sharing their experience and views.
###
### For me to test brute-force attack logs sent to syslog with fail2ban
###
### The server has fail2ban installed and SSH open on port 22.
### SSH does not allow password authentication, only digital certs.
###
### Filebeat is used to collect the logs.
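A quick way to eyeball the key field a Logstash filter would need from those fail2ban events (the log line below is hypothetical, shaped like fail2ban's actions-logger output):

```shell
# Hypothetical fail2ban NOTICE line for an SSH ban:
line='2019-10-13 02:33:00,123 fail2ban.actions[1234]: NOTICE [sshd] Ban 192.0.2.99'

# Pull out the banned IP, the field brute-force dashboards pivot on:
banned_ip=$(printf '%s\n' "$line" | grep -oE 'Ban [0-9.]+' | awk '{print $2}')
echo "$banned_ip"   # 192.0.2.99
```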