Wilmer A. maborak (maborak)
Benchmarking my internal AI
@maborak
maborak / Step 1: Compile Zabbix 5.4 in Ubuntu 21.10.md
Last active November 22, 2021 17:30
Compile Zabbix 5.4 in Ubuntu 21.10

Install required packages

  1. apt-get update
  2. apt-get install git libpcre2-8-0 wget build-essential automake pkg-config autoconf autogen vim libmysqlclient-dev libxml2-dev libsnmp-dev libssh2-1-dev libopenipmi-dev libevent-dev libcurl4-openssl-dev libpcre3-dev unixodbc-dev golang-go openjdk-17-jdk libldap2-dev libgnutls28-dev libmodbus-dev net-tools

Compile Zabbix from source

  1. wget https://cdn.zabbix.com/zabbix/sources/stable/5.4/zabbix-5.4.7.tar.gz
  2. tar xvfz zabbix-5.4.7.tar.gz
  3. cd zabbix-5.4.7
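
The preview cuts off here; the configure and build step typically follows. A minimal sketch, assuming a MySQL backend and the dev packages installed above (the flags below are my assumption, not the gist author's exact command):

  4. ./configure --enable-server --enable-agent --with-mysql --with-net-snmp --with-libcurl --with-libxml2 --with-ssh2 --with-unixodbc --with-ldap
  5. make
  6. make install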
@ines
ines / streamlit_prodigy.py
Created October 3, 2019 20:37
Streamlit + Prodigy
"""
Example of a Streamlit app for an interactive Prodigy dataset viewer that also lets you
run simple training experiments for NER and text classification.
Requires the Prodigy annotation tool to be installed: https://prodi.gy
See here for details on Streamlit: https://streamlit.io.
"""
import streamlit as st
from prodigy.components.db import connect
from prodigy.models.ner import EntityRecognizer, merge_spans, guess_batch_size
@NiceGuyIT
NiceGuyIT / README.md
Last active October 28, 2024 08:16
nginx JSON to Filebeat to Logstash to Elasticsearch

Intro

This is an example configuration that has nginx output JSON logs so they are easier for Logstash to process. I was trying to get nginx > Filebeat > Logstash > ES working, and it wasn't until I connected Filebeat directly to Elasticsearch that I saw the expected data. Google led me to ingest-convert.sh, and I realized that filebeat setup works for Filebeat > ES but not for Filebeat > Logstash > ES. This is because Logstash does not use ingest pipelines by default; you have to enable them in the elasticsearch output block, as shown below.
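
A minimal sketch of that output block with the Filebeat module's ingest pipeline enabled (the hosts value and index pattern are assumptions, not taken from this gist):

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    # Without this line Logstash indexes the raw event; with it, Elasticsearch
    # runs the ingest pipeline that "filebeat setup" installed.
    pipeline => "%{[@metadata][pipeline]}"
  }
}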

Having nginx log JSON in the format required for Elasticsearch means there's very little processing (i.e. grok) to be done in Logstash. nginx can only output JSON for access logs; the error_log format cannot be changed.
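
A minimal sketch of such a JSON access-log format (the field set here is an assumption; the gist's actual log_format likely carries more fields):

# escape=json requires nginx 1.11.8 or newer
log_format json_combined escape=json
  '{'
    '"time_local":"$time_local",'
    '"remote_addr":"$remote_addr",'
    '"request":"$request",'
    '"status":"$status",'
    '"body_bytes_sent":"$body_bytes_sent",'
    '"http_referer":"$http_referer",'
    '"http_user_agent":"$http_user_agent"'
  '}';
access_log /var/log/nginx/access.json json_combined;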

Extra fields are output that the Kibana dashboards do not use. I included them in case they might be useful. Since they are not declared in the filebeat setup, their default is "string" when yo…

@prkstaff
prkstaff / init.vim
Last active September 13, 2023 20:27
My NeoVim config + Dracula theme + NerdTree
"*****************************************************************************
"" Vim-PLug core
"*****************************************************************************
if has('vim_starting')
set nocompatible " Be iMproved
endif
" Path where the vim-plug plugin manager is expected to be installed
let vimplug_exists=expand('~/.config/nvim/autoload/plug.vim')
" Languages the vim-bootstrap sections of this config are set up for
let g:vim_bootstrap_langs = "javascript,php,python,ruby"