@kacole2
kacole2 / harbor.sh
Last active June 3, 2025 17:00
Quick Start Harbor Installation Script on Ubuntu 18.04
#!/bin/bash
#Harbor on Ubuntu 18.04
#Prompt the user to choose whether the install should use the IP address or the fully qualified domain name (FQDN) of the Harbor server
PS3='Would you like to install Harbor based on IP or FQDN? '
select option in IP FQDN
do
case $option in
IP)
    # Sketch of how each branch might continue; HARBOR_HOST is a hypothetical variable
    HARBOR_HOST=$(hostname -I | awk '{print $1}')
    break;;
FQDN)
    HARBOR_HOST=$(hostname -f)
    break;;
esac
done
@y0ngb1n
y0ngb1n / docker-registry-mirrors.md
Last active October 26, 2025 05:27
Docker Hub registry mirrors inside China, provided by domestic educational institutions and the major cloud vendors | Dockerized practice: https://github.com/y0ngb1n/dockerized
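Mirrors from a list like this are configured in the Docker daemon's `/etc/docker/daemon.json`. A minimal sketch (the mirror host below is a placeholder; substitute one of the gist's entries):

```json
{
  "registry-mirrors": ["https://<your-mirror-host>"]
}
```

After editing the file, restart the daemon (e.g. `sudo systemctl restart docker`) for the mirror to take effect; `docker info` should then list it under "Registry Mirrors".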
@Herz3h
Herz3h / install.md
Last active June 14, 2022 18:18
Locales alpine 3.9

Copy the locale.md file below into the same directory as your Dockerfile

FROM alpine:3.9

# Install language pack (glibc is required for the locale binaries on musl-based Alpine)
RUN apk --no-cache add ca-certificates wget && \
    wget -q -O /etc/apk/keys/sgerrand.rsa.pub https://alpine-pkgs.sgerrand.com/sgerrand.rsa.pub && \
    wget https://github.com/sgerrand/alpine-pkg-glibc/releases/download/2.25-r0/glibc-2.25-r0.apk && \
    apk add glibc-2.25-r0.apk
@rishiloyola
rishiloyola / gist:79f869749bf54d135f7f6fe61e0e99a7
Last active January 31, 2025 01:18
[ELK Stack] Generate TLS certs for filebeat and logstash
1. Generate a new domain name for the logstash server.

For this tutorial:
domain name = logstash-prod.xyz.com
ip = 1.2.3.4

* Enter the following directory:
```
$ sudo mkdir /etc/pki
$ cd /etc/pki
```
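The next step in such a tutorial is generating the certificate itself. A minimal sketch with openssl, using the tutorial's placeholder domain and IP (it writes to a local `pki/` directory rather than `/etc/pki` so it can run without root; modern Filebeat verifies the subjectAltName, which needs OpenSSL 1.1.1+ for `-addext`):

```shell
# Create the directory layout for the cert and key
mkdir -p pki/tls/certs pki/tls/private

# Generate a self-signed cert for Logstash that Filebeat can verify.
# CN and subjectAltName use the tutorial's placeholder values.
openssl req -x509 -nodes -newkey rsa:2048 -days 365 \
  -subj "/CN=logstash-prod.xyz.com" \
  -addext "subjectAltName=DNS:logstash-prod.xyz.com,IP:1.2.3.4" \
  -keyout pki/tls/private/logstash.key \
  -out pki/tls/certs/logstash.crt

# Sanity-check the subject of the generated cert
openssl x509 -in pki/tls/certs/logstash.crt -noout -subject
```

The `.crt` is what Filebeat's `ssl.certificate_authorities` points at; the `.key` stays on the Logstash host.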
@avthart
avthart / Vagrantfile
Last active December 24, 2019 10:18
Vagrantfile for running Harbor. https://goharbor.io/
# -*- mode: ruby -*-
# vi: set ft=ruby :
# This script to install Harbor will get executed after we have provisioned the box
$script = <<-SCRIPT
apt-get update
apt-get install -y docker.io docker-compose python
curl -s https://storage.googleapis.com/harbor-releases/release-1.7.0/harbor-online-installer-v1.7.1.tgz | tar zxv
cd harbor
# Remaining steps (sketch): run the bundled installer, then terminate the heredoc
./install.sh
SCRIPT
@abs51295
abs51295 / repo-list.txt
Last active July 10, 2020 15:08
Repository list containing Go packages used by OpenShift, as well as the repositories written in Go under the openshift organization.
https://github.com/urfave/cli
https://github.com/mreiferson/go-httpclient
https://github.com/crewjam/rfc5424
https://github.com/kubernetes/heapster
https://github.com/go-openapi/spec
https://github.com/andygrunwald/go-gerrit
https://github.com/openshift/ci-secret-mirroring-controller
https://github.com/fsnotify/fsnotify
https://github.com/BurntSushi/toml
https://github.com/kubernetes-csi/drivers
@ScienJus
ScienJus / sofa-consul-registry.md
Last active April 16, 2020 09:43
SOFA Consul Registry

Sofa Consul Registry

An implementation of the SOFA Consul Registry, covering the design of the Consul service registration/discovery API, the data model, health checks, and related concerns.

Consul API

Service registration

/v1/agent/service/register
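The registration endpoint accepts a JSON service definition via HTTP PUT to the local agent. A minimal sketch (the service name, port, and health-check URL are made up for illustration; a Consul agent must be listening on 127.0.0.1:8500 for the call to succeed):

```shell
# Hypothetical service definition with an HTTP health check
cat > service.json <<'EOF'
{
  "Name": "demo-rpc",
  "Port": 12200,
  "Check": {
    "HTTP": "http://127.0.0.1:12200/health",
    "Interval": "10s"
  }
}
EOF

# PUT the definition to the agent; Consul returns 200 with an empty body on success
curl -s -X PUT --data @service.json http://127.0.0.1:8500/v1/agent/service/register \
  || echo "no local Consul agent reachable"
```

Deregistration is the mirror-image call, `PUT /v1/agent/service/deregister/<service-id>`.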
#!/bin/bash
#k8setup script v1.2019.4.11
#RHEL or CentOS 7.4+
#Direct any questions to landon.key@gmail.com
#01101000 01110100 01110100 01110000 01110011 00111010 00101111 00101111 01101100 01100001 01101110 01100100 01101111 01101110 01101011 01100101 01111001 00101110 01100011 01101111 01101101 00101111
# Watch how it is used on YouTube: https://youtu.be/KWehrWGjkm4
#
@detiber
detiber / README.md
Last active October 24, 2024 05:56
Using CFSSL as an external CA for kubeadm

CFSSL as an external CA for non-HA kubeadm-initialized clusters

Using cfssl to Create an External CA Infrastructure

Install cfssl

# This requires an existing Go environment with GOPATH set
go get -u github.com/cloudflare/cfssl/cmd/...
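Once the binaries are installed, the CA is bootstrapped from a CSR config. A minimal sketch of that step (the CN and key parameters here are typical values, not taken from the gist):

```shell
# Hypothetical CA CSR config; adjust CN/key to your policy
cat > ca-csr.json <<'EOF'
{
  "CN": "kubernetes-ca",
  "key": { "algo": "rsa", "size": 2048 }
}
EOF

# Generate the self-signed CA cert and key (ca.pem, ca-key.pem) if cfssl is on PATH
if command -v cfssl >/dev/null 2>&1; then
  cfssl gencert -initca ca-csr.json | cfssljson -bare ca
else
  echo "cfssl not on PATH; install it first with the go get command above"
fi
```

The resulting `ca.pem`/`ca-key.pem` pair is what kubeadm is then pointed at (or, for a truly external CA, only `ca.pem` is copied to the cluster and all leaf certs are signed outside it).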
@NiceGuyIT
NiceGuyIT / README.md
Last active October 28, 2024 08:16
nginx JSON to Filebeat to Logstash to Elasticsearch

Intro

This is an example configuration that has nginx output JSON logs to make Logstash processing easier. I was trying to get nginx > Filebeat > Logstash > ES working, and it wasn't until I connected Filebeat directly to Elasticsearch that I saw the expected data. Google led me to ingest-convert.sh, and I realized `filebeat setup` works for Filebeat > ES but not for Filebeat > Logstash > ES. This is because Logstash does not use ingest pipelines by default; you have to enable them in the elasticsearch output block.

Having nginx log JSON in the format required for Elasticsearch means there's very little processing (i.e. grok) to be done in Logstash. nginx can only output JSON for access logs; the error_log format cannot be changed.
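A sketch of the kind of `log_format` this describes (the field set below is illustrative, not the gist's exact format; `escape=json` requires nginx 1.11.8 or newer):

```nginx
# Emit access logs as one JSON object per line, with values JSON-escaped
log_format json_combined escape=json
  '{'
    '"@timestamp":"$time_iso8601",'
    '"remote_addr":"$remote_addr",'
    '"request":"$request",'
    '"status":"$status",'
    '"body_bytes_sent":"$body_bytes_sent",'
    '"http_referer":"$http_referer",'
    '"http_user_agent":"$http_user_agent"'
  '}';

access_log /var/log/nginx/access.json json_combined;
```

With logs already in JSON, Filebeat only needs to decode each line; Logstash is left with little or no grok work.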

Extra fields are output and not used by the Kibana dashboards. I included them in case they might be useful. Since they are not declared in the filebeat setup, their default is "string" when yo