un1ko85 / pnorm.sql
Created Jul 28, 2022 — forked from olooney/pnorm.sql
PostgreSQL pnorm() function that calculates the c.d.f. of the normal (Gaussian) distribution. This function matches R's built-in pnorm() function to within +/- 2e-7 over the entire real line. However, it is constant at 1/0 above/below z = +7/-7.
CREATE OR REPLACE FUNCTION pnorm(z double precision) RETURNS double precision AS $$
SELECT CASE
    WHEN $1 >= 0 THEN 1 - POWER(((((((0.000005383*$1+0.0000488906)*$1+0.0000380036)*$1+0.0032776263)*$1+0.0211410061)*$1+0.049867347)*$1+1),-16)/2
    ELSE 1 - pnorm(-$1)
END;
$$ LANGUAGE SQL IMMUTABLE STRICT;
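For a quick sanity check outside the database, the same polynomial (the Abramowitz and Stegun 26.2.19 approximation the SQL above implements) can be sketched in Python; the helper names here are mine, not part of the gist:

```python
import math

# Port of the SQL function above: polynomial approximation of the
# standard normal c.d.f., mirrored around zero for negative z.
def pnorm(z: float) -> float:
    if z < 0:
        return 1.0 - pnorm(-z)
    poly = ((((((0.000005383 * z + 0.0000488906) * z + 0.0000380036) * z
               + 0.0032776263) * z + 0.0211410061) * z + 0.049867347) * z + 1)
    return 1.0 - poly ** -16 / 2.0

# Reference value via the error function: Phi(z) = (1 + erf(z / sqrt(2))) / 2
def pnorm_exact(z: float) -> float:
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

Comparing the two functions over a few z values is an easy way to confirm the coefficients were transcribed correctly before trusting the SQL version.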
un1ko85 / serp_heatmap.py
Created Jun 1, 2022 — forked from eliasdabbas/serp_heatmap.py
Create a heatmap of SERPs, using a table with columns: "keyword", "rank", and "domain"
import plotly.graph_objects as go
import pandas as pd

def serp_heatmap(df, num_domains=10, select_domain=None):
    df = df.rename(columns={'domain': 'displayLink',
                            'searchTerms': 'keyword'})
    top_domains = df['displayLink'].value_counts()[:num_domains].index.tolist()
    top_df = df[df['displayLink'].isin(top_domains) & df['displayLink'].ne('')]
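The top-domain selection above uses pandas (value_counts, isin); the same step can be sketched with only the standard library, which also makes the logic easy to verify. The sample rows here are made up for illustration:

```python
from collections import Counter

# Sample rows mimicking the expected table columns: keyword, rank, domain
rows = [
    {"keyword": "beads", "rank": 1, "domain": "example.com"},
    {"keyword": "beads", "rank": 2, "domain": "shop.example"},
    {"keyword": "wire",  "rank": 1, "domain": "example.com"},
    {"keyword": "wire",  "rank": 2, "domain": ""},
]

def top_domains(rows, num_domains=10):
    # Equivalent of value_counts()[:num_domains]: most frequent non-empty domains
    counts = Counter(r["domain"] for r in rows if r["domain"])
    return [d for d, _ in counts.most_common(num_domains)]

def filter_top(rows, num_domains=10):
    # Equivalent of the isin(top_domains) & ne('') filter
    top = set(top_domains(rows, num_domains))
    return [r for r in rows if r["domain"] in top]
```

Rows with an empty domain are dropped twice over: they never enter the counts, and they fail the membership filter.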
un1ko85 / README.md
Created Aug 29, 2020 — forked from NiceGuyIT/README.md
nginx JSON to Filebeat to Logstash to Elasticsearch

Intro

This is an example configuration that has nginx output JSON logs, making them easier for Logstash to process. I was trying to get nginx > Filebeat > Logstash > ES working, and it wasn't until I connected Filebeat directly to Elasticsearch that I saw the expected data. Google led me to ingest-convert.sh, and I realized filebeat setup works for Filebeat > ES but not for Filebeat > Logstash > ES. This is because Logstash does not use ingest pipelines by default; you have to enable them in the elasticsearch output block.
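Enabling the ingest pipeline in the Logstash elasticsearch output block looks roughly like this. A sketch, not from the gist: it assumes Filebeat publishes the pipeline name under @metadata, which is what Filebeat modules normally do; adjust the hosts and pipeline reference to your setup:

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # Hand each event to the ingest pipeline Filebeat would have used
    # when shipping directly to Elasticsearch.
    pipeline => "%{[@metadata][pipeline]}"
  }
}
```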

Having nginx log JSON in the format required for Elasticsearch means there's very little processing (i.e. grok) to be done in Logstash. nginx can only output JSON for access logs; the error_log format cannot be changed.
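A minimal JSON access-log format in nginx might look like the sketch below. It uses escape=json (available since nginx 1.11.8); the format name, file path, and field selection are illustrative, not taken from the gist:

```
log_format json_logs escape=json
  '{'
    '"time_local":"$time_local",'
    '"remote_addr":"$remote_addr",'
    '"request":"$request",'
    '"status":"$status",'
    '"body_bytes_sent":"$body_bytes_sent",'
    '"http_user_agent":"$http_user_agent"'
  '}';

access_log /var/log/nginx/access.json json_logs;
```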

Extra fields are output and not used by the Kibana dashboards. I included them in case they might be useful. Since they are not declared in the filebeat setup, their default mapping type is "string".

un1ko85 / docker-compose.yml
Created Aug 29, 2020 — forked from axw/docker-compose.yml
Docker Compose with Elastic Stack and APM Server 6.5.0
version: "2.1"
services:
  apm-server:
    image: docker.elastic.co/apm/apm-server:${STACK_VERSION:-6.5.0}
    ports:
      - "127.0.0.1:${APM_SERVER_PORT:-8200}:8200"
      - "127.0.0.1:${APM_SERVER_MONITOR_PORT:-6060}:6060"
    command: >
      apm-server -e
        -E apm-server.rum.enabled=true
View sales product .sql
-- Product sales with attributes
SELECT
  orders.created_at AS created_at,
  products.id,
  -- Categories
  (
    SELECT string_agg(c.title, ' / ')
    FROM hm_taxonomies_category_product c_p
    JOIN hm_taxonomies_categories c ON c_p.category_id = c.id
    WHERE c_p.product_id = products.id -- todo: order
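The "todo: order" comment can be resolved with an ORDER BY inside the aggregate, which PostgreSQL's string_agg supports. A sketch; the c.position column is hypothetical, not from the gist:

```sql
SELECT string_agg(c.title, ' / ' ORDER BY c.position) -- c.position is an assumed ordering column
FROM hm_taxonomies_category_product c_p
JOIN hm_taxonomies_categories c ON c_p.category_id = c.id
WHERE c_p.product_id = products.id
```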
View console_72.sql
-- Comparing current-year sales with the previous year
SELECT
  -- Categories
  (
    SELECT string_agg(c.title, ' / ')
    FROM hm_taxonomies_category_product c_p
    JOIN hm_taxonomies_categories c ON c_p.category_id = c.id
    WHERE c_p.product_id = s.product_id -- todo: order
  ) AS categories,
un1ko85 / README.md
Created Apr 23, 2020 — forked from mosquito/README.md
Add docker-compose as a systemd unit

Docker compose as a systemd unit

Create file /etc/systemd/system/docker-compose@.service

[Unit]
Description=%i service with docker compose
Requires=docker.service
After=docker.service
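The snippet above is cut off after the [Unit] section. A sketch of how such a template unit typically continues; the working directory, binary path, and options here are assumptions, not taken from the gist:

```
[Service]
Type=oneshot
RemainAfterExit=true
WorkingDirectory=/etc/docker-compose/%i
ExecStart=/usr/local/bin/docker-compose up -d --remove-orphans
ExecStop=/usr/local/bin/docker-compose down

[Install]
WantedBy=multi-user.target
```

With a layout like that, each project gets its own instance: place a docker-compose.yml under /etc/docker-compose/myproject/ and run systemctl enable docker-compose@myproject.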
View orders.sql
SELECT
  orders.created_at AS created_at,
  products.id AS product_id,
  products.article,
  (
    SELECT string_agg(c.title, ' | ')
    FROM hm_taxonomies_category_product c_p
    JOIN hm_taxonomies_categories c ON c_p.category_id = c.id
    WHERE c_p.product_id = products.id
  ) AS category,
  products.title AS product_title,
un1ko85 / products_reposts.sql
Last active Mar 31, 2019
Report of all product movements in the store
SELECT
  products.id AS id,
  products.title AS product_title,
  products.article,
  products.state,
  products.on_hand AS stock, -- quantity in base stocking units: seed beads in grams, beads in pieces or strands
  category.title AS category,
  products.sold AS sold, -- number of base product units sold
  (products.classes->>'revenue_class') || (products.classes->>'profit_class') || '/' || (products.classes->>'variance_class') AS revenue_class,
un1ko85 / pf.conf
Created May 21, 2017
pf.conf freebsd kiev server
#### First declare a couple of variables ####
### Outgoing tcp / udp port ####
### 43 - whois, 22 - ssh ###
tcp_services = "{ ssh, smtp, domain, www, https, 22, ntp, 43, 587}"
udp_services = "{ domain, ntp }"
### allow ping / pong ####
icmp_types = "{ echoreq, unreach }"
### Allow IP
enabled_ip = "{ 188.231.230.23, 188.231.222.49, 134.249.151.49 }"
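The macros above only declare values; the filter rules that would consume them are not shown. A sketch of typical rules using these exact macro names; the interface name and default-deny policy are assumptions, not from the gist:

```
ext_if = "em0"   # assumed external interface name

# Default deny inbound, then allow only what the macros declare
block in all
pass out on $ext_if proto tcp to any port $tcp_services keep state
pass out on $ext_if proto udp to any port $udp_services keep state
pass inet proto icmp all icmp-type $icmp_types keep state
pass in on $ext_if proto tcp from $enabled_ip to any port ssh keep state
```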