@pdarragh
pdarragh / papers.md
Last active April 17, 2024 19:29
Approachable PL Papers for Undergrads

On September 28, 2021, I asked on Twitter:

PL Twitter:

you get to recommend one published PL paper for an undergrad to read with oversight by someone experienced. the paper should be interesting, approachable, and (mostly) self-contained.

what paper do you recommend?

@xcambar
xcambar / nginx-cors.conf
Last active November 1, 2021 09:27 — forked from algal/nginx-cors.conf
nginx configuration for CORS (Cross-Origin Resource Sharing), with an origin whitelist, and HTTP Basic Access authentication allowed
#
# A CORS (Cross-Origin Resource Sharing) config for nginx
#
# == Purpose
#
# This nginx configuration enables CORS requests in the following way:
# - enables CORS just for origins on a whitelist specified by a regular expression
# - CORS preflight requests (OPTIONS) are answered immediately
# - Access-Control-Allow-Credentials=true for GET and POST requests
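#
# The configuration body is truncated in this listing; below is only a minimal
# sketch of the pattern described above. The whitelist regex, location, and
# backend name are illustrative assumptions, not taken from the gist.

location /api/ {
    # Capture the origin only if it matches the whitelist regex.
    if ($http_origin ~* (https?://(www\.)?example\.(com|org))) {
        set $cors_origin $http_origin;
    }

    # Headers for ordinary GET/POST requests; add_header skips empty values,
    # so non-whitelisted origins receive no CORS headers at all.
    add_header Access-Control-Allow-Origin      $cors_origin;
    add_header Access-Control-Allow-Credentials true;

    # Answer preflight requests immediately instead of passing them upstream.
    if ($request_method = OPTIONS) {
        add_header Access-Control-Allow-Origin      $cors_origin;
        add_header Access-Control-Allow-Credentials true;
        add_header Access-Control-Allow-Methods     "GET, POST, OPTIONS";
        add_header Access-Control-Allow-Headers     "Authorization, Content-Type";
        add_header Access-Control-Max-Age           1728000;
        return 204;
    }

    proxy_pass http://backend;
}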
@GABeech
GABeech / haproxy.cfg
Created August 21, 2014 18:35
Stack Exchange HAProxy
# This is an example of the Stack Exchange Tier 1 HAProxy config
# The only things that have been changed from what we are running are:
# 1. User names have been removed
# 2. All passwords have been removed
# 3. IPs have been changed to use the example/documentation ranges
# 4. Rate limit numbers have been changed to random numbers; don't read into them
userlist stats-auth
    group admin users $admin_user
    user $admin_user insecure-password $some_password
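
# A minimal sketch (not part of the gist excerpt above) of how such a userlist
# is typically wired to the HAProxy stats page; the bind address and port are
# illustrative assumptions.
listen stats
    bind 192.0.2.10:7000                    # documentation-range IP, placeholder only
    mode http
    stats enable
    stats uri /
    acl AUTH       http_auth(stats-auth)              # any valid user in the list
    acl AUTH_ADMIN http_auth_group(stats-auth) admin   # members of the admin group
    stats http-request auth unless AUTH                # prompt for Basic auth otherwise
    stats admin if AUTH_ADMIN                          # admin actions only for admins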
@denji
denji / nginx-tuning.md
Last active April 26, 2024 11:21
NGINX tuning for best performance

Moved to git repository: https://github.com/denji/nginx-tuning

NGINX Tuning For Best Performance

For this configuration you can use whichever web server you like; I decided to use nginx, since that is what I work with most.

Generally, a properly configured nginx can handle up to 400K to 500K requests per second (clustered); the most I have seen myself is 50K to 80K requests per second (non-clustered) at around 30% CPU load. That was on 2x Intel Xeon with Hyper-Threading enabled, but it also works without problems on slower machines.

Keep in mind that this configuration was used in a testing environment, not in production, so you will need to work out how best to apply each of these settings on your own servers.
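
Since the details now live in the linked repository, the following is only a rough sketch of the kind of directives this sort of tuning usually touches; the values are illustrative assumptions, not recommendations from the gist.

# Illustrative values only; tune per machine and workload.
worker_processes     auto;        # one worker per CPU core
worker_rlimit_nofile 100000;      # raise the per-worker open-file limit

events {
    worker_connections 4096;      # simultaneous connections per worker
    multi_accept       on;        # accept as many new connections as possible at once
}

http {
    sendfile           on;        # zero-copy file transfers
    tcp_nopush         on;        # send response headers and file start together
    tcp_nodelay        on;        # do not delay small writes on keep-alive connections
    keepalive_timeout  30;        # close idle keep-alive connections sooner
    gzip               on;        # compress responses to save bandwidth
}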

@jboner
jboner / latency.txt
Last active April 30, 2024 18:37
Latency Numbers Every Programmer Should Know
Latency Comparison Numbers (~2012)
----------------------------------
L1 cache reference                           0.5 ns
Branch mispredict                            5   ns
L2 cache reference                           7   ns                      14x L1 cache
Mutex lock/unlock                           25   ns
Main memory reference                      100   ns                      20x L2 cache, 200x L1 cache
Compress 1K bytes with Zippy             3,000   ns        3 us
Send 1K bytes over 1 Gbps network       10,000   ns       10 us
Read 4K randomly from SSD*             150,000   ns      150 us          ~1GB/sec SSD