
Reid Parham (parhamr)


Development Environment User Stories

Boilerplate Features and Behaviors for Engineering Needs

This document contains epics and their stories, which identify engineering needs that application development efforts should satisfy. It is written for systems engineers and takes the perspectives of software, front-end, and user experience engineers.

Epic: As an engineer, I can build a new development environment

As an engineer, I want to read a guide to familiarize myself with an application

Use the log output from wget to capture a list of URLs:

wget -a wget-log.txt -nv -r -l 0 \
  --ignore-tags=img,link,script -X admin -e robots=off \
  --header="Accept: text/html" --domains=example.org --no-cookies \
  http://example.org/

Process that log file to collect a sorted list of unique URLs:

grep -F 'URL:http' wget-log.txt | awk '{print $3}' | sed -e 's/^URL://' | sort -u > tmp.txt
parhamr / sysctl.conf
Last active August 29, 2015 14:05 — forked from colby/-
Boilerplate Linux Kernel tweaks for a webserver
fs.file-max=131072
kernel.shmall=32768
kernel.shmmax=536870912
net.core.netdev_max_backlog=4096
net.core.optmem_max=25165824
net.core.rmem_default=25165824
net.core.rmem_max=25165824
net.core.somaxconn=4096
net.core.wmem_default=65536
net.core.wmem_max=25165824
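
These settings belong in /etc/sysctl.conf and are applied (as root) with `sysctl -p`. A quick syntax check before applying — a sketch, with a two-line excerpt standing in for the full file:

```shell
# Validate key=value syntax before handing the file to sysctl.
# Apply for real (as root) with: sysctl -p /etc/sysctl.conf
conf=$(mktemp)
cat > "$conf" <<'EOF'
fs.file-max=131072
net.core.somaxconn=4096
EOF
# every non-blank, non-comment line must look like dotted.key=value
bad=$(grep -Evc '^([a-z0-9_.-]+=[0-9]+|#.*|)$' "$conf")
echo "invalid lines: $bad"
rm -f "$conf"
```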
parhamr / mariadb-10.0.10-global-status.md
Last active August 29, 2015 14:01
MySQL statistics for PHP upgrade research

MariaDB 10.0.10 global statistics

This test server has been running for 24.7 days as the data store for load testing Magento 1.13.1 EE against various PHP versions.

Per-minute averages

  • 5042 SELECTs
  • 2301 INSERTs
  • 176 COMMITs
  • 253 UPDATEs
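
Figures like these can be reproduced from the raw counters: divide each cumulative Com_* counter from SHOW GLOBAL STATUS by the server's uptime in minutes. A sketch with placeholder counter values (not the actual numbers behind this gist):

```shell
# Counters would normally come from:
#   mysql -e "SHOW GLOBAL STATUS LIKE 'Com_select'"
#   mysql -e "SHOW GLOBAL STATUS LIKE 'Uptime'"
uptime_seconds=2134080   # ~24.7 days of uptime
com_select=179000000     # placeholder cumulative SELECT count
selects_per_min=$(awk -v c="$com_select" -v up="$uptime_seconds" \
  'BEGIN { printf "%.0f", c / (up / 60) }')
echo "$selects_per_min SELECTs per minute"
```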
parhamr / opcache.ini
Last active March 14, 2016 23:15
Known working Zend OPcache configuration for PHP 5.5
; configuration for the PHP Zend OPcache module
; Tuned for Magento 1.13.1 on PHP 5.5
; Test server has 8 CPU cores and 32 GB RAM
zend_extension=opcache.so
[opcache]
opcache.memory_consumption=256
opcache.interned_strings_buffer=12
opcache.max_accelerated_files=16000
opcache.enable_file_override=1
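
A related sanity check: opcache.max_accelerated_files should exceed the number of PHP files the application can load, and Magento 1.x ships tens of thousands. A sketch, using a throwaway directory with three stub files in place of the real document root:

```shell
# Count PHP files under a docroot; compare against opcache.max_accelerated_files.
# A temp directory stands in for the Magento docroot here.
docroot=$(mktemp -d)
touch "$docroot/a.php" "$docroot/b.php" "$docroot/c.php"
count=$(find "$docroot" -name '*.php' | wc -l | tr -d ' ')
echo "PHP files: $count (opcache.max_accelerated_files=16000)"
rm -rf "$docroot"
```

(PHP rounds the configured max_accelerated_files value up to a nearby prime, so 16000 is effectively slightly higher.)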

Magento Scaling Hypotheses

Magento’s checkout throughput can increase to at least 8 times its current capacity, and up to roughly 26 times in ideal conditions.

  1. The current checkouts per hour limit for large, real-world Magento stores is 4,500
  2. This limit cannot effectively be increased with more and/or better hardware
  3. The improper type handling in Magento’s SQL code is the cause of the current limit
  4. If one SQL query is fixed, large Magento stores can scale to a new, real-world limit of 120,000 checkouts per hour
  5. For commodity hardware, this new limit might be 36,000 checkouts per hour
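
The multipliers implied by the hypotheses above, taking the 4,500 checkouts per hour baseline from item 1:

```shell
# Scale factors relative to the hypothesized 4,500/hour real-world limit
mult_commodity=$(awk 'BEGIN { printf "%.0f", 36000 / 4500 }')
mult_ideal=$(awk 'BEGIN { printf "%.1f", 120000 / 4500 }')
echo "commodity: 36000/hour is ${mult_commodity}x the baseline"
echo "ideal: 120000/hour is ${mult_ideal}x the baseline"
```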
parhamr / Magento1.12-SQL_type_example.md
Last active December 25, 2015 17:18
MySQL 5.6.11 skips indices when given quoted INTs

Removing the quotes from integer values compared against an INT column lets MySQL use the index, which can reduce the rows scanned by 99.997 percent. Example from Magento 1.12 on a production database, where the query takes nearly a second to execute:

mysql> explain DELETE FROM `catalog_product_index_price` WHERE entity_id IN('433284', 433283)\G;
*************************** 1. row ***************************
           id: 1
  select_type: SIMPLE
        table: catalog_product_index_price
         type: ALL
possible_keys: NULL
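
With the quotes removed, the same statement becomes eligible for the index on entity_id; a sketch of the corrected query (output omitted, since it depends on the table's data):

```sql
-- Unquoted INT literals keep the comparison in the column's native type
EXPLAIN DELETE FROM `catalog_product_index_price`
WHERE entity_id IN (433284, 433283);
```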
parhamr / CompetitionEngine.md
Created October 15, 2013 19:05
How to represent competitions in code

Given…

  1. Competitions of fixed time are scored on a measurement criterion
  2. Competitions of fixed measurement are scored on a time criterion

The criterion can be either highest score wins or lowest score wins

Measurements are typically distances (miles, km, etc.) or counts (arbitrary: pins in bowling, free throws in basketball, dollars in fundraising)

Override layers are needed for some conditions where special scoring is wanted. These can be tournaments, score handicaps, or even something playful like house rules.
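
A minimal sketch of this model: a competition pairs a scoring criterion with a win direction, and the winner falls out of an ordering. The `winner` helper and the entrant names are hypothetical:

```shell
# winner <high|low> "name:score" ...  -> prints the winning entrant's name
winner() {
  dir=$1; shift
  mod=n                           # lowest score wins: ascending numeric sort
  [ "$dir" = high ] && mod=nr     # highest score wins: descending numeric sort
  printf '%s\n' "$@" | sort -t: -k2,2"$mod" | head -n 1 | cut -d: -f1
}
winner high "alice:190" "bob:210"    # count criterion: most pins wins -> bob
winner low  "alice:48.2" "bob:49.8"  # time criterion: lowest time wins -> alice
```

Override layers (handicaps, house rules) would then wrap or replace this comparison rather than change the stored measurements.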

parhamr / MySQL data process.md
Last active December 25, 2015 03:49
Fast, safe (excepting schema changes), and ugly data transfer between MySQL databases
  1. Source host: mysqldump --skip-opt -e -q --compact --single-transaction -n -t --set-charset [$database_name] [$tables]
  2. Move the dump to destination host (scp)
  3. lock destination $tables
  4. truncate destination $tables
  5. `SET SESSION sql_mode='ALLOW_INVALID_DATES'; SET SESSION foreign_key_checks=0; SET SESSION unique_checks=0; SET SESSION autocommit=0;`
  6. Load dump as SQL source
  7. commit transaction
  8. unlock destination $tables
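
Steps 3 through 8 can be generated as a single destination-side SQL wrapper; a sketch, with `table_a`, `dump.sql`, and the database name as placeholders (SOURCE is a mysql client command, so run the file through the client):

```shell
# Generate the destination-side wrapper around the dump load.
# Run it there with: mysql $database_name < load-dump.sql
cat > load-dump.sql <<'EOF'
LOCK TABLES table_a WRITE;
TRUNCATE table_a;
SET SESSION sql_mode='ALLOW_INVALID_DATES';
SET SESSION foreign_key_checks=0;
SET SESSION unique_checks=0;
SET SESSION autocommit=0;
SOURCE dump.sql;
COMMIT;
UNLOCK TABLES;
EOF
lines=$(wc -l < load-dump.sql | tr -d ' ')
echo "wrapper statements: $lines"
rm -f load-dump.sql
```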