First I created 3 droplets on DigitalOcean with 4 cores and 8GB of RAM. Log in as root to each and run:
```shell
sysctl -w fs.file-max=12000500
sysctl -w fs.nr_open=20000500
ulimit -n 4000000
sysctl -w net.ipv4.tcp_mem='10000000 10000000 10000000'
```
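Note that `sysctl -w` settings don't survive a reboot (and `ulimit -n` only affects the current shell). If you want the limits to persist across droplet restarts, one way is to append them to `/etc/sysctl.conf` and reload; a sketch, assuming the stock config file locations:

```shell
cat >> /etc/sysctl.conf <<'EOF'
fs.file-max = 12000500
fs.nr_open = 20000500
net.ipv4.tcp_mem = 10000000 10000000 10000000
EOF
sysctl -p
```

The open-files limit for login sessions would similarly need an entry in `/etc/security/limits.conf` to persist.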
```elixir
defmodule Benchmarks do
  def init do
    :ok = :lbm_kv.create(Web.Job)
  end

  def count_entries do
    IO.puts "Web.Job => #{:lbm_kv.match_key(Web.Job, :_) |> elem(1) |> Enum.count}"
  end

  # Times fun with :timer.tc and reports items per second
  def measure_throughput(fun, num_items) do
    {microseconds, _result} = :timer.tc(fun)
    num_items / (microseconds / 1_000_000)
  end
end
```
```ruby
# from https://github.com/Shopify/graphql-batch/blob/c84a5236ecb1c38fa275c76ee38017f30b7074be/examples/association_loader.rb
# This handles creating generic association loaders (see team_member_graph.rb for an example)
class AssociationLoader < GraphQL::Batch::Loader
  def self.validate(model, association_name)
    new(model, association_name)
    nil
  end

  def initialize(model, association_name)
    @model = model
    @association_name = association_name
  end
end
```
```elixir
defmodule UserController do
  import Ecto.Changeset

  def create(params) do
    changeset =
      %User{}
      |> cast(params, [:name, :email])
      |> validate_required([:email])
      |> validate_other_stuff()

    WithSideEffects.create(changeset)
  end
end
```
```elixir
defmodule GOL do
  def encode(list) do
    encode(nil, list)
  end

  # Figure out the starting condition, whether we start "on" or "blank"
  def encode(nil, [1 | rest]), do: encode({:on, 1, 1}, rest)
  def encode(nil, rest), do: encode({:blank, 1}, rest)

  # If we have a sequence of "on" cells we need to know when it started and whether we are consecutive
  # with the last cell
end
```
These are my solutions to the three concurrency exercises from Katrina Owen's Go post.
I'm posting them here with example output in the hope that someone can tell me how to do it better.
I wanted to run another round of performance benchmarks for gnat
to see how its request throughput has changed with the introduction of the ConsumerSupervisor,
which handles things like processing each request in its own supervised process.
I used a CPU-optimized DigitalOcean droplet with 16 cores, gnatsd 1.3.0, Erlang 21.2.2, and Elixir 1.8.0-rc.0.
You can read the setup instructions below for more details; results_by_concurrency.md
contains details about many different runs.

I'm trying to measure the overhead in the system, so the requests are random byte strings that just get echoed back without any processing. The measurements use byte strings from 4 bytes up to 1024 bytes.
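The request loop described above can be sketched with gnat's `Gnat.start_link/1` and `Gnat.request/3`. This is only a sketch of the measurement setup, not the actual benchmark script: it assumes a local gnatsd on the default port and an "echo" subscriber (hypothetical subject name) that replies with the request body unchanged.

```elixir
# Connect to a local gnatsd on the default port (assumption: server is running)
{:ok, conn} = Gnat.start_link(%{host: "127.0.0.1", port: 4222})

# One request per payload size; payloads are random byte strings,
# and we pattern-match that the echoed reply body is identical
for size <- [4, 16, 64, 256, 1024] do
  payload = :crypto.strong_rand_bytes(size)
  {:ok, %{body: ^payload}} = Gnat.request(conn, "echo", payload)
end
```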
The main idea here is to compose shared functionality into a pipeline of functions that all implement some shared behaviour.

```elixir
defmodule Notification.Event do
  # The event (probably a bad name) is where you would put the structified JSON event you got from RabbitMQ
  defstruct [:sent_at, :user, :event]
end

defmodule Notification.Plug do
  @callback init(opts) :: opts
```
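A runner for such a pipeline could simply reduce the event over a list of plug modules. A minimal sketch, assuming the (truncated) behaviour above also declares a `call/2` callback; `Notification.Pipeline` and `Notification.StampSentAt` are hypothetical names, not part of the original code:

```elixir
defmodule Notification.Pipeline do
  # Threads the event through each plug's call/2 in order;
  # a plug can stop the pipeline by returning {:halt, event}
  def run(event, plugs) do
    Enum.reduce_while(plugs, event, fn {plug, opts}, acc ->
      case plug.call(acc, plug.init(opts)) do
        {:halt, halted} -> {:halt, halted}
        next -> {:cont, next}
      end
    end)
  end
end

defmodule Notification.StampSentAt do
  # Example plug: stamps the event with the current time
  def init(opts), do: opts
  def call(event, _opts), do: %{event | sent_at: DateTime.utc_now()}
end
```

Usage would look like `Notification.Pipeline.run(event, [{Notification.StampSentAt, []}])`, with each shared concern (deduplication, rate limiting, delivery) living in its own plug module.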
| type | language | indeed.com postings | stackoverflow postings |
|---|---|---|---|
| functional | erlang | 199 | 12 |
| functional | elixir | 293 | 33 |
| functional | clojure | 429 | 56 |
| functional | haskell | 356 | 17 |
| functional | f# | 126 | 10 |
| functional | akka | 532 | 31 |
| functional | functional reactive programming | 810 | 899 |
| both | scala | 5260 | 189 |
| both | javascript | 33347 | 1201 |