@dommmel
Last active August 15, 2018 12:39
Parallelizing long-running requests with Celluloid
require "celluloid"
require "benchmark"
require "open-uri"

delay_seconds = [4, 4, 4, 4, 4, 4]
BASE_URL = "http://slowapi.com/delay"

class Crawler
  include Celluloid

  # Fetch a URL that takes `delay` seconds to respond.
  def read(delay)
    url = "#{BASE_URL}/#{delay}"
    open(url) { |x| x.read }
  end
end

# One actor per request, so all requests can run concurrently.
pool = Crawler.pool(size: delay_seconds.length)

result = nil
time = Benchmark.measure do
  crawlers = delay_seconds.map do |delay|
    begin
      pool.future(:read, delay)
    rescue DeadActorError, MailboxError
      nil
    end
  end
  # Calling #value blocks until each future resolves; failed requests yield nil.
  result = crawlers.compact.map { |crawler| crawler.value rescue nil }
end

# result.each_with_index { |response, i| puts "Response ##{i}: #{response}" }
puts time
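For comparison, the same fan-out pattern can be sketched with plain standard-library threads, without Celluloid. This is an illustrative sketch, not the gist's method: the slow request is simulated with sleep, and the delay values are made up. Thread#value joins each thread and returns its block's result, playing the role of future.value above.

```ruby
require "benchmark"

delays = [1, 1, 1]
results = nil

elapsed = Benchmark.realtime do
  threads = delays.map do |delay|
    Thread.new do
      sleep(delay)            # stand-in for the slow HTTP request
      "slept #{delay}s"
    end
  end
  # Thread#value blocks until the thread finishes, like future.value.
  results = threads.map(&:value)
end

puts "#{results.length} responses in #{elapsed.round(2)}s"
```

Because the sleeps overlap, the elapsed time is close to the longest single delay (about 1 s here) rather than the 3 s a serial loop would take.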