@lucasrcosta
Last active February 20, 2019 08:51
Tornado Handler Thread Decorator
import types

from concurrent.futures import ThreadPoolExecutor
from datetime import timedelta

from tornado import gen, web
from tornado.concurrent import run_on_executor

THREADPOOL_MAX_WORKERS = 10
THREADPOOL_TIMEOUT_SECS = 30


def onthread(function):
    """Run a handler method on the handler's thread pool, with a timeout."""
    @gen.coroutine
    def decorated(self, *args, **kwargs):
        future = executed(self, *args, **kwargs)
        try:
            response = yield gen.with_timeout(
                timedelta(seconds=THREADPOOL_TIMEOUT_SECS), future)
            if isinstance(response, types.GeneratorType):  # handler fans out to subthreads
                response = yield gen.with_timeout(
                    timedelta(seconds=THREADPOOL_TIMEOUT_SECS),
                    next(response))
        except gen.TimeoutError as exc:
            future.cancel()
            raise exc
        self.write(response)

    # Proxy that runs the wrapped handler method on self.executor.
    @run_on_executor
    def executed(*args, **kwargs):
        return function(*args, **kwargs)

    return decorated
# Usage
class BaseHandler(web.RequestHandler):
    # One shared pool per handler class; run_on_executor looks it up
    # on the instance as `self.executor`.
    executor = ThreadPoolExecutor(max_workers=THREADPOOL_MAX_WORKERS)
    ...


class MyHandler(BaseHandler):
    @onthread
    def get(self):
        ...
        return response


class MyOtherHandler(BaseHandler):
    @onthread
    def get(self):
        yield self.coroutine()

    @gen.coroutine
    def coroutine(self):
        data = yield {
            'one': self.threaded_one(),
            'two': self.threaded_two(),
        }
        return data

    @run_on_executor
    def threaded_one(self):
        ...
        return response

    @run_on_executor
    def threaded_two(self):
        ...
        return response
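For completeness, here is a minimal sketch of how handlers like these could be mounted in a Tornado application. The route paths and port are assumptions for illustration, not part of the gist:

# Hypothetical wiring, assuming the Tornado 4.x/5.x IOLoop API.
import tornado.ioloop
import tornado.web

def make_app():
    return tornado.web.Application([
        (r"/mine", MyHandler),        # blocking work moved onto the thread pool
        (r"/other", MyOtherHandler),  # fans out to two threaded calls
    ])

if __name__ == "__main__":
    app = make_app()
    app.listen(8888)  # arbitrary example port
    tornado.ioloop.IOLoop.current().start()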
@JulienParis

Hi! I really wanted to thank you for this decorator, you saved me a lot of headaches!

I'm working on an open-source scraper webapp based on Tornado and Scrapy, and launching spiders as background tasks was a real pain point until I tried your solution.
You can see your decorator added here, and the repo is there.

This open-source scraper project is still in its early stages and it's my first experience with Tornado, so you'll certainly see some weird stuff in my code... but at least I know your part works great!
