@hakib
Last active November 23, 2023 23:04
CountTimeoutLimitPaginator - Paginator that enforces a timeout on the count operation.
from django.core.paginator import Paginator
from django.db import OperationalError, connection, transaction
from django.utils.functional import cached_property


class TimeLimitedPaginator(Paginator):
    """
    Paginator that enforces a timeout on the count operation.
    When the timeout is reached, a "fake" large value is returned instead.
    Why does this hack exist? On every admin list view, Django issues a
    COUNT on the full queryset. There is no simple workaround. On big tables,
    this COUNT is extremely slow and makes things unbearable. This solution
    is what we came up with.
    """

    @cached_property
    def count(self):
        # Set the timeout inside a transaction so it only applies locally
        # and does not affect other transactions on the same connection.
        with transaction.atomic(), connection.cursor() as cursor:
            # Timeout in milliseconds (PostgreSQL).
            cursor.execute('SET LOCAL statement_timeout TO 200;')
            try:
                return super().count
            except OperationalError:
                return 9999999999
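
For readers wondering how this plugs into the admin: a paginator class like this is typically assigned per ModelAdmin. A minimal sketch of the hookup (BigTable and BigTableAdmin are hypothetical names; show_full_result_count=False additionally skips the extra full COUNT the changelist runs when filters are active):

    from django.contrib import admin

    from .models import BigTable  # hypothetical model with many rows
    from .paginators import TimeLimitedPaginator  # wherever the class above lives


    @admin.register(BigTable)
    class BigTableAdmin(admin.ModelAdmin):
        # Use the time-limited count for changelist pagination.
        paginator = TimeLimitedPaginator
        # Avoid the second full-table COUNT shown next to filtered results.
        show_full_result_count = False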
@maheshgawali

maheshgawali commented Dec 6, 2019

Thanks for this @hakib
Modified it to use with the MySQL backend

class TimeLimitedPaginator(Paginator):
    """
    Paginator that enforces a timeout on the count operation.
    When the timeout is reached, a "fake" large value is returned instead.
    Why does this hack exist? On every admin list view, Django issues a
    COUNT on the full queryset. There is no simple workaround. On big tables,
    this COUNT is extremely slow and makes things unbearable. This solution
    is what we came up with.
    """

    @cached_property
    def count(self):
        # We set the timeout in a db transaction to prevent it from
        # affecting other transactions.
        with transaction.atomic(), connection.cursor() as cursor:
            # timeout in milliseconds
            cursor.execute('SET SESSION MAX_EXECUTION_TIME=2000;')
            try:
                count = super().count
            except OperationalError:
                count = 9999999999
            finally:
                # reset to default timeout
                cursor.execute('SET SESSION MAX_EXECUTION_TIME=0;')
            return count

@hakib
Author

hakib commented Dec 6, 2019

Hey @maheshgawali, very cool!

To be on the safe side, better to move the second execute to a finally clause.

@maheshgawali

Oh yes, updated, thanks for the pointer 👍

@browniebroke

Just a note/question: it doesn't work on our Django deployment; it causes other admin queries to time out.

From a quick investigation, we suspect it's because we have ATOMIC_REQUESTS enabled. Under such a setup, with transaction.atomic() does NOT open a new transaction; instead it creates a savepoint, which is not enough to isolate the local statement timeout.

Maybe something to mention as a requirement of this approach in the associated blog post?
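
For anyone who hasn't seen that setting: ATOMIC_REQUESTS is enabled per database alias in settings.py, roughly like this (the connection details here are placeholders):

    # settings.py -- connection details are placeholders
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': 'mydb',
            # Wrap each request in a transaction; transaction.atomic() inside
            # a view then only creates a savepoint, as noted above.
            'ATOMIC_REQUESTS': True,
        }
    }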

@hakib
Author

hakib commented Apr 6, 2020

Hey @browniebroke, it makes sense that it won't work with savepoints, as you mentioned. I actually never saw this setting being used, interesting. I wonder what the use case is for setting atomic requests to true.

My suggestion to you is to fetch the current timeout before the count, set the shorter one, and then revert to the previous value (preferably in a finally clause).
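
A rough sketch of that suggestion for PostgreSQL (my own reading of it, not tested against an ATOMIC_REQUESTS setup): fetch the current statement_timeout, set a short one at the session level, wrap the count in atomic() so a timeout only rolls back a savepoint, and restore the old value in finally. Parameter handling for SET may vary by driver; this assumes psycopg2-style client-side interpolation.

    from django.core.paginator import Paginator
    from django.db import OperationalError, connection, transaction
    from django.utils.functional import cached_property


    class TimeLimitedPaginator(Paginator):
        @cached_property
        def count(self):
            with connection.cursor() as cursor:
                # Remember the current session-level timeout so we can restore it.
                cursor.execute('SHOW statement_timeout;')
                previous_timeout = cursor.fetchone()[0]
                cursor.execute('SET statement_timeout TO 200;')
                try:
                    # The savepoint from atomic() is rolled back if the COUNT
                    # gets cancelled, keeping any outer transaction usable.
                    with transaction.atomic():
                        return super().count
                except OperationalError:
                    return 9999999999
                finally:
                    # Restore whatever was configured before.
                    cursor.execute('SET statement_timeout TO %s;', [previous_timeout])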

@mwschall

mwschall commented Sep 2, 2020

I dunno what changed along the way, or where exactly it's entering a transaction higher up the call stack in my project, but this wouldn't work for me until I moved the try/except outside of with transaction.atomic(). The original code catches the error inside the transaction, preventing an automatic rollback. Otherwise it's quite elegant for a (literally) quick solution to another one of Django's admin woes!

# https://medium.com/@hakibenita/optimizing-django-admin-paginator-53c4eb6bfca3
class TimeoutPaginator(Paginator):
    @cached_property
    def count(self):
        try:
            with transaction.atomic(), connection.cursor() as cursor:
                cursor.execute('SET LOCAL statement_timeout TO 200;')
                return super().count
        except OperationalError:
            return 9999999999

@hakib
Author

hakib commented Sep 2, 2020

The SET LOCAL command only affects the current transaction. It shouldn't matter where the try/except is, because when the function ends the transaction ends as well, either because an exception was raised or because the context manager exited. But if it's working for you, stick with it 😉

@mwschall

mwschall commented Sep 2, 2020

Aye, Postgres is handy in that way, but with the try/except inside, Django wasn't rolling back the savepoint. So any outer transaction was entering an aborted state due to the timeout error, and that was causing stuff down the line to fail... I think. Honestly I don't understand the inner workings in question, but as you say, if it works it works~

@formacube

Hi,

Thanks for this @hakib
Modified it to use with the MySQL backend

            cursor.execute('SET SESSION MAX_EXECUTION_TIME=2000;')

ARG........
I now realize, after many lost hours, that the main reason I kept receiving strange errors I couldn't interpret is that

cursor.execute('SET LOCAL statement_timeout TO 200;')

was not recognized by my SQLite database!

I have been searching the internet for a way to implement a query execution timeout on SQLite, without any success.

Could one of you tell me how I could do that?
Thanks in advance

@hakib
Author

hakib commented Nov 16, 2020

Hey @formacube,

As far as I can tell, SQLite does not provide a way to limit the time of a single query the way MySQL and PostgreSQL do. The closest I could find is this.

If you come up with a solution for SQLite please share ;)
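
In case it helps, one option I've seen for SQLite (an assumption on my part, not necessarily what the link above describes) is the sqlite3 progress handler, which lets you interrupt a statement after a rough time budget:

    import time

    from django.core.paginator import Paginator
    from django.db import OperationalError, connection
    from django.utils.functional import cached_property


    class TimeLimitedPaginator(Paginator):
        """Rough sketch for SQLite: abort the COUNT after roughly 200 ms."""

        @cached_property
        def count(self):
            connection.ensure_connection()
            raw = connection.connection  # underlying sqlite3.Connection
            deadline = time.monotonic() + 0.2

            def bail_out():
                # Returning a non-zero value makes SQLite interrupt the query,
                # which surfaces as an OperationalError ("interrupted").
                return 1 if time.monotonic() > deadline else 0

            # Call the handler every N SQLite virtual machine instructions.
            raw.set_progress_handler(bail_out, 10000)
            try:
                return super().count
            except OperationalError:
                return 9999999999
            finally:
                # Remove the handler so it doesn't affect later queries.
                raw.set_progress_handler(None, 10000)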
