@hjhart
Last active March 18, 2019 09:51

Identifying Expensive PostgreSQL Queries
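
These queries rely on the pg_stat_statements extension. If it is not already enabled, load it at server start and install it in the target database first (a minimal setup sketch; the exact postgresql.conf location varies by installation):

-- In postgresql.conf (takes effect after a server restart):
--   shared_preload_libraries = 'pg_stat_statements'

-- Then, in the database you want to inspect:
CREATE EXTENSION IF NOT EXISTS pg_stat_statements;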
-- Top queries by cumulative execution time. total_time is reported in
-- milliseconds, so divide by 1000 * 60 * 60 (= 3,600,000) to get hours.
select substr(query, 1, 250) as query,
       calls,
       to_char(total_time / (1000 * 60 * 60), '999,999,999') as "Cumulative Time (hrs)",
       rows,
       to_char(total_time / calls, '999.999') as per_call_ms
from pg_stat_statements
order by total_time desc
limit 12;
To zero the counters and start a fresh measurement window:

select pg_stat_statements_reset();
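Note that pg_stat_statements_reset() discards all accumulated statistics at once and, by default, can only be called by a superuser.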
Sample output from the first query:

 query                                                                   |   calls   | Cumulative Time (hrs) |   rows   | per_call_ms
-------------------------------------------------------------------------+-----------+-----------------------+----------+-------------
 SELECT "posts".* FROM "posts" WHERE "posts"."crush_id" = ? LIMIT ?      | 256078900 |                 1,510 | 35940202 |      21.234
 UPDATE "posts" SET "state" = ?, "updated_at" = ? WHERE "posts"."id" = ? |  48610033 |                 1,086 | 48610033 |      80.449
 ...                                                                     |       ... |                   ... |      ... |         ...
(12 rows)
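
Cumulative time surfaces the queries doing the most total work, but a statement can also be cheap in aggregate yet slow on every call. A variant ordered by per-call cost (a sketch; the calls > 100 threshold is an arbitrary filter for one-off statements, and on PostgreSQL 13+ the columns are named total_exec_time / mean_exec_time rather than total_time / mean_time):

-- Top queries by average time per call.
select substr(query, 1, 250) as query,
       calls,
       to_char(mean_time, '999,990.999') as mean_ms,
       rows
from pg_stat_statements
where calls > 100  -- arbitrary threshold: skip rarely-run statements
order by mean_time desc
limit 12;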
@angapov commented Mar 18, 2019:

to_char(total_time/(60*60), '999,999,9999,999') as "Cumulative Time (hrs)", rows,

total_time is in milliseconds, so to convert it to hours you should divide by 1000 * 60 * 60, not just 60 * 60.
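
As a sanity check against the first row of the sample output above:

select 21.234 * 256078900 / (1000 * 60 * 60.0) as hours;
-- 21.234 ms/call * 256,078,900 calls ≈ 5.44e9 ms ≈ 1,510 hours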
