
Analyzing slow queries

Rohan Jay edited this page Sep 19, 2018 · 4 revisions

Queries that take longer than one second to execute are logged on all of the RDS databases, because log_min_duration_statement is set to 1000 (ms). The duration threshold can be changed through the AWS console.
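The threshold can also be changed with aws-cli instead of the console. A sketch, assuming the instances use a custom DB parameter group — the group name "fec-prod-params" below is a placeholder, not the real one; find the actual name on the instance's Configuration tab:

```shell
# Sketch: lower the slow-query threshold to 500 ms.
# "fec-prod-params" is a hypothetical parameter group name --
# substitute the group actually attached to the instance.
aws rds modify-db-parameter-group \
  --db-parameter-group-name fec-prod-params \
  --parameters "ParameterName=log_min_duration_statement,ParameterValue=500,ApplyMethod=immediate"
```

log_min_duration_statement is a dynamic PostgreSQL parameter, so ApplyMethod=immediate takes effect without rebooting the instance.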

Install aws-cli by following the instructions in AWS Command Line Interface.

Install pgBadger using Homebrew, apt, etc.

Download the logs from AWS using a command of the form:

aws rds download-db-log-file-portion --db-instance-identifier fec-govcloud-prod-replica-1 --log-file-name "error/postgresql.log.2017-12-04-01" --output text > postgresql.log.2017-12-04-01.log
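The exact log-file names vary by date and hour. Before downloading, the files available on an instance can be listed with the standard aws-cli subcommand (using the same instance identifier as the example above):

```shell
# List the PostgreSQL log files available on the replica,
# with their sizes and last-written timestamps.
aws rds describe-db-log-files \
  --db-instance-identifier fec-govcloud-prod-replica-1
```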

Alternatively, use the following loop to download all of the logs for a given day:

for i in $(seq -f "%02g" 0 23)
do
    aws rds download-db-log-file-portion --db-instance-identifier fec-govcloud-prod-replica-1 --log-file-name "error/postgresql.log.2018-09-12-${i}" --output text > postgresql.log.replica-1.2018-09-12-${i}.log
    aws rds download-db-log-file-portion --db-instance-identifier fec-govcloud-prod-replica-2 --log-file-name "error/postgresql.log.2018-09-12-${i}" --output text > postgresql.log.replica-2.2018-09-12-${i}.log
done
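The seq -f "%02g" 0 23 in the loop generates the zero-padded hour suffixes (00 through 23) that RDS appends to each hourly log file:

```shell
# Generate the 24 zero-padded hour suffixes used in the loop above.
seq -f "%02g" 0 23
```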

Analyze the downloaded logs using pgBadger:

pgbadger --prefix '%t:%r:%u@%d:[%p]:' *.log

This creates an HTML report named out.html.
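The report name and parsing speed can be controlled with pgBadger's -o/--outfile and -j/--jobs options; a sketch (the output filename is just an example):

```shell
# Write the report to a named file and parse the logs with 4 parallel jobs.
pgbadger --prefix '%t:%r:%u@%d:[%p]:' -j 4 -o slow-queries-2018-09-12.html *.log
```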