Assuming you have an AWS account and a Load Balancer that's logging to S3, the only other things you'll need are:
- GoAccess (NOTE: scroll down; you don't have to build from source)
- A ~/.goaccessrc file to map the log format

Create the file ~/.goaccessrc:
```
# AWS S3 Log Format
# REF: https://docs.aws.amazon.com/AmazonS3/latest/userguide/LogFormat.html
date-format %d/%b/%Y
log-format %^ %^ [%d:%^] %h %^ %^ %^ %^ "%^ %r %^" %s %^ %b %^ %^ %^ "%^" "%u" %^

# AWS Elastic Load Balancer Log Format
log-format %dT%t.%^ %^ %h:%^ %^ %T %^ %^ %^ %s %^ %b "%r" "%u"
date-format %Y-%m-%d
time-format %H:%M:%S
```

GoAccess only honors one log-format/date-format pair per run, so comment out the block you aren't using.
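If you'd rather skip the rc file entirely, the same ELB format can be supplied as command-line flags; a minimal sketch using the format above (adjust if your load balancer's log variant differs):

```bash
# same ELB format, supplied as flags instead of ~/.goaccessrc
cat *.log | goaccess -a \
    --log-format='%dT%t.%^ %^ %h:%^ %^ %T %^ %^ %^ %s %^ %b "%r" "%u"' \
    --date-format='%Y-%m-%d' \
    --time-format='%H:%M:%S'
```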
Copy some logs from the S3 bucket to your local machine. This example demonstrates recursively copying all the logs from July 2023.
```bash
mkdir s3logs && cd s3logs
aws s3 cp s3://<bucket>/<prefix>/2023/07/ ./ --recursive
```
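If you expect to re-run the download, `aws s3 sync` is a drop-in alternative that skips files you already have locally (same hypothetical bucket and prefix):

```bash
# sync only downloads objects that are missing (or changed) locally
aws s3 sync s3://<bucket>/<prefix>/2023/07/ ./
```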
Concatenate the logs and pipe them to GoAccess for the interactive terminal dashboard, or add `-o` to write an HTML report:

```bash
find . -exec cat {} \; | goaccess -a
find . -exec cat {} \; | goaccess -a -o report_2023-07.html
```
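To build one report per month, a small sketch; it assumes each month was copied into its own local directory (e.g. ./2023/07/) rather than flattened into the current directory:

```bash
# one HTML report per month directory (hypothetical local layout)
for month in 07 08 09; do
    find "./2023/$month" -type f -exec cat {} \; \
        | goaccess -a -o "report_2023-$month.html"
done
```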
GoAccess ships with a predefined CLOUDFRONT log format, so no rc-file entry is needed for CloudFront access logs. Download the gzipped logs from S3 as above, then decompress them into a single file:

```bash
# download the logs from s3, then decompress them into one file
gunzip -c *.gz >> cloudfront.log
goaccess --log-format CLOUDFRONT cloudfront.log
```
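You can also stream the gzipped logs straight into GoAccess without the intermediate file; the report name here is just an example:

```bash
# "-" tells goaccess to read the log from stdin
gunzip -c *.gz | goaccess --log-format CLOUDFRONT -a -o cloudfront_report.html -
```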
If desired, you can list and sort the log files recursively with the following.
```bash
# paths sort lexically; ELB log names embed a timestamp, so this is chronological
find . -type f -name "*.log" | sort
```
Feeding that sorted list to GoAccess:

```bash
find . -type f -name "*.log" | sort \
    | while read -r line; do cat "$line"; done \
    | goaccess -a
```
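The read loop is equivalent to a shorter `xargs` pipeline (assuming none of the paths contain whitespace):

```bash
find . -type f -name "*.log" | sort | xargs cat | goaccess -a
```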
Finally, a script that automates the last few steps: it pulls the ten most recent log files from S3, decompresses them into a single logs.log, and removes the downloaded archives afterwards.

```bash
#!/usr/bin/env bash
# start fresh; -f so a missing file isn't an error
rm -f logs.log
s3_path='s3://bucket/path/to/logs/'
# `aws s3 ls` prints: date time size filename; field 4 is the filename
while read -r line; do
    file=$(awk '{print $4}' <<< "$line")
    aws s3 cp "${s3_path}${file}" .
    gunzip -c "$file" >> logs.log
    rm "$file"
done < <(aws s3 ls "$s3_path" | tail -n 10)
```
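Save the script (the name fetch_logs.sh here is arbitrary), make it executable, and point GoAccess at the combined file:

```bash
chmod +x fetch_logs.sh
./fetch_logs.sh
goaccess -a logs.log   # uses the ELB format from ~/.goaccessrc
```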