@pearsonhenri
Created October 1, 2022 13:20
AWS S3 CloudWatch log export consolidation
from io import BytesIO
from gzip import GzipFile

import boto3

# assumes AWS authentication via ~/.aws/credentials, environment variables, or some other mechanism
s3_resource = boto3.resource('s3')
s3_client = boto3.client('s3')  # yes, we really do want both

s3_bucket_name = 'my-log-export-bucket'
s3_bucket_resource = s3_resource.Bucket(s3_bucket_name)
log_export_prefix = 'prefix-you-specified-in-export/<UUID of export job>'
local_log_file_name = 'my_consolidated_logfile.log'

for obj in s3_bucket_resource.objects.filter(Prefix=log_export_prefix):
    # use the client here so the exported object is never written to disk
    logfile = s3_client.get_object(Bucket=s3_bucket_name, Key=obj.key)
    bytestream = BytesIO(logfile['Body'].read())
    text = GzipFile(None, 'rb', fileobj=bytestream).read().decode('utf-8')
    lines = text.split('\n')
    with open(local_log_file_name, 'a') as outfile:
        for line in lines:
            if '<specific filter string>' in line:
                outfile.write(f'{line}\n')
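For reference, the "<UUID of export job>" part of the prefix is the task ID that CloudWatch Logs returns when the export is created; the exported objects are written under <destinationPrefix>/<taskId>/. A rough sketch of creating such an export with boto3 (log group name, bucket, and time range below are placeholders, not values from this gist):

from datetime import datetime, timezone

import boto3

logs_client = boto3.client('logs')

# create_export_task expects timestamps in milliseconds since the epoch
start_ms = int(datetime(2022, 9, 1, tzinfo=timezone.utc).timestamp() * 1000)
end_ms = int(datetime(2022, 9, 30, tzinfo=timezone.utc).timestamp() * 1000)

response = logs_client.create_export_task(
    taskName='my-log-export',              # placeholder task name
    logGroupName='/my/log/group',          # placeholder log group
    destination='my-log-export-bucket',    # the bucket read above
    destinationPrefix='prefix-you-specified-in-export',
    to=end_ms,
    **{'from': start_ms},  # 'from' is a Python keyword, so pass it via unpacking
)
print(response['taskId'])  # exported objects appear under <destinationPrefix>/<taskId>/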
@bethylamine

Line 18 should use Prefix=log_export_prefix (without the quotes) right?

gzip.BadGzipFile: Not a gzipped file (b'Pe')

It looks like the first two bytes were not the gzip magic number but rather the start of some plain text, e.g. "Permission denied".
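If some objects under the export prefix are not actually gzipped (for example, a plain-text marker or error object like the one producing b'Pe' above), one workaround is to check for the gzip magic bytes before decompressing and skip anything that doesn't have them. A rough sketch of how the loop body in the gist could guard against this, reusing the obj and logfile names from the script above:

raw = logfile['Body'].read()
# gzip data always starts with the two magic bytes 0x1f 0x8b;
# skip any object under the prefix that is not actually gzip-compressed
if not raw.startswith(b'\x1f\x8b'):
    print(f'skipping non-gzip object: {obj.key}')
    continue
text = GzipFile(None, 'rb', fileobj=BytesIO(raw)).read().decode('utf-8')

Catching gzip.BadGzipFile around the decompression would work just as well on Python 3.8+.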
