@vadivelselvaraj
Last active November 10, 2022 04:20
CompactParquetFiles
# Read the S3 folder of parquet files as a Glue DynamicFrame
input_dyf = glueContext.create_dynamic_frame_from_options(
    "s3",
    {
        "paths": [inputPath],
        "recurse": True,
        "groupFiles": "inPartition"
    },
    format="parquet"
)
# Repartition into the desired number of output files
repartitionedDYF = input_dyf.repartition(numberOfPartitions)

# Write them out as glueparquet to boost write performance.
# Note: glueparquet output is compatible with standard parquet and can be
# read by any tool that reads parquet files.
glueContext.write_dynamic_frame.from_options(
    frame=repartitionedDYF,
    connection_type="s3",
    connection_options={"path": outputPath},
    format="glueparquet"
)
@Nabeel-Khan-Ghauri
As per the AWS documentation (https://docs.aws.amazon.com/glue/latest/dg/grouping-input-files.html), groupFiles is supported for DynamicFrames created from the csv, ion, grokLog, json, and xml formats; it is not supported for avro, parquet, or orc. Did your code work?
