
@crizant
Created February 7, 2020 17:31
Node.js: handling the DynamoDB batch write limit
const R = require('ramda')
const aws = require('aws-sdk')
const batchWriteToDb = async (tableName, data) => {
  const dynamoDb = new aws.DynamoDB({
    maxRetries: 999
  })
  const docClient = new aws.DynamoDB.DocumentClient({
    service: dynamoDb,
    convertEmptyValues: true
  })
  // DynamoDB batchWrite accepts at most 25 items per request,
  // so split the data into segments of 25
  const dataSegments = R.splitEvery(25, data)
  for (const segment of dataSegments) {
    const params = {
      RequestItems: {
        [tableName]: segment.map(item => ({
          PutRequest: {
            Item: item
          }
        }))
      }
    }
    try {
      let response = await docClient.batchWrite(params).promise()
      // batchWrite may return unprocessed items (e.g. when throttled);
      // keep retrying until none remain
      while (!R.isEmpty(response.UnprocessedItems)) {
        const count = response.UnprocessedItems[tableName].length
        console.log(`${count} unprocessed item(s) left, retrying...`)
        const retryParams = {
          RequestItems: response.UnprocessedItems
        }
        response = await docClient.batchWrite(retryParams).promise()
      }
    } catch (error) {
      console.log(error, error.stack)
    }
  }
}
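
For reference, a minimal usage sketch is below. The table name ('my-table') and the item shape (an 'id' partition key plus a 'createdAt' attribute) are assumptions for illustration only; substitute your own table and data, and make sure AWS credentials and region are configured before running.

// Hypothetical usage: write 100 example items to an assumed table 'my-table'.
// Item attributes and the table name are placeholders, not part of the gist above.
const items = Array.from({ length: 100 }, (_, i) => ({
  id: `item-${i}`,        // assumed partition key
  createdAt: Date.now()
}))

batchWriteToDb('my-table', items)
  .then(() => console.log('all segments written'))
  .catch(err => console.error(err))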