Created October 29, 2020 03:53
A test Lambda function to be integrated into a Redshift cluster
# https://aws.amazon.com/blogs/big-data/accessing-external-components-using-amazon-redshift-lambda-udfs/
import json
import requests  # provided by the Lambda layer referenced below

def lambda_handler(event, context):
    ret = dict()
    try:
        res = list()
        myurl = 'http://some_site.com/InboundDetails.asmx/GetDetails?destination='
        # Redshift invokes a Lambda UDF with a batch of rows in event["arguments"];
        # one result must be returned for every input row, in order.
        for row in event["arguments"]:
            number = str(row[0])
            page = requests.get(myurl + number[-10:])
            res.append(page.content.decode('utf-8'))
        ret['success'] = True
        ret['results'] = res
    except Exception as e:
        ret['success'] = False
        ret['error_msg'] = str(e)
    return json.dumps(ret)
# Layer: arn:aws:lambda:us-east-1:770693421928:layer:Klayers-python38-requests:9
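The handler can be exercised locally by invoking it with a payload shaped like Redshift's batch format (an `"arguments"` array of rows). The sketch below is a minimal local harness, assuming a stubbed lookup function (`fake_lookup` is hypothetical, standing in for the HTTP call so the test runs offline):

```python
import json

def fake_lookup(number):
    # Hypothetical stand-in for the real HTTP request; returns a canned string.
    return "details-for-" + number

def handler(event, context=None):
    # Mirrors the gist's handler shape, with the network call stubbed out.
    ret = {}
    try:
        # One result per input row, in the same order Redshift sent them.
        res = [fake_lookup(str(row[0])[-10:]) for row in event["arguments"]]
        ret["success"] = True
        ret["results"] = res
    except Exception as e:
        ret["success"] = False
        ret["error_msg"] = str(e)
    return json.dumps(ret)

# Redshift sends one row per element of "arguments".
event = {"arguments": [["919812345678"], ["918765432109"]]}
out = json.loads(handler(event))
print(out["success"], len(out["results"]))  # True 2
```

Running this confirms the response contains exactly as many results as input rows, which is what Redshift requires from a Lambda UDF.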
Redshift should ensure batches are small enough not to exceed the limit.
There is a Lambda invocation response payload limit of 6 MB. Will I get an error if the results array is too large?
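One defensive option is to serialize the response and check its size before returning it, so the function fails with a clear error instead of a generic Lambda payload rejection. The sketch below is an assumption-laden illustration (the `safe_return` helper and the exact headroom are not part of the gist); the 6 MB figure is AWS's documented synchronous response limit:

```python
import json

MAX_RESPONSE_BYTES = 6 * 1024 * 1024  # Lambda's 6 MB synchronous response limit

def safe_return(ret):
    # Serialize first, then measure the encoded size before returning.
    body = json.dumps(ret)
    if len(body.encode("utf-8")) > MAX_RESPONSE_BYTES:
        # Return a well-formed error payload Redshift can surface to the caller.
        return json.dumps({"success": False,
                           "error_msg": "response exceeds 6 MB Lambda limit"})
    return body

small = safe_return({"success": True, "results": ["ok"]})
print(json.loads(small)["success"])  # True
```

The handler's final `return json.dumps(ret)` could be replaced with `return safe_return(ret)` to get this behavior.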