AWS Lambda – aws-sdk s3.getObject() size limit
I am trying to process a 5 GB TIFF file with sharp in a Lambda function,
but the s3.getObject() request fails with the following error:
{
  "errorType": "RangeError",
  "errorMessage": "The value \"5479395327\" is invalid for option \"size\"",
  "trace": [
    "RangeError [ERR_INVALID_OPT_VALUE]: The value \"5479395327\" is invalid for option \"size\"",
    "    at Function.alloc (buffer.js:370:3)",
    "    at Object.alloc (/var/runtime/node_modules/aws-sdk/lib/util.js:136:28)",
    "    at Object.concat (/var/runtime/node_modules/aws-sdk/lib/util.js:175:28)",
    "    at Request.HTTP_DONE (/var/runtime/node_modules/aws-sdk/lib/event_listeners.js:424:36)",
    "    at Request.callListeners (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:106:20)",
    "    at Request.emit (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:78:10)",
    "    at Request.emit (/var/runtime/node_modules/aws-sdk/lib/request.js:688:14)",
    "    at IncomingMessage.onEnd (/var/runtime/node_modules/aws-sdk/lib/event_listeners.js:334:26)",
    "    at IncomingMessage.emit (events.js:326:22)",
    "    at IncomingMessage.EventEmitter.emit (domain.js:483:12)"
  ]
}
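From the stack trace, it looks like the SDK concatenates the whole response body into a single Buffer, and Buffer.alloc rejects any size above Node's per-Buffer cap (buffer.constants.MAX_LENGTH), which is smaller than this 5 GB file on the Lambda runtime. A quick check that seems to reproduce the error (5479395327 is just the Content-Length from the failed request):

import { constants } from 'buffer';

// Node caps a single Buffer at buffer.constants.MAX_LENGTH
// (for example 2147483647 bytes on many 64-bit builds).
console.log(constants.MAX_LENGTH);

// Asking for more than the cap throws the same RangeError as above.
Buffer.alloc(5479395327);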
code:
import * as AWS from 'aws-sdk';
import { GetObjectRequest } from 'aws-sdk/clients/s3';

const s3 = new AWS.S3();

const getObjectRequest: GetObjectRequest = {
  Bucket: bucket,
  Key: filename
};
const s3File = await s3.getObject(getObjectRequest).promise();
I've set the Lambda function's memory to 10,240 MB and its timeout to 15 minutes (the maximum).
Is there any other way to load the image into sharp?
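One variant I've sketched (untested) would stream the object to the function's ephemeral storage instead of buffering it in memory, so no single Buffer ever has to hold the whole body, and then point sharp at the file on disk. The /tmp path is hypothetical, and this assumes the ephemeral storage has been sized above its 512 MB default to fit the 5 GB file:

import { createWriteStream } from 'fs';
import { pipeline } from 'stream';
import { promisify } from 'util';
import * as AWS from 'aws-sdk';
import sharp from 'sharp';

const pipelineAsync = promisify(pipeline);
const s3 = new AWS.S3();

// Stream the object to disk in chunks rather than buffering it in memory.
const localPath = '/tmp/input.tif'; // hypothetical path
await pipelineAsync(
  s3.getObject({ Bucket: bucket, Key: filename }).createReadStream(),
  createWriteStream(localPath)
);

// Let sharp/libvips open the file from disk instead of from a Buffer;
// limitInputPixels: false disables sharp's pixel-count guard for huge images.
const image = sharp(localPath, { limitInputPixels: false });

Would that work, or is there a better pattern for files this large?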