

You can process images uploaded to Amazon S3 using a simple Python script running in AWS Lambda. Amazon S3 can publish events to AWS Lambda and invoke your Lambda function, passing the event data as a parameter. To set this up as an end-to-end experience you need to grant two permissions: you give your Lambda function permission to read from one bucket and write into another bucket, and you give the S3 bucket permission to invoke the respective Lambda function. This way every image saved to the S3 bucket is processed and its compressed forms are created.

An important point to note is that you must use two buckets. If you use the same bucket as both the source and the target, each processed image uploaded to the source bucket triggers another object-created event, which then invokes the Lambda function again, creating an unwanted recursion. If you want to keep your images in a single bucket, you can save the original image to the second bucket (the one that holds your processed images) while processing it in the same Lambda function.

You upload a deployment package as the Lambda function. It is basically a zipped folder. The resize script looks like this (the target bucket name is a placeholder; use your own):

```python
from __future__ import print_function
import StringIO

import boto3
from PIL import Image

s3_client = boto3.client('s3')
s3_connection = boto3.resource('s3')
# Target bucket for the processed images (name is a placeholder).
target_bucket = 'tt-resize'
# Longest-edge sizes to generate.
list1 = [50, 100, 200, 400, 800, 1600]


def create_image(x, y, im, ext, key):
    size = (x, y)
    print(size)
    key = key + str(x) + str(y) + '.' + ext
    return im.resize(size), key


def upload_resized(resized_image, new_key, fmt):
    # Serialize the resized image and upload it to the target bucket.
    buf = StringIO.StringIO()
    resized_image.save(buf, format=fmt)
    s3_resized_object = s3_connection.Object(target_bucket, new_key)
    s3_resized_object.put(ACL='authenticated-read', Body=buf.getvalue())


def handler(event, context):
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']
        s3_object = s3_connection.Object(bucket, key)
        response = s3_object.get()
        split_key = key.split('.')

        # Perform the resize operation.
        stream = StringIO.StringIO(response['Body'].read())
        image = Image.open(stream)
        x, y = image.size
        if x > y:
            for single_size in list1:
                try:
                    if single_size < x:
                        single_size1 = int(single_size * float(y) / x)
                        if single_size1 != 0:
                            resized_image, new_key = create_image(
                                single_size, single_size1, image,
                                split_key[1], split_key[0])
                            upload_resized(resized_image, new_key, image.format)
                except IOError:
                    print('Error')
        else:
            for single_size in list1:
                try:
                    if single_size < y:
                        single_size1 = int(single_size * float(x) / y)
                        if single_size1 != 0:
                            resized_image, new_key = create_image(
                                single_size1, single_size, image,
                                split_key[1], split_key[0])
                            upload_resized(resized_image, new_key, image.format)
                except IOError:
                    print('Error')
```

To create a deployment package, create a directory, put your Lambda function file in that folder, install all library dependencies into the same folder, and create a zip of that folder. Your deployment package is ready. You can also use Node.js or Java. AWS Lambda supports Python 2.7, not Python 3. You can follow this link for more info: Step 2.1: Create a Deployment Package.

The next step is to create the Lambda function and upload the deployment package:

```shell
aws lambda create-function \
  --region us-west-2 \
  --function-name ImageProcess \
  --zip-file fileb://file-path/ImageProcess.zip \
  --role role-arn \
  --handler ImageProcess.handler \
  --runtime python2.7 \
  --profile adminuser \
  --timeout 10 \
  --memory-size 1024
```

Here you need to set the role and profile. In my case the runtime is Python.

Finally, add a permission to the Lambda function's access permission policy so that S3 can invoke it:

```shell
aws lambda add-permission \
  --function-name ImageProcess \
  --region us-west-2 \
  --statement-id some-unique-id \
  --action lambda:InvokeFunction \
  --principal s3.amazonaws.com \
  --source-arn arn:aws:s3:::sourcebucket \
  --source-account bucket-owner-account-id \
  --profile adminuser
```

Now whenever an image is uploaded to your bucket, its processed images will be uploaded to the other bucket. Hope this will help you.
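The aspect-ratio arithmetic in the resize loop can be sketched in isolation. This is a hypothetical helper for illustration, not part of the deployed function: given the original dimensions and a target longest edge, it scales the shorter side proportionally.

```python
def target_size(width, height, longest_edge):
    # Scale the image so its longer side equals longest_edge,
    # preserving the aspect ratio. Hypothetical helper, not in the
    # original Lambda script.
    if width > height:
        return longest_edge, int(longest_edge * float(height) / width)
    return int(longest_edge * float(width) / height), longest_edge

# A 1600x800 landscape image scaled to a 400-pixel longest edge:
print(target_size(1600, 800, 400))  # -> (400, 200)
```

Note that the shorter side can round down to zero for extreme aspect ratios, which is why the Lambda code guards with a `!= 0` check before resizing.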