I am working with the following AWS services:

  • S3
  • Lambda
  • SNS

Recently I worked on a specific Lambda to report a step of a process. Basically, I have some images to process; once the job is done, those images are put into a bucket. This can easily be tracked through the S3 event system: you set up a trigger event linked to a Lambda or an SNS topic, and then you have all the information you need.

This is what I have done so far; however, I wanted it to be more global, more general, because I wanted to report in different ways depending on the resource. This is why I worked on a model with a logicless Lambda, meaning my Lambda was only in charge of taking a DynamoDB query as a parameter and running it. This let me reuse the same Lambda for multiple purposes.

How it works

I know that metadata is definitely not designed for this, but I thought it could be a useful trick to use it this way.


  1. Generate your file
  2. Attach your data through one or more metadata key/value pairs, encoded as JSON
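The steps above can be sketched like this. The helper name, the bucket/key values, and the "report" metadata key are assumptions made for the example, not part of my actual setup; S3 metadata values must be strings, which is why the payload is JSON-encoded:

```javascript
// Hypothetical helper: bundles the payload to report as one JSON-encoded
// metadata entry. S3 stores each entry under an x-amz-meta-<key> header,
// and values must be strings, hence JSON.stringify.
function buildPutParams(bucket, key, body, reportData) {
  return {
    Bucket: bucket,
    Key: key,
    Body: body,
    Metadata: {
      // "report" is an arbitrary key chosen for this sketch
      report: JSON.stringify(reportData)
    }
  };
}

// With the AWS SDK for JavaScript (v2), assuming `aws-sdk` is installed:
// const AWS = require('aws-sdk');
// const s3 = new AWS.S3();
// s3.putObject(
//   buildPutParams('examplebucket', 'processed/img-42.jpg', imageBuffer,
//                  { jobId: 'abc-123', step: 'resize-done' }),
//   function (err, data) { /* ... */ });
```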

I worked with an SNS topic triggered by an S3 ObjectCreated:Put event.
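For reference, the bucket-to-topic wiring can be described with a notification configuration like the one below. The topic ARN is a placeholder, not my real one:

```javascript
// Hypothetical notification setup: route ObjectCreated:Put events on the
// bucket to an SNS topic. The ARN below is a placeholder.
const notificationParams = {
  Bucket: 'examplebucket',
  NotificationConfiguration: {
    TopicConfigurations: [{
      TopicArn: 'arn:aws:sns:eu-west-1:123456789012:image-processed',
      Events: ['s3:ObjectCreated:Put']
    }]
  }
};

// With the AWS SDK for JavaScript (v2):
// s3.putBucketNotificationConfiguration(notificationParams, callback);
```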

The Lambda is defined as below:

  1. Receive an SNS event
  2. Parse the associated S3 records
  3. Fetch each associated file and read its metadata accordingly
  4. Decode the attached JSON data
  5. Run the query
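Here is a sketch of that handler under stated assumptions: the function names, the "report" metadata key, and the idea that the decoded payload is directly usable as a DynamoDB request are illustrative choices, not my exact code. Steps 2 and 4 are pure parsing; steps 3 and 5 need the AWS SDK and are shown in comments:

```javascript
// Step 2: pull the S3 records out of an SNS-wrapped event. Each SNS record
// carries the S3 event as a JSON string in Sns.Message.
function extractS3Records(snsEvent) {
  return snsEvent.Records
    .map(function (r) { return JSON.parse(r.Sns.Message); })
    .reduce(function (all, s3Event) { return all.concat(s3Event.Records); }, []);
}

// Step 4: decode the JSON payload attached under the (assumed) "report"
// metadata key.
function decodeReport(metadata) {
  return JSON.parse(metadata.report);
}

// Steps 1, 3 and 5, assuming the AWS SDK for JavaScript (v2) is available:
// const AWS = require('aws-sdk');
// const s3 = new AWS.S3();
// const dynamodb = new AWS.DynamoDB.DocumentClient();
//
// exports.handler = function (event, context, callback) {
//   const record = extractS3Records(event)[0];
//   s3.headObject({ Bucket: record.s3.bucket.name, Key: record.s3.object.key },
//     function (err, data) {
//       if (err) return callback(err);
//       const query = decodeReport(data.Metadata); // e.g. { TableName: ..., Item: ... }
//       dynamodb.put(query, callback);             // run the query carried by the file
//     });
// };
```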

Be careful: this works well when you are dealing with internal files. I do not recommend doing this with public files. As I said before, metadata is not designed for this; it is normally intended to be extra data attached to a piece of content in order to describe it. Nevertheless, in some cases it can help and make things easier.

Note that you do not necessarily need to fetch your files locally; a headObject call is enough to read the metadata entries of a file.

/* The following example retrieves an object's metadata. */

 var params = {
  Bucket: "examplebucket", 
  Key: "HappyFace.jpg"
 };
 s3.headObject(params, function(err, data) {
   if (err) console.log(err, err.stack); // an error occurred
   else     console.log(data);           // successful response
   /*
   data = {
    AcceptRanges: "bytes", 
    ContentLength: 3191, 
    ContentType: "image/jpeg", 
    ETag: "\"6805f2cfc46c0f04559748bb039d69ae\"", 
    LastModified: <Date Representation>, 
    Metadata: {
    }, 
    VersionId: "null"
   }
   */
 });

The idea behind this is that when you put data into an S3 bucket, you are not always forced to log it as part of your process. In other words, if you want to track files in your bucket once the operation is done, it is not mandatory to add extra sugar to make a query and report it elsewhere. You can instead, as I said, attach metadata with your required parameters, then use the S3 event linked to a Lambda and do the job passively. That's one way to do it.