How to turn an unreadable stream into a readable stream

Hey,

Today, as usual, I was playing around with S3. Sometimes you are downloading stuff, often you are uploading stuff as well. I was putting a bunch of objects into buckets, and I got an unusual source stream. I needed to turn that stream into a readable stream, and to be honest I did not have a straightforward solution off the top of my head, so I dug a little bit and found a quite interesting solution.

Node.js File System and Stream APIs

Node.js has native file system and stream APIs that let you do a huge variety of things with streams: you can play with them, transform them, and pipe them around to do smart stuff.
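As a tiny illustration of the "transform through a pipe" part, here is a minimal sketch (the file names input.txt and output.txt are just placeholders) using the native stream.Transform class:

const fs = require("fs")
const { Transform } = require("stream")

// Uppercase every chunk flowing through the pipe
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase())
  },
})

fs.createReadStream("input.txt")
  .pipe(upperCase)
  .pipe(fs.createWriteStream("output.txt"))
Piping a file through a Transform stream (a sketch)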

The classic thing we are used to doing with streams is the following: you have a readStream which defines our source, and a writeStream which defines our destination.
See the following example using the File System native module from Node.js: https://nodejs.org/api/fs.html

const fs = require("fs")
const p = require("phin")

const main = async () => {
  const output = "/path/to/file/output.txt"
  const readStream = await p({
    url: "https://example.com/18e977f5-2663-4f97-aa69-2cdcf1f18a30",
    followRedirects: true,
    stream: true,
  })
  const writeStream = fs.createWriteStream(output)
  readStream.pipe(writeStream)
  console.log(`File saved to ${output}`)
}
Downloading a remote file locally

These are classic things we are used to doing. The same applies when working with S3: downloading stuff and uploading it back directly to S3 can be done this way.

const p = require("phin")

const main = async () => {
  const readStream = await p({
    url: "https://example.com/18e977f5-2663-4f97-aa69-2cdcf1f18a30",
    followRedirects: true,
    stream: true,
  })
  const s3 = new AWS.S3()
  const file = { Bucket: BUCKET_NAME, Key: FILENAME, Body: readStream }
  s3.upload(file, (err, data) => {
    if (err) {
      return console.log("Error while uploading your file")
    }
    console.log(`File saved to ${output}`)
  })
}
Transferring a remote file to S3

The vast majority of the time, your stream will be a readStream if you control the way you are getting it. However, as I said, sometimes you could be using a third-party lib which does not return a readable stream, and you won't be able to transfer your file to S3. This is what I faced recently, and you'll get the following error.

.../node_modules/aws-sdk/lib/s3/managed_upload.js:422
    var buf = self.body.read(self.partSize - self.partBufferLength) ||
                        ^

TypeError: self.body.read is not a function
    at ManagedUpload.fillStream (.../node_modules/aws-sdk/lib/s3/managed_upload.js:422:25)
    at Stream.<anonymous> (.../node_modules/aws-sdk/lib/s3/managed_upload.js:192:28)
    at Stream.emit (events.js:223:5)
    at Stream.EventEmitter.emit (domain.js:475:20)
    at Stream.handleEnd (.../node_modules/duplexer/index.js:81:21)
    at Stream.emit (events.js:228:7)
    at Stream.EventEmitter.emit (domain.js:475:20)
    at Stream.end (...)
    at _end (.../node_modules/through/index.js:65:9)
    at Stream.stream.end (.../node_modules/through/index.js:74:5)
Error when stream is not a readable stream
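What happens here is that the SDK's ManagedUpload pulls data from the body by calling stream.read(), a method only proper Readable streams expose; old-style streams like the ones produced by through or duplexer only emit data events and have no read() method. A quick way to spot the problem (just a small sketch, not part of any SDK):

// Rough check: old-style streams don't implement read()
const looksReadable = (stream) =>
  stream && typeof stream.read === "function"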

In order to solve this, you have to turn your output stream into a readable stream. I got leads and solutions from @chrisradek here:

The role of stream Passthrough In S3 Upload? · Issue #2100 · aws/aws-sdk-js

The solution is simple, as the native stream module provides a PassThrough stream API; more info here: https://nodejs.org/api/stream.html

Usage:

const { PassThrough } = require("stream")

// Pipe the non-readable stream through a PassThrough, which is a proper readable stream
const asReadable = (unreadableStream) => {
  const pass = new PassThrough()
  return unreadableStream.pipe(pass)
}
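Plugged into the earlier upload, this looks roughly like the following sketch. It assumes the same BUCKET_NAME and FILENAME as before, and someThirdPartyLib() is a made-up stand-in for whatever library hands you the non-readable stream:

const { PassThrough } = require("stream")
const AWS = require("aws-sdk")

const main = async () => {
  // someThirdPartyLib() stands in for the lib returning an old-style stream
  const unreadableStream = someThirdPartyLib()

  // Pipe it through a PassThrough so the SDK gets a proper readable stream
  const pass = new PassThrough()
  unreadableStream.pipe(pass)

  const s3 = new AWS.S3()
  s3.upload({ Bucket: BUCKET_NAME, Key: FILENAME, Body: pass }, (err, data) => {
    if (err) {
      return console.log("Error while uploading your file")
    }
    console.log(`File saved to ${data.Location}`)
  })
}

main()
Uploading a non-readable stream to S3 through a PassThrough (a sketch)

The PassThrough does not touch the data at all; it simply re-exposes whatever is written to it through the Readable interface the SDK expects.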

That's it for today, enjoy.