How to PUT Object on Scaleway S3 Object Storage


Today I'll present Scaleway's S3 Object Storage solution. I had the opportunity to try out the public Beta a few months ago, and I was really pleased with how things went. I was surprised by how easily I integrated it, even working with the aws-sdk npm package. Nevertheless, a few days ago I did not manage to make it work as expected with aws-sdk, so this is why we will look at how you can integrate it on your side, especially within the Node.js ecosystem.

Scaleway S3 Object Storage


  1. Create an account here:
  2. Log in here:
  3. Create a bucket
  4. Get credentials

Once everything is set up, you'll have to make a request to PUT data into your bucket. There are many ways to interact with the S3 Object Storage solution, including well-known open clients such as:

  • AWS Cli
  • s3cmd
  • s3fs

More information here:

You can also do this programmatically, and I wanted to do it with Node.js. As I said, while using the public Beta I was able to communicate directly with Scaleway S3 Object Storage; however, it seems the AWSv4 signature is now required, which makes things a little bit more complicated.

What you need to know is that the Scaleway S3 Object Storage solution is based on the S3 protocol but does not support all the functions currently offered by the AWS S3 service. So you should double-check that what you are trying to achieve matches what is supported. Nevertheless, they do support the vast majority of S3 features, so your needs are likely covered. You can check the current status of their API here:

Make a PUT Object request with Node.js

I used a config.json file to store my credentials; this can also be done through environment variables.

  "accessKeyId": "YOUR_ACCESS_KEY_ID",
  "secretAccessKey": "YOUR_SECRET_ACCESS_KEY,
  "region": "fr-par | nl-ams",
  "endpoint": ""
const aws4 = require('aws4')
const config = require('./config')
const fs = require('fs')
const https = require('https')
const pump = require('pump')

const putObject = async path => {
  // Sign the request with an AWSv4 signature
  const hash = aws4.sign(
    {
      service: 's3',
      region: config.region,
      method: 'PUT',
      path: path,
      host: config.endpoint,
      headers: {
        'Content-Type': 'application/octet-stream'
      }
    },
    {
      accessKeyId: config.accessKeyId,
      secretAccessKey: config.secretAccessKey
    }
  )
  // The returned request is a writable stream we can pipe into
  return https.request({
    hostname: config.endpoint,
    port: 443,
    method: 'PUT',
    path: path,
    headers: hash.headers
  })
}

const main = async (localPath, bucketPath, archiveName) => {
  const localFile = [localPath, archiveName].join('/')
  const bucketFile = [bucketPath, archiveName].join('/')
  const ws = await putObject(bucketFile)
  const rs = fs.createReadStream(localFile)
  pump(rs, ws, () => console.log('Object uploaded !'))
}

main('/home/iiaku/dev/backup', '', 'index.js')
  1. aws4 generates the S3 AWSv4 signature for you. You could implement the signing yourself by following: however, the aws4 dependency drastically simplifies this. See:
  2. config.json loads your configuration credentials
  3. fs creates a read stream from your local file through fs.createReadStream
  4. https makes an HTTPS request toward the API endpoint
  5. pump, just the cleanest and safest way to do a rs.pipe(ws). See here:

Just to let you know, I use this to make my backup storage run smoothly: I generate an SQL dump by calling a shell script; once the backup is done, I gather all the generated files into a tar.gz archive, load it with the putObject script, and send it to a sub-directory of my bucket designed for it.

That's it for today. I hope you enjoyed this look at how you can work with the Scaleway Object Storage solution; give it a try.