@Saad-ISAA
Created July 10, 2021 04:25
Strapi with custom AWS S3 uploader for Large Files ~50GB
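The gist contains only the controller; in a Strapi v3 project these handlers would typically be exposed through a routes config such as the following sketch (the paths are illustrative assumptions, not part of the gist — only the handler names come from the controller below; file would live at `aws/config/routes.json`):

```json
{
  "routes": [
    { "method": "POST", "path": "/aws/multipart-id", "handler": "aws.GetMultiPartId", "config": { "policies": [] } },
    { "method": "POST", "path": "/aws/parts-list", "handler": "aws.GetPartsList", "config": { "policies": [] } },
    { "method": "POST", "path": "/aws/upload-part", "handler": "aws.UploadParts", "config": { "policies": [] } },
    { "method": "POST", "path": "/aws/complete", "handler": "aws.CompleteMultiPart", "config": { "policies": [] } }
  ]
}
```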
// File: aws/controllers/aws.js
const {
  S3Client,
  CreateMultipartUploadCommand,
  UploadPartCommand,
  ListPartsCommand,
  CompleteMultipartUploadCommand,
} = require("@aws-sdk/client-s3");
const fs = require('fs');

const s3Client = new S3Client({ region: process.env.AWS_REGION });
module.exports = {
  // Fetches a unique UploadId from AWS S3 for an upload that will be sent in multiple parts.
  async GetMultiPartId(ctx) {
    try {
      // Key is the full path inside your bucket where the file will be uploaded,
      // including the file name and extension, e.g. /path/to/folder/file_name.zip.
      // Bucket can be fetched from an env variable, or, if each user has their own
      // S3 bucket, passed as an argument as done here.
      const bucketParams = { Key: ctx.request.body.file_name, Bucket: ctx.request.body.bucket_name };
      const data = await s3Client.send(
        new CreateMultipartUploadCommand(bucketParams)
      );
      console.log(data.UploadId); // UploadId must accompany every subsequent part-upload call
      ctx.send(data);
    } catch (err) {
      console.log("Error", err);
      ctx.throw(500); // respond with an error instead of leaving the request hanging
    }
  },
  // Lists all the parts S3 has received for a given UploadId. The resulting array of
  // parts is passed to CompleteMultiPart so S3 can assemble them into a single object.
  async GetPartsList(ctx) {
    const bucketParams = {
      Bucket: ctx.request.body.bucket_name,
      Key: ctx.request.body.file_name,
      UploadId: ctx.request.body.upload_id,
    };
    const data = await s3Client.send(new ListPartsCommand(bucketParams));
    if (data) {
      ctx.send(data);
    } else {
      console.log('Error');
      ctx.throw(400);
    }
  },
  async UploadParts(ctx) {
    const { request: { body } } = ctx;
    const files = ctx.request.files.file;
    function readFile(filePath) {
      return new Promise((resolve, reject) => {
        fs.readFile(filePath, (err, data) => {
          if (err) {
            return reject(err);
          }
          resolve(data);
        });
      });
    }
    const file_data = await readFile(files.path);
    const uploadParams = {
      Bucket: body.bucket_name,
      Key: body.folder_path + files.name,
      Metadata: {
        type: body.type,
        createdBy: body.createdBy,
      },
      Body: file_data,
      ACL: 'public-read-write',
      PartNumber: body.part_number, // the index of the chunk generated by the front-end
      UploadId: body.upload_id,
    };
    const data = await s3Client.send(new UploadPartCommand(uploadParams));
    // delete the temporary file from local storage to free up space
    fs.unlink(files.path, (err) => {
      if (err) {
        console.error(err);
      }
    });
    if (data) {
      ctx.send(data);
    } else {
      console.log('Error');
      ctx.throw(400);
    }
  },
  async CompleteMultiPart(ctx) {
    const completeParams = {
      Bucket: ctx.request.body.bucket_name,
      Key: ctx.request.body.file_name,
      MultipartUpload: {
        Parts: ctx.request.body.parts, // the data.Parts array fetched from GetPartsList
      },
      UploadId: ctx.request.body.upload_id,
    };
    const completed = await s3Client.send(
      new CompleteMultipartUploadCommand(completeParams) // the file can now be viewed in the S3 bucket
    );
    if (completed) {
      ctx.send(completed);
    } else {
      console.log('Error');
      ctx.throw(400);
    }
  },
};
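The controller expects the front-end to slice the file into numbered chunks before posting them to `UploadParts`. S3 requires every part except the last to be at least 5 MB, and part numbers start at 1. A minimal sketch of that chunk-planning logic (the function name and shape are illustrative, not part of the gist):

```javascript
// Compute the byte ranges for a multipart upload.
// S3's minimum part size is 5 MB for all parts except the last one.
const MIN_PART_SIZE = 5 * 1024 * 1024;

function planParts(fileSize, partSize = MIN_PART_SIZE) {
  if (partSize < MIN_PART_SIZE) throw new Error("part size below the S3 minimum of 5 MB");
  const parts = [];
  let partNumber = 1;
  for (let start = 0; start < fileSize; start += partSize) {
    parts.push({
      partNumber: partNumber++,              // S3 part numbers start at 1
      start,                                 // inclusive byte offset
      end: Math.min(start + partSize, fileSize), // exclusive end, matches Blob.slice()
    });
  }
  return parts;
}
```

In the browser, each entry would drive one request to the upload endpoint, e.g. `formData.append('file', file.slice(p.start, p.end))` together with `part_number` and the `upload_id` obtained from `GetMultiPartId`.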
@derrickmehaffy
For each of the ctx.send() calls you can actually swap those out for just a standard return.

In the case of errors you can instead use return ctx.badRequest('Some error message') (this uses the boom methods listed here: https://github.com/strapi/strapi/blob/df832fafd1705266d65bae22d8dd3a0fe37964cf/packages/strapi/lib/middlewares/boom/index.js#L12-L43).
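Applied to GetPartsList, that suggestion would look roughly like the following sketch (the error message is made up, and the S3 call is injected as a parameter purely so the shape can be exercised without AWS credentials — the real controller would call s3Client.send directly):

```javascript
// GetPartsList rewritten per the comment above: return the data directly
// (Strapi sends the returned value as the response body) and use
// ctx.badRequest() for the failure path instead of console.log + ctx.throw.
async function getPartsList(ctx, send) {
  const data = await send({
    Bucket: ctx.request.body.bucket_name,
    Key: ctx.request.body.file_name,
    UploadId: ctx.request.body.upload_id,
  });
  if (!data) {
    return ctx.badRequest("Could not list parts for this upload"); // hypothetical message
  }
  return data;
}
```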

I think the only thing missing here is adding the final file to the Strapi file model so it would be listed in the media library but you may not have done so intentionally.
