Add s3 to code render lambdas #316

Closed
opened 2021-05-11 11:11:01 +02:00 by Irev-Dev · 1 comment
Irev-Dev commented 2021-05-11 11:11:01 +02:00 (Migrated from github.com)

Trying to return large files from the lambdas is not a good idea for a couple of reasons, but chief among them is that API Gateway has a 10 MB limit. Instead, the aim is to put the generated 3D file into S3 and return a link to it. We should be able to expire these files quickly.

Besides overcoming the immediate 10 MB problem, pushing things to S3 fits better with future features where we want to keep files for longer, and it also gets around some base64 encoding issues I've had with binary media types and the lambdas.
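
To make this concrete, here's a rough, untested sketch of what the handler could do — `RENDERS_BUCKET` is a placeholder bucket name and `renderedFile` stands in for whatever Buffer/stream the render step produces. The pre-signed URL keeps the response small (just a link) and expires on its own; expiring the objects themselves would be a bucket lifecycle rule on top of this:

```javascript
const { S3Client, PutObjectCommand, GetObjectCommand } = require("@aws-sdk/client-s3");
const { getSignedUrl } = require("@aws-sdk/s3-request-presigner");

const s3 = new S3Client({ region: "us-east-1" });

const uploadAndLink = async (renderedFile, key) => {
  // Put the generated 3D file into the bucket.
  await s3.send(
    new PutObjectCommand({
      Bucket: "RENDERS_BUCKET", // placeholder bucket name
      Key: key, // e.g. `${renderId}.stl`
      Body: renderedFile, // Buffer or stream from the render step
    })
  );
  // Return a short-lived pre-signed GET link (15 minutes here) instead of
  // the file itself, so we never hit the 10 MB API Gateway response limit.
  return getSignedUrl(
    s3,
    new GetObjectCommand({ Bucket: "RENDERS_BUCKET", Key: key }),
    { expiresIn: 900 }
  );
};
```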

Irev-Dev commented 2021-05-11 11:13:47 +02:00 (Migrated from github.com)

I'm not loving the AWS docs, but this seems like a reasonable example of uploading a file read from disk, which is pretty much what we want to do.
https://docs.aws.amazon.com/sdk-for-javascript/v3/developer-guide/s3-example-creating-buckets.html

```javascript
// Import required AWS SDK clients and commands for Node.js.
const { S3Client, PutObjectCommand } = require("@aws-sdk/client-s3");
const path = require("path");
const fs = require("fs");

// Set the AWS Region.
const REGION = "REGION"; //e.g. "us-east-1"

const file = "OBJECT_PATH_AND_NAME"; // Path to and name of object. For example '../myFiles/index.js'.
const fileStream = fs.createReadStream(file);

// Set the parameters
const uploadParams = {
  Bucket: "BUCKET_NAME",
  // Add the required 'Key' parameter using the 'path' module.
  Key: path.basename(file),
  // Add the required 'Body' parameter
  Body: fileStream,
};

// Create an Amazon S3 service client object.
const s3 = new S3Client({ region: REGION });

// Upload file to specified bucket.
const run = async () => {
  try {
    const data = await s3.send(new PutObjectCommand(uploadParams));
    console.log("Success", data);
  } catch (err) {
    console.log("Error", err);
  }
};
run();
```

I'm not completely sure how credentials fit in with my current Serverless setup.
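
For what it's worth, my understanding is that inside a Lambda the SDK resolves credentials automatically from the function's execution role, so it should mostly be a matter of granting that role the right S3 permissions in `serverless.yml` — an untested sketch, with a placeholder bucket name:

```yaml
provider:
  name: aws
  iamRoleStatements:
    - Effect: Allow
      Action:
        - s3:PutObject
        - s3:GetObject
      Resource: "arn:aws:s3:::RENDERS_BUCKET/*" # placeholder bucket
```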


Reference: h3n3/cadhub#316