How to upload larger files (greater than 12 MB) to an AWS S3 bucket using Salesforce Apex

Parbati Bose

I need some help uploading large files to an S3 bucket from Salesforce Apex on the server side.

I need to be able to split a blob and upload it to an AWS S3 bucket using an HTTP PUT operation. I can do this for files up to 12 MB in a single upload, because that is the PUT request body size limit in Apex. So I need to upload using a multipart operation. I noticed S3 allows uploading in parts and returns an uploadId. I am wondering if anyone has done this before in Salesforce Apex code; any help would be greatly appreciated.

Thanks in advance, Parbati Bose.
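(For reference, splitting the payload is the easy half of the multipart flow. S3 requires every part except the last to be at least 5 MB. A minimal sketch in Java, since Apex is not runnable here; the class and method names are illustrative, not from any SDK.)

```java
import java.util.ArrayList;
import java.util.List;

public class PartSplitter {

    // S3 requires every part except the last to be at least 5 MB.
    static final int PART_SIZE = 5 * 1024 * 1024;

    // Split a payload into S3 part-sized slices; S3 part numbers start at 1,
    // so parts.get(i) would be uploaded as part number i + 1.
    static List<byte[]> splitIntoParts(byte[] payload, int partSize) {
        List<byte[]> parts = new ArrayList<>();
        for (int offset = 0; offset < payload.length; offset += partSize) {
            int len = Math.min(partSize, payload.length - offset);
            byte[] part = new byte[len];
            System.arraycopy(payload, offset, part, 0, len);
            parts.add(part);
        }
        return parts;
    }

    public static void main(String[] args) {
        byte[] payload = new byte[12 * 1024 * 1024]; // a pretend 12 MB file
        List<byte[]> parts = splitIntoParts(payload, PART_SIZE);
        System.out.println("parts: " + parts.size()); // 5 + 5 + 2 MB -> 3 parts
    }
}
```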

Here is the code:

public with sharing class AWSS3Service {

    private static Http http;

    @AuraEnabled
    public static void uploadToAWSS3(String fileToUpload, String filenm, String doctype) {

        fileToUpload = EncodingUtil.urlDecode(fileToUpload, 'UTF-8');

        // encode the file name in case it contains special characters
        filenm = EncodingUtil.urlEncode(filenm, 'UTF-8');
        String filename = 'Storage' + '/' + filenm;

        String formattedDateString = DateTime.now().formatGMT('EEE, dd MMM yyyy HH:mm:ss z');

        // s3 bucket!
        String key = '**********';
        String secret = '********';
        String bucketname = 'testbucket';
        String region = 's3-us-west-2';

        String host = region + '.' + 'amazonaws.com'; // AWS server base URL

        try {
            HttpRequest req = new HttpRequest();
            http = new Http();
            req.setMethod('PUT');
            req.setEndpoint('https://' + bucketname + '.' + host + '/' + filename);
            req.setHeader('Host', bucketname + '.' + host);
            req.setHeader('Content-Type', doctype);
            req.setHeader('Connection', 'keep-alive');
            req.setHeader('Date', formattedDateString);
            // the S3 ACL header is x-amz-acl, and any x-amz-* header
            // must also be included in the string to sign
            req.setHeader('x-amz-acl', 'public-read-write');

            String stringToSign = 'PUT\n\n' +
                    doctype + '\n' +
                    formattedDateString + '\n' +
                    'x-amz-acl:public-read-write\n' +
                    '/' + bucketname + '/' + filename;

            Blob mac = Crypto.generateMac('HMACSHA1', Blob.valueOf(stringToSign), Blob.valueOf(secret));
            String signed = EncodingUtil.base64Encode(mac);
            String authHeader = 'AWS' + ' ' + key + ':' + signed;
            req.setHeader('Authorization', authHeader);

            req.setBodyAsBlob(EncodingUtil.base64Decode(fileToUpload));

            HttpResponse response = http.send(req);

            System.debug('response from aws s3 is ' + response.getStatusCode() + ' and ' + response.getBody());

        } catch (Exception e) {
            System.debug('error in connecting to s3 ' + e.getMessage());
            throw e;
        }
    }
}
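The Authorization header built above is AWS Signature Version 2: base64 of an HMAC-SHA1 over the string to sign, keyed with the secret key. The same computation can be reproduced outside Apex to sanity-check a rejected signature. A minimal Java sketch, equivalent to `Crypto.generateMac('HMACSHA1', ...)` plus `EncodingUtil.base64Encode`; the bucket, key, and date values in `main` are placeholders.

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class SigV2Signer {

    // Base64(HMAC-SHA1(secret, stringToSign)) -- the value that follows
    // "AWS <accessKey>:" in the Authorization header.
    static String sign(String stringToSign, String secret) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(new SecretKeySpec(secret.getBytes(StandardCharsets.UTF_8), "HmacSHA1"));
        byte[] raw = mac.doFinal(stringToSign.getBytes(StandardCharsets.UTF_8));
        return Base64.getEncoder().encodeToString(raw);
    }

    public static void main(String[] args) throws Exception {
        // placeholder string to sign, mirroring the Apex code's layout
        String stringToSign = "PUT\n\napplication/pdf\n"
                + "Thu, 01 Jan 2020 00:00:00 GMT\n"
                + "x-amz-acl:public-read-write\n"
                + "/testbucket/Storage/report.pdf";
        System.out.println("AWS <accessKey>:" + sign(stringToSign, "<secretKey>"));
    }
}
```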

Juned Ahsan

The AWS SDK for Java exposes a high-level API, called TransferManager, that simplifies multipart uploads (see Uploading Objects Using Multipart Upload API). You can upload data from a file or a stream. You can also set advanced options, such as the part size you want to use for the multipart upload, or the number of concurrent threads you want to use when uploading the parts. You can also set optional object properties, the storage class, or the ACL. You use the PutObjectRequest and the TransferManagerConfiguration classes to set these advanced options.

Here is the sample code from https://docs.aws.amazon.com/AmazonS3/latest/dev/HLuploadFileJava.html.

You can adapt it to your Salesforce Apex code:

import com.amazonaws.AmazonServiceException;
import com.amazonaws.SdkClientException;
import com.amazonaws.auth.profile.ProfileCredentialsProvider;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.TransferManagerBuilder;
import com.amazonaws.services.s3.transfer.Upload;

import java.io.File;

public class HighLevelMultipartUpload {

    public static void main(String[] args) throws Exception {
        Regions clientRegion = Regions.DEFAULT_REGION;
        String bucketName = "*** Bucket name ***";
        String keyName = "*** Object key ***";
        String filePath = "*** Path for file to upload ***";

        try {
            AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
                    .withRegion(clientRegion)
                    .withCredentials(new ProfileCredentialsProvider())
                    .build();
            TransferManager tm = TransferManagerBuilder.standard()
                    .withS3Client(s3Client)
                    .build();

            // TransferManager processes all transfers asynchronously,
            // so this call returns immediately.
            Upload upload = tm.upload(bucketName, keyName, new File(filePath));
            System.out.println("Object upload started");

            // Optionally, wait for the upload to finish before continuing.
            upload.waitForCompletion();
            System.out.println("Object upload complete");
        } catch (AmazonServiceException e) {
            // The call was transmitted successfully, but Amazon S3 couldn't process 
            // it, so it returned an error response.
            e.printStackTrace();
        } catch (SdkClientException e) {
            // Amazon S3 couldn't be contacted for a response, or the client
            // couldn't parse the response from Amazon S3.
            e.printStackTrace();
        }
    }
}
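Since Apex cannot run the Java SDK, the questioner would have to call the multipart REST API directly: a POST to the object key with `?uploads` to initiate (which returns the UploadId), a PUT per part with `?partNumber=N&uploadId=...` (each response carries an ETag header), and a final POST with `?uploadId=...` whose body lists every part and its ETag. A sketch of building that final XML body; the class name and the ETag values are illustrative.

```java
public class CompleteUploadBody {

    // Build the CompleteMultipartUpload XML posted to complete the upload;
    // etags[i] is the ETag returned by the PUT of part number i + 1.
    static String build(String[] etags) {
        StringBuilder xml = new StringBuilder("<CompleteMultipartUpload>");
        for (int i = 0; i < etags.length; i++) {
            xml.append("<Part>")
               .append("<PartNumber>").append(i + 1).append("</PartNumber>")
               .append("<ETag>").append(etags[i]).append("</ETag>")
               .append("</Part>");
        }
        return xml.append("</CompleteMultipartUpload>").toString();
    }

    public static void main(String[] args) {
        // placeholder ETags; the real values come from each part's PUT response
        System.out.println(build(new String[]{"\"etag-part-1\"", "\"etag-part-2\""}));
    }
}
```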
