There are two ways to move data between Salesforce and Amazon S3 using Amazon AppFlow. To view the first option, click here. In this blog I will walk through the alternative approach: streaming Salesforce record changes into Amazon S3.
Step 1
Go to your Salesforce sandbox, navigate to Setup -> Change Data Capture, and add the object whose changes you want to send to Amazon S3. I have added Account here. The goal is that whenever an Account record changes, that change information is passed to Amazon S3.
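Each change captured this way is delivered as a JSON event whose ChangeEventHeader describes which object changed and how. The snippet below is only a hypothetical sketch of that shape (all values are made up, and real events carry more header fields), shown here so you know roughly what to expect downstream:

```python
import json

# Hypothetical Account change event (structure loosely follows the
# Salesforce Change Data Capture format; values are invented).
sample_event = """
{
  "ChangeEventHeader": {
    "entityName": "Account",
    "changeType": "UPDATE",
    "changedFields": ["Name", "Phone"],
    "recordIds": ["001xx000003DGb2AAG"]
  },
  "Name": "Acme Corp",
  "Phone": "555-0100"
}
"""

event = json.loads(sample_event)
header = event["ChangeEventHeader"]
print(header["entityName"], header["changeType"])  # which object changed, and how
print(event["Name"])                               # the new field value
```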

Step 2 – Create Flow in Amazon Appflow
Go to the AWS Console and search for Amazon AppFlow. Click Create flow, provide a name and description for the flow, and then click Next.
My AWS Console link. This link will vary depending on the AWS Region.

Step 3
Select “Salesforce” as the source and select your connection. Then select Salesforce events and choose the event type. Here I chose Account Change Event because I want to capture Account changes from Salesforce. The destination is Amazon S3; in the bucket details, choose the S3 bucket where you want to store the data. For this blog, I created a bucket called salesforceaccountswag. The flow trigger should always be Run flow on event.
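The same source/destination setup can also be expressed through the AppFlow API. The sketch below builds the configuration for boto3's `create_flow` call; the connector profile name `my-salesforce-connection` is a placeholder (use whatever your Salesforce connection is named), the bucket is the one from this blog, and the real API call is left commented out because it needs AWS credentials:

```python
# Sketch of the Step 3 configuration as boto3 create_flow parameters.
# Assumes a Salesforce connector profile "my-salesforce-connection"
# already exists in your account (placeholder name).
flow_params = {
    "flowName": "salesforce-account-to-s3",
    "triggerConfig": {"triggerType": "Event"},  # "Run flow on event"
    "sourceFlowConfig": {
        "connectorType": "Salesforce",
        "connectorProfileName": "my-salesforce-connection",
        "sourceConnectorProperties": {
            "Salesforce": {"object": "AccountChangeEvent"}
        },
    },
    "destinationFlowConfigList": [
        {
            "connectorType": "S3",
            "destinationConnectorProperties": {
                "S3": {"bucketName": "salesforceaccountswag"}
            },
        }
    ],
}

# import boto3
# appflow = boto3.client("appflow")
# appflow.create_flow(**flow_params, tasks=[...])  # tasks = field mappings (Step 4)
```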

Step 4 – Map Salesforce and Amazon S3
Next, in the map data fields step of Amazon AppFlow, map the fields that you want to send from Salesforce to S3.
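In the AppFlow API, this field mapping corresponds to the `tasks` list passed to `create_flow`. A minimal sketch, assuming we pass two fields straight through unchanged (the field names `Name` and `Phone` are chosen purely for illustration):

```python
# Sketch of AppFlow "tasks" that map Salesforce fields one-to-one
# into the S3 output. Field names here are illustrative.
def map_field(field_name):
    """Build a pass-through Map task for one source field."""
    return {
        "sourceFields": [field_name],
        "taskType": "Map",
        "connectorOperator": {"Salesforce": "NO_OP"},  # no transformation
    }

tasks = [map_field(f) for f in ["Name", "Phone"]]
print(len(tasks))  # one Map task per mapped field
```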

Step 5 – Activate Amazon Appflow
Activate the flow. Then go back to your Salesforce instance and change any Account record. Return to the Run history tab of your Amazon AppFlow flow, and you will see that the data has been moved from Salesforce to S3.

Step 6 – Check data in Amazon S3
Go to your Amazon S3 bucket and you will find the data inside the flow's folder. This is how you can integrate Salesforce with Amazon S3.
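If you prefer to verify the output programmatically rather than in the console, you can download an output object and parse it. A minimal sketch, assuming the output contains one JSON record per line (the sample line below is made up, and the boto3 fetch is commented out since it requires AWS credentials and a real object key):

```python
import json

# Post-processing one line of an AppFlow output file downloaded from S3.
# To fetch for real:
#   import boto3
#   body = boto3.client("s3").get_object(
#       Bucket="salesforceaccountswag", Key="<object key>")["Body"].read()

# Hypothetical sample line standing in for the downloaded body.
sample_line = (
    '{"ChangeEventHeader": {"entityName": "Account", '
    '"changeType": "UPDATE"}, "Name": "Acme Corp"}'
)

record = json.loads(sample_line)
print(record["ChangeEventHeader"]["entityName"], "->", record["Name"])
```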
