Amazon DynamoDB to Salesforce Data Cloud integration in Mule 4

Mule 4 app created to scan items from Amazon DynamoDB and insert them in Salesforce Data Cloud.

Resources

If you need additional resources for your configuration, check out the following links:

MuleSoft

Salesforce

AWS

Run this app locally

Besides making sure your AWS and Salesforce configuration is ready, here is how to set up the Mule project to run it locally.

  1. Download or clone this repo to your local computer
  2. Open it in Anypoint Studio or Anypoint Code Builder (preferably ACB)
  3. Add your values to the config.yaml file (a sample layout for both config files is shown after this list)
    • cdp.api.name - the Source API Name from your Ingestion API in Salesforce
    • cdp.object.name - the Object Name from your Ingestion API / Data Stream in Salesforce
    • dynamodb.table.name - the name of the table to scan items from in DynamoDB
  4. Add your encrypted values to the secure.config.yaml file
    • cdp.consumer.key - the Consumer Key from your Connected App in Salesforce
    • cdp.consumer.secret - the Consumer Secret from your Connected App in Salesforce
    • salesforce.username - the username to log in to your Salesforce account
    • salesforce.password - the password to log in to your Salesforce account
    • dynamodb.accessKey - the Access Key for your AWS account
    • dynamodb.secretKey - the Secret Key for your AWS account
  5. Modify the DataWeave code in dynamodb-response.dwl to match your DynamoDB output and Data Cloud input (see the DataWeave sketch after this list)
    • The code in the file is an example for my specific use case. Use the two functions (removeDynamodbKeys & flattenObject) to transform the data, but make sure you map the keys/values to your specific structure
    • If you're using ACB, you can use the DataWeave extension for VS Code to see the preview of your script before deploying
  6. Make sure to pass the encryption.key property to the runtime (see resources for more info)
  7. Run the application locally
  8. Send a request to localhost:8081/sync to trigger the flow (see the example request after this list)
  9. You should receive a 200 OK response
  10. Wait 2-5 minutes for the inserted records to appear in Data Cloud
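
For reference, here is a minimal sketch of how the two property files could look. The nested key layout and the placeholder values are assumptions, so match the structure that already exists in the repo's config.yaml and secure.config.yaml. Encrypted values in the secure file are wrapped in `![...]`, the syntax used by the Mule Secure Configuration Properties module.

```yaml
# config.yaml (non-sensitive properties) - placeholder values, assumed nested layout
cdp:
  api:
    name: "My_Ingestion_API"      # Source API Name from your Ingestion API
  object:
    name: "my_object"             # Object Name from your Ingestion API / Data Stream
dynamodb:
  table:
    name: "my-dynamodb-table"     # DynamoDB table to scan
```

```yaml
# secure.config.yaml (sensitive properties) - each value is your encrypted string wrapped in ![...]
cdp:
  consumer:
    key: "![encryptedConsumerKey]"
    secret: "![encryptedConsumerSecret]"
salesforce:
  username: "![encryptedUsername]"
  password: "![encryptedPassword]"
dynamodb:
  accessKey: "![encryptedAccessKey]"
  secretKey: "![encryptedSecretKey]"
```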
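As a rough illustration of what the transformation in dynamodb-response.dwl does, the sketch below flattens DynamoDB's typed attributes into plain key/value pairs. The function bodies are illustrative approximations, not the repo's actual implementations, and the script assumes the payload is already the array of scanned items (adjust the selector if the Scan operation wraps them, e.g. under an items key).

```dataweave
%dw 2.0
output application/json

// Illustrative approximations of the helpers in dynamodb-response.dwl;
// the real implementations and field mappings in the repo may differ.

// Strip the DynamoDB type descriptors (S, N, BOOL, ...), e.g.
// { "name": { "S": "Max" } } becomes { "name": "Max" }.
fun removeDynamodbKeys(item: Object) =
    item mapObject ((value, key) ->
        {(key): ((value as Object) pluck ((v) -> v))[0]}
    )

// Flatten one level of nesting into dotted keys, e.g.
// { "address": { "city": "X" } } becomes { "address.city": "X" }.
fun flattenObject(item: Object) =
    item mapObject ((value, key) ->
        if (value is Object)
            ((value as Object) mapObject ((v, k) -> {("$(key).$(k)"): v}))
        else
            {(key): value}
    )
---
// Map the scanned items into the structure expected by your Data Cloud object
payload map ((item) -> flattenObject(removeDynamodbKeys(item as Object)))
```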
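Once the encryption key is in place (for example, by adding -Dencryption.key=<your_key> to the VM arguments of your run configuration) and the app is running, you can trigger the flow from the command line. The call below assumes the flow accepts a plain GET on the /sync path; adjust the method if your listener expects something else.

```sh
# Trigger the DynamoDB -> Data Cloud sync (assumes the default HTTP listener port 8081)
curl -i http://localhost:8081/sync

# Expect an HTTP 200 response; the inserted records should show up
# in Data Cloud within roughly 2-5 minutes.
```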

If you experience issues, make sure your credentials are correct and your Salesforce/AWS settings are properly configured.

You can also open an issue in this repo in case I missed something in the code.

Limitations

This is intended to be a simple POC. The app only scans the given DynamoDB table and performs a streaming insert into Data Cloud every time you send a request to the /sync path.

This app does not perform deletions or queries in Data Cloud. Check out this GitHub repo for delete/query operations.

Since the Streaming - Insert operation is used for Data Cloud, there is a maximum of 200 records per request, set by Data Cloud. If you intend to send more than 200 records, you can add a Size-Based Aggregator to the app or switch the operation from Streaming to Batch.