
Run and Monitor Data Processing Engine Definition

Activate and Run a Definition

In the previous step, you configured the Data Processing Engine definition. Now it’s time to activate and run it. When you run a definition, data is synced from the data sources and each node is executed as defined in the definition workflow.

  1. From Setup, in the Quick Find Box, enter Data Processing Engine, and then select Data Processing Engine.
  2. Click Get Order Aggregate.
  3. On the Get Order Aggregate definition builder canvas, click Activate.
  4. Click Run Definition.
  5. On the Run definition? window, click Next twice.
  6. Click Run Definition.

Monitor the Definition Run Status

You can monitor the progress of your definition run on the Monitor Workflow Services page and check whether the run is completed, canceled, failed, or in progress. You can also see the status of each run task. Here’s how to monitor your definition’s run status.

  1. From Setup, in the Quick Find box, enter Monitor Workflow Services, and then select Monitor Workflow Services. Review the current run status of your definition.

The All Workflow Services page showing the Data Processing Engine definition run status.

  2. Click Get Order Aggregate.
  3. Click the Tasks tab. Notice the run status of each task.

The Get Order Aggregate monitor workflow page showing the list of tasks and their run statuses.

  4. After the run completes, the status changes to Completed. Refresh the list or the page to see the latest status.

The All Workflow Services page showing the Data Processing Engine definition run status.

Your results are now written back to the custom object fields that you defined in the definition.

Verify the Writeback Results

Now it’s time to verify the data written back to your custom object fields.

  1. Click App Launcher, then find and select Order Aggregates.
  2. Click the list view menu and select All.

The Order Aggregates list view with the All option highlighted.

Congratulations! Your transformed data is ready.

The Order Aggregates object fields and their respective data.

You can now see the four Cloud Kicks sneaker products along with the total quantity ordered and the total revenue generated for each product.
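If you’d rather spot-check the writeback from a script instead of the list view, you can query the custom object directly. The following is a minimal sketch using the simple-salesforce Python library; the object and field API names (Order_Aggregate__c, Product_Name__c, Total_Quantity__c, Total_Revenue__c) are hypothetical stand-ins for whatever API names you gave the custom object and its fields, so substitute your own.

# A minimal verification sketch, assuming the simple-salesforce library and
# placeholder credentials. The object and field API names below are hypothetical
# stand-ins for this project's custom object; adjust them to match your org.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="admin@cloudkicks.example",   # placeholder credentials
    password="your-password",
    security_token="your-security-token",
)

# Pull every aggregate row that the definition wrote back.
results = sf.query(
    "SELECT Product_Name__c, Total_Quantity__c, Total_Revenue__c "
    "FROM Order_Aggregate__c"
)

for record in results["records"]:
    print(
        f"{record['Product_Name__c']}: "
        f"quantity={record['Total_Quantity__c']}, revenue={record['Total_Revenue__c']}"
    )

If everything ran correctly, the output lists the four sneaker products with the same totals you see in the list view.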

Run a Definition Using Flow

You can also run a Data Processing Engine definition by automating the run process using a Salesforce flow. Here’s how to do it.

Create a Salesforce Flow

First, create a Salesforce flow.

  1. From Setup, in the Quick Find box, enter Flows, and then select Flows.
  2. Click New Flow.
  3. On the New Automation window, under Categories, select Screen.
  4. From Types, select Screen Flow.

Your flow is now created and ready to be configured.

Call the Data Processing Engine Definition

Next, use an action element to bring the Get Order Aggregate definition into the flow you just created.

  1. Click Add Element to add an element.
  2. Select the Action element.

The Add Element window showing the list of flow elements with the Action element highlighted.

  3. In the Search Actions pane, find and select Get Order Aggregate-DataProcessingEngine.

The Search Actions pane showing the Get Order Aggregate-DataProcessingEngine action.

  4. For Label, enter Run Order Aggregate Definition.
  5. API Name: This field auto-populates with Run_Order_Aggregate_Definition.
  6. Save your changes.
  7. On the Save the flow window, specify these details.
    • For Flow Label, enter Order Aggregate Flow.
    • Flow API Name: This field auto-populates with Order_Aggregate_Flow.
    • Save your changes.
  8. Click Activate.
  9. Click Run to run the flow and the definition.

The definition automatically runs and executes each node. After the definition run is complete, the results are written back to the target entities.
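Screen flows like this one are launched from the UI, but if you rebuild the same logic as an autolaunched flow, you can also trigger it programmatically through Salesforce’s REST Invocable Actions endpoint. Here’s a minimal sketch of that variation; it assumes an autolaunched flow with the API name Order_Aggregate_Flow, a My Domain URL, an already-obtained OAuth access token, and the Python requests library, none of which are part of this project’s steps.

# A minimal sketch for invoking an autolaunched flow over REST. This assumes an
# autolaunched (not screen) flow named Order_Aggregate_Flow, plus a placeholder
# org URL and OAuth access token; screen flows, as built in this unit, run from the UI.
import requests

INSTANCE_URL = "https://yourdomain.my.salesforce.com"   # placeholder My Domain URL
ACCESS_TOKEN = "<your-oauth-access-token>"              # placeholder token

response = requests.post(
    f"{INSTANCE_URL}/services/data/v59.0/actions/custom/flow/Order_Aggregate_Flow",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    },
    json={"inputs": [{}]},   # this flow takes no input variables
)
response.raise_for_status()
print(response.json())   # each entry reports whether the flow interview succeeded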

The Order Aggregates object fields and their respective data.

The new results appear after the existing ones because the action type in the writeback object node defaults to Insert. With the Insert action type, every definition run appends its results to the target object as new records.
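Because of this, repeated runs keep adding rows to Order Aggregates. If you only want the latest run’s numbers, one option is to clear the existing rows before rerunning the definition (or, if your writeback node offers an update or upsert action type, to use that instead). Here’s a minimal cleanup sketch, again assuming the simple-salesforce library and the hypothetical Order_Aggregate__c API name from the earlier example.

# A minimal cleanup sketch: delete previously inserted aggregate rows so the next
# definition run leaves only the latest results. Assumes simple-salesforce and the
# hypothetical Order_Aggregate__c object API name used in the earlier sketch.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="admin@cloudkicks.example",   # placeholder credentials
    password="your-password",
    security_token="your-security-token",
)

old_rows = sf.query_all("SELECT Id FROM Order_Aggregate__c")["records"]
if old_rows:
    # Bulk-delete the stale rows; each entry only needs the record Id.
    sf.bulk.Order_Aggregate__c.delete([{"Id": row["Id"]} for row in old_rows])
    print(f"Deleted {len(old_rows)} previous aggregate records.")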

Run a Definition with Scheduled Flow

You can also create a scheduled flow to run the definition automatically at a specific date and time. Here’s how to set up that process.

Create a Scheduled Flow

First, create a scheduled flow.

  1. From Setup, in the Quick Find box, enter Flows, and then select Flows.
  2. Click New Flow.
  3. On the New Automation window, under Categories, select Scheduled.
  4. From Types, select Schedule-Triggered Flow.
  5. In the Set a Schedule pane, add these details.
    • For Start Date, select the current date or a future date.
    • For Start Time, select a time slot.

The Set a Schedule section in a flow showing the Start Date, Start Time, and Frequency fields.

You have successfully created a schedule-triggered flow, and it’s now ready to be configured.

Call the Data Processing Engine Definition

Use an action element to bring the Get Order Aggregate definition into the flow you created in the previous section.

  1. Click Add Element to add an element.
  2. Select the Action element.
  3. In the Search Actions pane, find and select Get Order Aggregate-DataProcessingEngine.
  4. For Label, enter Run Scheduled Definition.
  5. API Name: This field auto-populates with Run_Scheduled_Definition.
  6. Save your changes.
  7. On the Save the flow window, for Flow Label, enter Order Aggregate Scheduled Flow.
  8. Flow API Name: This field auto-populates with Order_Aggregate_Scheduled_Flow.
  9. Save your work.
  10. Click Activate.

The flow executes automatically at the scheduled date and time and runs the definition.

Wrap Up

Thanks to Data Processing Engine, the large volumes of unmanaged customer data at Cloud Kicks are no longer a black hole of complex numbers and figures. The sneaker company can now derive useful insights, such as the total quantity of sneakers ordered and the total revenue generated for each sneaker. These insights are vital for improving customer service and growing the business.

Congratulations on completing this project. You now know how to create a Data Processing Engine definition and add data sources. You also learned how to configure the nodes to transform data and write back results. Additionally, you explored how to run and monitor the definition and how to invoke and execute a Data Processing Engine definition from Salesforce Flow. With this invaluable knowledge in your admin arsenal, you’re ready to use Data Processing Engine in your business processes.
