How to call a DataFlow asynchronously

PEGA 8.8.3

I need a Declare OnChange call to be triggered when the property of the case is filled, which will call an activity, which in turn will call a DataFlow.

Property > Declare OnChange > Activity(DataFlow-Execute) > DataFlow

In my tests, I noticed that the case waits for all DataFlow processing to complete before releasing the screen to the user. This is bad: the user is forced to wait for processing that should instead run asynchronously in the background.

As an alternative, I ran the following test: I created a new activity, so that Activity 1 of the Declare OnChange executes Activity 2 via Call-Async-Activity, and Activity 2 then executes the DataFlow. Apparently, Activity 2 ran asynchronously and released the case to the user while the DataFlow processing continued in the background on another thread.

Property > Declare OnChange > Activity 1(Call-Async-Activity) > Activity 2(DataFlow-Execute) > DataFlow
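Conceptually, the Call-Async-Activity pattern above is fire-and-forget: the caller hands work to a background thread and returns immediately. As a minimal, generic Java sketch (plain JDK concurrency, not Pega API; `runDataFlow` and the case ID are illustrative stand-ins):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Conceptual analogy only (not Pega API): "Activity 1" submits
// "Activity 2" to a background thread and returns without waiting,
// the way Call-Async-Activity releases the requestor thread.
public class AsyncCallSketch {
    static final ExecutorService background = Executors.newSingleThreadExecutor();

    // Stand-in for Activity 2 running DataFlow-Execute.
    static String runDataFlow(String caseId) {
        return "processed:" + caseId;
    }

    // Stand-in for Activity 1: submit the work and return immediately.
    static Future<String> callAsync(String caseId) {
        return background.submit(() -> runDataFlow(caseId));
    }

    public static void main(String[] args) throws Exception {
        Future<String> f = callAsync("C-123");
        // The calling thread is free right away; the result completes later.
        System.out.println(f.get()); // prints processed:C-123
        background.shutdown();
    }
}
```

The trade-off this sketch makes visible: the background work lives only in memory on that node, which is why fire-and-forget alone is less robust than a persisted queue.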

However, the Pega documentation recommends changing the activity type to "Asynchronous." When I do this, I get an error on save: Activity of type 'ASYNCHRONOUS' cannot use method 'DataFlow-Execute'. So I kept the activity type as "Activity."

I need to know if I can use Call-Async-Activity to call an activity with Activity type = Activity, or if you recommend another solution to call the Dataflow asynchronously so that its processing doesn’t impact the user’s screen.

@FranciscoH6181

Use a Queue Processor, not Call-Async-Activity.

Create a lightweight "producer" activity, called by your Declare OnChange, that uses Queue-For-Processing to enqueue the case ID and any inputs. This returns to the user immediately (after a commit), so the screen won't hang. Then configure a dedicated Queue Processor whose "action" activity (the "consumer") opens the case and runs your DataFlow via DataFlow-Execute (or pxStart/pxResume Data Flow) on a background node.

This pattern is resilient (it survives node restarts), retries on failure, and centralizes throttling. Make sure the queue item is committed (use CommitWithErrorHandling if needed) and that the Queue Processor's access group can open the case and run the data flow. Optionally add a dedupe key (the case ID) so you don't enqueue duplicates. You can monitor progress in Admin Studio under Queue processors and Data flows.

In short: OnChange → queue item → background Queue Processor runs the DataFlow asynchronously.
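The producer/consumer shape described above can be sketched generically in plain Java concurrency (this is an analogy, not Pega API; the queue, the `enriched:` prefix, and the method names are all illustrative):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.LinkedBlockingQueue;

// Conceptual analogy only (not Pega API): the OnChange "producer" enqueues a
// small work item and returns; a background "queue processor" thread drains
// the queue and runs the stand-in data flow.
public class QueueProcessorSketch {
    static final BlockingQueue<String> queue = new LinkedBlockingQueue<>();
    static final CopyOnWriteArrayList<String> processed = new CopyOnWriteArrayList<>();

    // Producer: like Queue-For-Processing, enqueue and return immediately.
    static void enqueue(String caseId) {
        queue.offer(caseId);
    }

    // Consumer: background worker, like the Queue Processor's action activity.
    static Thread startProcessor() {
        Thread t = new Thread(() -> {
            try {
                while (true) {
                    String caseId = queue.take();        // blocks until work arrives
                    processed.add("enriched:" + caseId); // stand-in for DataFlow-Execute
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        t.setDaemon(true);
        t.start();
        return t;
    }

    public static void main(String[] args) throws Exception {
        startProcessor();
        enqueue("C-7");
        // The producer is free immediately; we wait here only to show the result.
        long deadline = System.currentTimeMillis() + 2000;
        while (processed.isEmpty() && System.currentTimeMillis() < deadline) {
            Thread.sleep(10);
        }
        System.out.println(processed);
    }
}
```

Unlike the in-memory queue in this sketch, a real Queue Processor persists its items, which is what gives you the retry and restart resilience mentioned above.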

@Sairohith

Thanks for the reply, this is the other solution we’re considering.

In our solution, we need to enrich the data and then send it to another Kafka, so we were considering not queuing it, but I also agree that using a Queue Processor is the best solution.