Kafka Data Set – Support for Multiple Topics in a Single Data Set (Pega 8.8.4)

We are using Pega Platform 8.8.4 and are exploring Kafka streams for data loading.

Our requirement is to consume events from multiple Kafka topics that belong to the same Kafka cluster and are processed using the same integration logic.

Could you please help clarify:

  1. What is the Pega-supported approach to consume data from multiple Kafka topics using Kafka Data Set rules?

  2. Is there a recommended way to configure multiple topics in a single Data Set (for example, comma-separated topic names, topic patterns, or any other supported mechanism)?

  3. If multiple topics are not intended to be handled within a single Kafka Data Set, what is the recommended design pattern in Pega to achieve this requirement?

Any documentation reference or guidance specific to Pega 8.8.4 would be very helpful.

@Kaaviya S In Pega 8.8.4, a Kafka Data Set maps to a single Kafka topic; comma-separated topic lists and topic patterns are not supported.
To consume events from multiple topics in the same cluster, create one Kafka Data Set rule per topic.
Point all of those Data Sets to the same Kafka configuration instance so they share the same cluster connection and security setup.
Then create one real-time Data Flow run per topic, using the matching Kafka Data Set as the source.
Inside each Data Flow, call the same shared data transform to map and process the message so the integration logic stays identical across topics.
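For reference, the layout those steps describe can be sketched roughly as follows (all rule and topic names below are hypothetical placeholders, not Pega-defined names):

```text
Kafka configuration instance: MainKafkaCluster      <- shared brokers + security settings
├── Kafka Data Set: OrdersEventsDS     -> topic "orders.events"
│     Real-time Data Flow run: ProcessOrdersDF     -> calls MapKafkaEventDT
├── Kafka Data Set: PaymentsEventsDS   -> topic "payments.events"
│     Real-time Data Flow run: ProcessPaymentsDF   -> calls MapKafkaEventDT
└── Data Transform: MapKafkaEventDT    <- single shared mapping/processing logic
```

Because every Data Flow invokes the same data transform, any change to the integration logic is made once and applies to all topics.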