Preparations and configurations to share Elasticsearch and Kafka between Pega environments

Hello,

We want to share Elasticsearch and Kafka between multiple Pega environments.

We have already spoken with people from Pega about this topic, and they say it is possible but requires some configuration.
Which configurations are needed in Pega and in the Helm charts for the deployment?

Thank you

Regards

Julian

@JulianB70

To share Elasticsearch and Kafka between multiple Pega environments, you need to configure both Pega and the Helm charts.

In Pega, verify that the Stream nodes report a status of 'NORMAL' with no additional details. This indicates that the Stream functionality is working as expected against your Kafka service.

In the Helm charts, configure the 'stream' section parameters to point at the external Kafka service and set the 'stream.enabled' parameter to 'true'. For Elasticsearch, configure the search (and logging) services in the 'addons' Helm chart provided by Pega.
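As a rough sketch, the values.yaml changes might look like the fragment below. The key names follow the public pega-helm-charts layout, but the exact keys, endpoints, and defaults here are assumptions and vary by chart version, so verify them against the values.yaml of your chart release before deploying:

```yaml
# Hypothetical values.yaml fragment for the 'pega' Helm chart.
# Key names and endpoints are illustrative; confirm them against
# the values.yaml of your pega-helm-charts version.
stream:
  enabled: true                                 # use an externalized Kafka service
  bootstrapServer: "kafka.shared.example:9092"  # assumption: your shared brokers

pegasearch:
  # Assumption: pointing at a shared Search and Reporting Service /
  # Elasticsearch endpoint instead of the embedded search nodes.
  externalSearchService: true
  externalURL: "http://srs.shared.example"
```

When several environments share one Kafka cluster, also plan how each environment's topics are kept apart (for example, a per-environment prefix in the stream/topic naming), otherwise environments will consume each other's messages.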

Please note that sharing services such as Elasticsearch and Kafka across multiple environments can affect performance and data isolation. For better control and management, separate instances per environment are recommended.

:warning: This is a GenAI-powered tool. All generated answers require validation against the provided references.

Meeting requirements and prerequisites > Choosing your preferred Kubernetes-based services for your deployment

Configuring a containerized deployment to use a Kafka service > Configuring H