@BogdanD16582931 I can see that you logged support ticket INC-270302 for this.
The investigation showed that you were creating a Data-Admin-Kafka rule that used SASL authentication, but while testing the connectivity a generic error message appeared, suggesting either a timeout or a failure while collecting the topic metadata. This indicated that either authentication was failing or that some parameters expected by the external brokers were not being passed in the client properties on the Pega side.
Pega GCS provided a standalone Kafka application with the maximum level of debugging enabled on all Apache Kafka classes. Using this approach we identified that the connection was being established but failed at the very first stage of the SASL authentication exchange (SEND_APIVERSIONS_REQUEST), after which it was immediately disconnected.
The issue was resolved by adding the Kafka client properties expected by the external Kafka cluster (SSL encryption with a truststore certificate) and changing the security protocol from SASL_PLAINTEXT to SASL_SSL.
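For illustration, a client properties file for this kind of setup typically looks like the sketch below. The broker address, truststore path, passwords, and credentials are placeholders, and the SASL mechanism (PLAIN here) is an assumption; your brokers may require SCRAM or another mechanism, so replace these values with what your external Kafka expects:

```properties
# Connect to the external brokers over SASL with TLS encryption
bootstrap.servers=broker1.example.com:9093
security.protocol=SASL_SSL
# Assumed mechanism; may be SCRAM-SHA-256/512 depending on the broker setup
sasl.mechanism=PLAIN
# JAAS login module with the SASL credentials (placeholders)
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="<sasl-user>" \
  password="<sasl-password>";
# Truststore containing the CA certificate that signed the broker's certificate
ssl.truststore.location=/path/to/kafka.client.truststore.jks
ssl.truststore.password=<truststore-password>
```

With SASL_PLAINTEXT, the `ssl.*` properties are ignored and the TLS handshake never happens, which matches the immediate disconnect seen during SEND_APIVERSIONS_REQUEST; switching to SASL_SSL makes the client negotiate TLS first and then authenticate.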
You confirmed that the connectivity issue was resolved: the solution was to use the SASL_SSL security protocol and to specify a truststore with a valid certificate.
I will mark this issue with the above Accepted Solution.
@SUMAN_GUMUDAVELLY we have already configured the external Kafka; the problem here is with a new Kafka instance under Records > SysAdmin > Kafka. The related documentation is here: Creating a Kafka configuration instance (pega.com). I am interested in the client properties configuration file that @BogdanD16582931 used to resolve the issue above.