Pega cloud log streaming to an external S3 bucket

The article Streaming Pega logs to an external Amazon S3 bucket describes what needs to be done to stream logs to an external S3 bucket. Does anyone have experience doing this? For the Pega files that are sent to S3, does it create multiple entries of the file with different timestamps in S3, based on the frequency configured for the file transfer?

Our requirement is to send the file delivered to S3 into a Kafka topic, which Splunk then uses to collate the log information. We cannot use the Splunk HEC for connecting.

@JohnM16741127 I can see that no one appears to have shared their experience with you.

Our documentation shows the following:

When you configure your Pega Cloud environment to stream log files to an Amazon S3 bucket, the logs are streamed in real time: as events are logged in your Pega application, they are immediately sent to your S3 bucket. This can result in multiple entries of log data with different timestamps, depending on when the events occurred. However, how the log data is organized or partitioned within your S3 bucket depends on your own S3 bucket configuration.

As for sending the log data from your S3 bucket to a Kafka topic for use by Splunk, this would be outside the scope of Pega Cloud’s log streaming feature. You would need to set up your own process or service to read the log data from your S3 bucket and send it to your Kafka topic. Please note that Pega Cloud’s log streaming feature is designed to work directly with Splunk via Splunk’s HTTP Event Collector (HEC).

:warning: This is a GenAI-powered tool. All generated answers require validation against the provided references.

Streaming Pega logs to an external Amazon S3 bucket > Client responsibilities