Ask the Expert - Understanding and Obtaining Log Files in Pega Cloud with Shawn Burrington

Join us in this month’s Ask the Expert as Shawn Burrington, Principal Product Manager, Pega Cloud Engineering, shares his knowledge of log files in Pega Cloud. He will also be here to answer your questions!

Make sure to Favorite and Follow for updates via the Notification bell and/or weekly digest!

Session dates: June 17 - June 28

Session opens to questions on June 17th!

Meet Your Expert:

Shawn Burrington is a Principal Product Manager specializing in Cloud Engineering and more specifically, Pega Cloud Logging. Shawn has been with Pega for 3 years and has over 12 years’ industry experience across Systems Engineering, Software Development, and Product Management domains. Shawn is passionate about working with users to drive value delivery and promoting collaborative engagement.

Message from Your Expert:

I’m excited to work with you and learn about how you use Pega Cloud Logging. Over the past few years, Pega Cloud Logging capabilities have evolved significantly, and there’s still much more to come. Don’t hesitate to ask questions or provide feedback during the session. This is a great opportunity to not only learn but also help shape the future of our Cloud Logging strategy!

Ask the Expert Rules

I’m so excited to interact with users of Pega Cloud Logging and discuss some of the logging features. Today we offer 4 core capabilities for clients and users to interact with Pega Cloud log files (outside of Infinity/Dev Studio):

  1. Log Downloads via My Pega Cloud Portal allow users to download log files on-demand for troubleshooting and analysis
  2. Log Streaming to Client Splunk (AWS only) allows users to have logs streamed in real-time to a client-managed Splunk instance
  3. Log Streaming to Client S3 Bucket (AWS only) allows users to have logs streamed in real-time to a client-managed AWS S3 bucket
  4. Log Streaming to Client Google Project (GCP only) allows users to have logs streamed in real-time to a client-managed Google Project

To learn more about the log files that we support via these features, check out this page: Understanding and obtaining Pega log files

In 2024, we’ll be making several major, foundational enhancements to our log streaming services to improve performance, security, and flexibility, all based on feedback from our clients and partners. We’ll also be releasing a new feature for Pega Cloud GCP users: Log Streaming to Client Splunk, which should be available within the next few months.

I’m excited to hear thoughts and learn more about your use-cases.

Shawn | Pega Cloud Product Management

@ShawnBurrington

Hi Shawn, thanks first for the documentation links provided.

The links mainly talk about the Pega-managed cloud system, where Pega takes care of the infrastructure, but my current project is a client-hosted environment, and there is also a need to use Splunk and configure the logs.

I need some better insights on how to manage this process and how best we can push the logs using Splunk.

By the way, we are currently using AWS with a containerised application running on EKS.

@BhanuPrasanthT

Nice to meet you! The services we support only apply to clients using Pega Cloud (on Pega-managed infrastructure). Regardless, I can try to offer some guidance for how you might achieve this on a self-hosted Pega solution. Pega Cloud is also hosted on fully containerized, Kubernetes-based infrastructure, so the use case is similar.

I’d suggest investigating a log forwarding solution like Fluent Bit to handle the tasks of aggregating and routing logs to your Splunk destination. Fluent Bit can be installed as a DaemonSet in your EKS cluster and ‘listen’ to logs coming via STDOUT and STDERR. It can also ‘tail’ any log files on the host OS and forward them to a destination of your choosing. In this case, you could leverage the Fluent Bit Splunk output plugin to simplify authentication and log routing to Splunk.
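As a rough sketch, a minimal Fluent Bit configuration for this pattern might look like the following. The host, port, token, and paths here are placeholders, not values from any real environment — you’d substitute your own Splunk HTTP Event Collector (HEC) endpoint and credentials:

```ini
# Tail container log files written by the Kubernetes node
[INPUT]
    Name          tail
    Path          /var/log/containers/*.log
    Tag           kube.*

# Forward matched records to Splunk via the HTTP Event Collector
[OUTPUT]
    Name          splunk
    Match         kube.*
    Host          splunk.example.com      # placeholder hostname
    Port          8088                    # default HEC port
    Splunk_Token  YOUR-HEC-TOKEN-HERE     # placeholder token
    TLS           On
```

Note this assumes HEC is enabled on the Splunk side; in a real deployment you’d typically also add a Kubernetes metadata filter and a parser suited to your container runtime’s log format.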

Besides Fluent Bit, there are a number of other container-compatible log aggregators, such as Logstash, Fluentd, and OpenTelemetry logging; this is just a suggestion based on our experience with Pega logs.

Other configuration recommendations depend on your specific logging requirements and determining which logs you’d want to analyze in Splunk. As an example, you might want to be mindful of log volume to ensure you don’t exceed your Splunk license limits.
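As one illustrative (and assumed, not prescriptive) way to manage volume with Fluent Bit, a grep filter can drop low-severity entries before they ever reach Splunk:

```ini
# Drop records whose 'log' field contains DEBUG before forwarding
[FILTER]
    Name     grep
    Match    kube.*
    Exclude  log DEBUG
```

Filtering at the forwarder keeps noisy debug output from counting against your Splunk ingest license, at the cost of those entries not being searchable later.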

Hopefully this helps.

Shawn

@ShawnBurrington

Thank you so much for being our expert this month and for sharing those links to the 4 core capabilities for clients and users to interact with Pega Cloud log files (outside of Infinity/Dev Studio)!

Thank you to everyone that took the time to check out this session!