Pega 8.6 CDH schedule fails with "Failed to create batch OfferBuilder"

@reddh

We are trying to run a schedule in CDH (Pega 8.6) for the email channel. The schedule completes, but the data flow (DF) throws the error below:

Failed to create batch OfferBuilder: Class Name is required for all context classes

Please find attached the entire error stack trace.

I checked the clipboard page, and the context dictionary is set correctly there (screenshot attached).

Also, inbound calls work fine with the same context dictionary setup.

Am I missing something?

error_stack.docx (15 KB)

Clipboard screenshot.docx (295 KB)

@IpsitaS3

How often does this issue happen? Can you reproduce it consistently on every outbound run, or does it happen sporadically?

You can try the following to see if it resolves your issue. If not, please create a Pega support incident so we can take a deeper look.

Instructions to clear the declare page and re-run:

  1. Clear the node-level declare page of D_CDHSubjectDictionary:

Open the data page named D_CDHSubjectDictionary, under the applies-to class PegaCDH-Data-SubjectDictionary, in Dev Studio with an operator who has CDH admin access.

Navigate to the "Load Management" tab after opening the data page rule.

Click the "Clear data page" button, select "Flush All", and submit.

  2. Try re-running the same outbound run that failed (re-schedule at the outbound-run level in the UI). If the environment is not production, you could also directly run the failed data flow run (something like PR-100) by navigating to the DSM landing page that lists the data flows and re-running the failed one. If it is production, try a new outbound run.

  3. If the issue is still reproducible after step 1 (clearing the data page), please create a Pega support incident.

Before raising the support incident, create and enable a log category for the class below, reproduce the issue, and attach the resulting logs to the incident so that we can gather more information.

Instructions before you raise a support ticket with Pega:

  1. In the developer portal, under the Records explorer, SysAdmin category, create a new log category. Name it something like "OfferBuilder", and add the following class under "Associate logger to category". Leave the default log level at 'ERROR':

com.pega.mkt.offer.OfferBuilder

  2. Launch Admin Studio and navigate to "Resources" → "Log categories". Look up the just-created log category named "OfferBuilder" and change the current log level to "DEBUG" (this enables debug-level logging for this class across all the nodes in the cluster).

  3. Re-run the outbound run, collect the logs from all nodes of the data flow node type, and attach them to the incident.

  4. Once the issue is reproduced and you have collected the logs, turn the log category in Admin Studio back to the "ERROR" level so that verbose logging does not run during every outbound run.
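Once the logs are downloaded locally, a quick sanity check before attaching them to the incident is to confirm that DEBUG entries from the OfferBuilder logger were actually captured. This is a minimal sketch, not part of any Pega tooling; the `PegaRULES-*.log` file naming is an assumption and may differ in your environment, so adjust the glob to match the files you downloaded.

```python
import glob
import re

# Logger class named in step 1 above; the file-name pattern below is an
# assumption -- adjust the glob to match the log files you downloaded.
LOGGER = re.compile(r"com\.pega\.mkt\.offer\.OfferBuilder")

def count_offerbuilder_lines(log_glob="PegaRULES-*.log"):
    """Return {path: number of lines mentioning the OfferBuilder logger}."""
    counts = {}
    for path in sorted(glob.glob(log_glob)):
        with open(path, encoding="utf-8", errors="replace") as fh:
            counts[path] = sum(1 for line in fh if LOGGER.search(line))
    return counts

if __name__ == "__main__":
    for path, n in count_offerbuilder_lines().items():
        print(f"{path}: {n} OfferBuilder entries")
```

If a node's log shows zero entries after a re-run, the DEBUG level probably did not propagate to that node, and it is worth re-checking the log category in Admin Studio before raising the incident.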

Hope this helps. Please post the outcome of your findings here; it would really help shape the product.

Hi Ipsita Saha,

Related to the above query, can you also reply at the earliest with whether your environment has the following DSS, and if so, its value (true or false)?

Pega-Engine: datapage/newgenpages

This will help identify the root cause faster.

You can look up dynamic system settings (DSS) under the Records explorer → SysAdmin → Dynamic System Settings.

@Sriram Krishnan

Hi Sriram,

Apologies for the delay.

My environment does not have the above setting at all. The only data-page-related setting available in my environment is the one below:

prconfig/datapages/mrucapacity/default

And to respond to your earlier questions:

  1. I get the same issue consistently; it is not sporadic.
  2. I tried all the above-mentioned steps to flush the data pages, but still no luck.

@IpsitaS3 -

Did you apply HFIX-81283? If not, can you please get that hotfix from GCS and apply it?

Note: after applying the hotfix, you need to save the context dictionary.

@reddh Could you share some details on this issue, or share the incident (INC) number? We are not able to view the details for this HFIX.