Pega needs to pass data to Kafka in Avro schema format.
The connectivity test for the Kafka connection and the Data Set succeeds.
However, at runtime the Data Set throws the exceptions below.
javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target at sun.security.ssl.Alert.createSSLException
Failed to save page to data set due an exception][STACK][com.pega.dsm.kafka.api.serde.avro.AvroSerdeException: org.apache.kafka.common.errors.SerializationException: Error serializing Avro message at com.pega.dsm.kafka.api.serde.avro.AvroValuesSchemaRegistrySerde.serialize(AvroValuesSchemaRegistrySerde.java:103) at com.pega.dsm.kafka.api.serde.LazyExceptionHandlingSerDe.serialize(LazyExceptionHandlingSerDe.java:66)
I get the same error, “unable to find valid certification path to requested target”. Could you please let me know where you set the certificate for the schema registry server?
After setting the certificate, we are seeing a new error message.
[MSG][Conversion of ClipboardPage to GenericRecord has failed.][STACK][org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: XXX Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Incompatible Avro schema; error code: 40901 at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:209) at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:235)
@KasunJ Validate your Avro schema and upload it in the correct format. Also check whether your organization allows updates to the Schema Registry once a schema is published; otherwise you may need to create a new Event Type and upload a new version of the schema.
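Before re-registering, you can also ask the registry itself whether your new schema is compatible with the latest registered version, via the Confluent Schema Registry REST API. The registry URL, subject name, and schema file below are placeholders for your environment:

```shell
# Hypothetical values; substitute your registry URL and subject name
# (for topic-based subjects this is typically "<topic>-value").
REGISTRY="https://schema-registry.example.com:8081"
SUBJECT="MyTopic-value"
URL="$REGISTRY/compatibility/subjects/$SUBJECT/versions/latest"

# new-schema.json must wrap the Avro schema as an escaped JSON string:
#   {"schema": "{\"type\":\"record\",\"name\":\"MyRecord\", ... }"}
curl -s --connect-timeout 5 -X POST \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data @new-schema.json \
  "$URL" || true
```

A compatible schema returns `{"is_compatible": true}`; an incompatible one corresponds to the `40901` error in the stack trace above. Under the default BACKWARD compatibility mode, adding a new field without a default value is a typical cause of that error.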