Hi,
what will be the impact if i raised the column size to 20,000 characters.
Why not encode before saving and decode when retrieving?
Can you share your requirement and why you need this? It's like saving a CLOB in the column, which will slow down inserts and updates.
hello @Bhargavj3227 ,
Raising a database column size to 20,000 characters is a big NO in most Pega applications. Irrespective of your requirement, bumping up the column size may solve your immediate ask, but it will eventually introduce significant performance, scalability, and maintenance risks, especially if the table is transactional. Encoding the data (for example, using Base64) does not alleviate these concerns; in fact, it typically increases the payload size by approximately 30–40%, resulting in an even greater performance impact during inserts, updates, and reads.
If this is truly unavoidable, the approaches below can be used with caution, ensuring that retrieval of the data occurs strictly on a need-only basis.
One reason comes to mind. I've had a scenario where I wanted to store the JSON response of a GenAI agent in a CLOB column. Even then, you want to make sure you're not reaching levels where it would generate a Pega alert. Ultimately, I support the approach suggested by @VikasT when there is even a remote chance of an output of 20k characters.