We have a requirement to extract a large amount of data (6 columns, 1,000 records) from a web application and send it back to the Pega BPM application for further processing. How do we achieve this?
Right now we are adding the new data to the JSON response that is sent to the Pega BPM application. Will the large volume of data breach the JSON character limit and cause the request to fail?
The customer does not want to go with file-based (CSV, Excel, etc.) data transfer.
What better alternatives are recommended in this scenario?
@KavinR16978504 A file transfer is the best option. Alternatively, you could add your data to Pega one row at a time as the bot gathers it. This is terribly inefficient if the bot gathers all the data at once; however, if the bot is performing automated steps to gather each piece of data, then sending the data as it is gathered shouldn't really add any extra time.
Another option might be to continue using JSON but break the request into parts. Write some logic to limit each request to X characters and split the data across Y requests. You'd need to do some work in your Pega application so that it knows when the data has been broken into parts, knows when all the parts have arrived, and can properly reassemble and process them.
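A minimal sketch of that split-and-reassemble idea, assuming a character limit per request. The names `MAX_CHARS`, `split_payload`, and `reassemble` are illustrative, not Pega APIs, and the real per-request limit depends on your environment:

```python
import json

MAX_CHARS = 50_000  # assumed per-request limit (the "X length" above)

def split_payload(records, max_chars=MAX_CHARS):
    """Greedily pack records into parts whose serialized size stays under max_chars."""
    parts, current = [], []
    for rec in records:
        current.append(rec)
        if len(json.dumps(current)) > max_chars and len(current) > 1:
            current.pop()            # last record pushed us over: start a new part
            parts.append(current)
            current = [rec]
    if current:
        parts.append(current)
    total = len(parts)
    # Each request carries part metadata so the receiver knows when it has everything.
    return [{"partIndex": i + 1, "totalParts": total, "records": chunk}
            for i, chunk in enumerate(parts)]

def reassemble(requests):
    """Receiver side: sort the parts and rebuild the full record list."""
    ordered = sorted(requests, key=lambda r: r["partIndex"])
    if ordered and ordered[-1]["totalParts"] != len(ordered):
        raise ValueError("not all parts have been received yet")
    return [rec for part in ordered for rec in part["records"]]
```

The `partIndex`/`totalParts` metadata is one simple convention for the Pega side to know when assembly can begin; any equivalent scheme works.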
Hello,
What's the size of the JSON in KB? The volume you mention (6 columns, 1,000 records) shouldn't be that big in plain text unless the columns themselves are huge. Even though JSON is not designed to pass megabytes of data, it should work if you are talking about a few MB. If the payload is larger than that, you can always split it into smaller parts, or define a multipart service in the Platform and use it to stream the data via a REST API call.
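A quick sanity check of the payload size for this scenario (6 columns, 1,000 records); the column names and the 40-character value width are assumptions for illustration only:

```python
import json

# Build a representative record: 6 columns, each value 40 characters (assumed widths).
record = {f"column{i}": "x" * 40 for i in range(6)}
payload = json.dumps([record] * 1000)  # 1,000 such records

size_kb = len(payload.encode("utf-8")) / 1024
print(f"approximate payload size: {size_kb:.0f} KB")
```

With these assumptions the serialized payload lands in the low hundreds of KB, i.e. well under a megabyte, which supports the point that 6 × 1,000 plain-text values is usually not a problem.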
I presume that you already have the data in memory and the Robot Runtime is handling it fine, which means the data is not that big. Otherwise, if you had to write the data to disk and then stream it back again, the operation would be highly inefficient.
Another option is to do the read operation from the web application in a controlled way, so you read and send the data in batches via REST API calls.
I think any option that avoids creating a file and writing it to disk, keeping the data in memory only, would be far more efficient (as long as you don't hold crazy amounts of MB in memory).
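A rough sketch of that controlled read-and-send approach: pull rows from the application and ship each batch as you go, keeping only one batch in memory at a time. `read_records` and the `send` callback are stand-ins here, not real Pega or application APIs:

```python
def read_records():
    """Stand-in for scraping rows from the client application one at a time."""
    for i in range(1000):
        yield {"id": i, "value": f"row-{i}"}

def send_in_batches(records, batch_size=100, send=lambda batch: None):
    """Accumulate records and hand off each full batch (e.g. one REST POST per batch)."""
    batch, sent = [], 0
    for rec in records:
        batch.append(rec)
        if len(batch) == batch_size:
            send(batch)
            sent += 1
            batch = []
    if batch:  # flush the final partial batch
        send(batch)
        sent += 1
    return sent
```

Because `read_records` is a generator, the full data set never has to exist in memory at once; the batch size is the knob that trades call count against per-request payload size.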
@ThomasSasnett Thank you for referring to the above article. Could you please help us with a little information on how file transfer works with Pega RDA (Runtime 19.1), and whether it is feasible for a Pega bot to Base64-encode the file contents and exchange them in the response to the Case?
Looking for suggestions on the best possible solution to implement the scenario below using Pega RDA (Robotic Desktop Automation):
The user invokes an attended robot from a Pega Case, sending input from the Pega Case to the attended Pega robot.
The robot launches the client application on the user's desktop.
The robot searches for records in the client application using the input sent from the Pega Case.
The robot needs to respond back to the Pega Case with an array of records formatted as a JSON string.
The response might vary from 1,000 to 10,000 records depending on the use case.
@SrimannarayanaB To transfer a file to a Pega case with the 19.1 version, you can use the REST Component or your own custom C#. You'd need to determine the endpoints to call to get a token and then pass that token in your request to upload the file. Using 19.1 for this is not advised, as we have drastically improved the experience in the latest 22.1 release. I am not sure what you mean by encoding the contents; presumably the delivery mechanism would still be JSON, and you'd run into the same length limitation mentioned above.
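On the Base64 question above: encoding does make binary file contents safe to embed in a JSON string, but it inflates the data by roughly a third, so it makes any JSON length limitation worse rather than better. A small illustration with arbitrary sample bytes:

```python
import base64

raw = bytes(range(256)) * 1024              # 256 KB of sample binary content
encoded = base64.b64encode(raw).decode("ascii")

ratio = len(encoded) / len(raw)
print(f"size after encoding: {ratio:.2f}x the original")  # ~1.33x

decoded = base64.b64decode(encoded)          # round-trips back to the original bytes
```

So if the payload is already pushing a length limit as plain JSON, Base64-encoding a file into that same response will only make the problem about 33% worse.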