Options & Best practices for large files and psftp - compression or zip?

What are the options and best practices for sending very large files to a Pega Cloud-hosted file listener via psftp? I am concerned about network latency and stability. The text should compress quite well. Can one use the standard psftp -C option to compress/decompress automatically? Could one zip/compress the data before sending it, then automatically (or periodically) decompress/unzip the data in the Pega Cloud repository?

How have others tackled very large input files?

@WERDA it does not look like anyone responded… Below is what our documentation recommends, but likely that is not what you were after.

I’ll list it here anyway, in case it is of use to anyone else with a similar question:

When sending very large files to a Pega Cloud-hosted file listener via SFTP, consider the following best practices:

  1. Use Pega Cloud SFTP service for secure file transfer. However, note that Pega Cloud SFTP service does not support automatic decryption or decompression of files.
  2. Compress the files before sending them to reduce their size and improve transfer speed. You can use standard compression tools like gzip or zip. However, you will need to decompress the files manually or configure a file listener to process the compressed files.
  3. For better performance and to avoid memory-related issues, consider using other file-based platform integration capabilities like Pega Cloud File storage or FTP Server data instances.
  4. Manage your data files according to your enterprise's business best practices using your preferred SFTP client. You can also use the Repository API to interact with your files or configure a file listener to process them.
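On the psftp -C question: that flag enables SSH-level compression of the connection only, so the file arrives byte-for-byte unchanged on the server; it does not produce or unpack a compressed file at rest. A minimal sketch of point 2 combined with a batch-mode psftp upload is below. The host name, user, and remote path are placeholders, not real Pega Cloud endpoints:

```shell
# Create a small sample text file (stand-in for a large extract)
printf 'id,value\n1,hello\n2,world\n' > data.csv

# gzip typically shrinks text files dramatically; -k keeps the original
gzip -kf data.csv

# Non-interactive upload via a psftp batch file; -C adds on-the-wire
# SSH compression on top (no effect on the stored file)
cat > put.batch <<'EOF'
put data.csv.gz /incoming/data.csv.gz
quit
EOF
if command -v psftp >/dev/null; then
  psftp -C -b put.batch user@sftp.example.com
fi
```

As noted in point 1, the server side will not decompress the archive for you, so a file listener or scheduled job would still need to gunzip the file before processing.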

Related documentation:

  - File attachment configuration in REST and SOAP integrations
  - Using Pega Cloud SFTP service
  - Using Pega Cloud File storage

If you did find out anything interesting outside of this PSC post, do let us know.