I am using Pega Infinity 25 Community Edition (cloud-hosted).
I have created a REST Connector that calls an OpenAI-compatible LLM endpoint. The connector works correctly when tested directly (returns 200 with expected response).
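For context, the request body an OpenAI-compatible chat-completions endpoint expects is well documented, and this is essentially what the Connect-REST rule posts. Below is a minimal sketch of that payload; `build_chat_payload`, the model name, and the default temperature are illustrative assumptions, not Pega APIs or values from my actual connector:

```python
import json

def build_chat_payload(prompt, model="gpt-4o-mini", temperature=0.2):
    """Build the JSON body a Connect-REST rule would POST to an
    OpenAI-compatible /v1/chat/completions endpoint.

    The model name and temperature are placeholders -- substitute
    whatever your connector is actually configured with.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

# Inspect the body the connector would send:
payload = build_chat_payload("Summarize this case in one sentence.")
print(json.dumps(payload, indent=2))
```

This is only the request shape for sanity-checking outside Pega; the connector mapping inside the Connect-REST rule does the equivalent work.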
Now I want to use this custom REST connector inside my GenAI Connect rule instead of the built-in Pega GenAI models.
However, when I open the GenAI Connect rule and go to the Advanced tab, I can only see the built-in GenAI models in the model dropdown. There is no option to select or configure a custom REST connector.
My Questions:
In Pega Infinity 25 Community Edition, is it possible to replace the built-in GenAI model with a custom REST Connector in a GenAI Connect rule?
If yes, what is the recommended/correct way to achieve this?
In Pega Infinity 25 Community Edition, you can’t directly replace the built-in GenAI models in a GenAI Connect rule with a custom REST Connector. The model dropdown in the Advanced tab is limited to models that are registered through Pega’s GenAI/LLM configuration layer, so REST connectors won’t appear there even if they work correctly on their own.
If your OpenAI-compatible endpoint is already working via REST, the recommended approach is to register it as an external LLM provider (if supported in your setup) rather than trying to plug the connector directly into the GenAI Connect rule. Once configured at the LLM/model level, it should then become selectable in the GenAI Connect rule.
Alternatively, depending on your use case, some teams route the REST call through a Data Page or Activity and integrate the response indirectly, but that’s more of a workaround than a native GenAI Connect configuration.
In Pega Infinity 25 Community Edition, you generally cannot point a GenAI Connect rule directly at an arbitrary standard Connect-REST rule from the Advanced tab. The GenAI Connect model picker is designed to show GenAI providers/models registered in Pega’s GenAI framework, not generic REST connectors, so a plain REST connector that works independently will usually not appear there.
If you want GenAI Connect to use your custom endpoint, the supported pattern is to create/configure it as a GenAI provider/connector inside Pega’s GenAI framework, then select that provider/model in the GenAI Connect configuration. A regular standalone REST connector is not, by itself, a selectable artifact for that dropdown.
The model dropdown in GenAI Connect is populated from GenAI provider/model configuration, not from the general pool of Connect-REST rules. One support answer on using your own LLM describes the path as: expose the model over HTTPS, create a GenAI provider/connector that calls it, test it, and then select the newly created provider/model from the GenAI model dropdown.