I am trying to use Azure OpenAI's GPT-4o-mini model, served through the SAP AI Core service on the SAP Business Technology Platform, while developing an agentic AI application with smolagents. The problem is that models in the SAP AI Core service are called like a plain REST API using the requests library for text generation, with bearer-token authentication: the token is passed in the headers of the requests.post call.
But when initializing a CodeAgent, or a LiteLLM-based model, only a model name and an OpenAI API key can be passed. So there appears to be no way to call or load an LLM model hosted in the SAP AI Core service.
Because of this we are unable to use the smolagents framework for our agentic AI application. Could anyone please help with this issue?
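
For reference, this is roughly how we call the deployment today (the endpoint URL, API version, resource group, and token below are placeholders for our actual values):

```python
import requests

# Placeholders for the actual SAP AI Core deployment details.
AI_CORE_CHAT_URL = "https://<ai-core-inference-host>/v2/inference/deployments/<deployment-id>/chat/completions?api-version=2024-02-01"
BEARER_TOKEN = "<token obtained from the AI Core OAuth client>"

response = requests.post(
    AI_CORE_CHAT_URL,
    headers={
        "Authorization": f"Bearer {BEARER_TOKEN}",  # bearer-token authentication
        "AI-Resource-Group": "default",             # SAP AI Core resource group
        "Content-Type": "application/json",
    },
    json={"messages": [{"role": "user", "content": "Hello"}]},
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```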
If you want to use an unsupported API as a model, you basically have to define a custom model. If there seems to be demand for it, you could also open a feature request in the smolagents issue tracker.
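
In practice that means wrapping your existing requests-based call in a model class that the agent can use. Below is a minimal, untested sketch: it assumes a recent smolagents version where a custom model subclasses `Model` and implements `generate()` returning a `ChatMessage` (older versions used `__call__` instead), and it assumes your SAP AI Core deployment exposes an OpenAI-style chat-completions endpoint. The class name, URL, resource group, and token are placeholders you would replace with your own values.

```python
import requests
from smolagents import CodeAgent
from smolagents.models import ChatMessage, Model


class SAPAICoreModel(Model):
    """Sketch of a custom smolagents model that calls a GPT-4o-mini deployment
    on SAP AI Core via its OpenAI-style chat-completions endpoint, using a
    bearer token in the request headers (hypothetical wrapper, not official)."""

    def __init__(self, deployment_url: str, token: str, resource_group: str = "default", **kwargs):
        super().__init__(**kwargs)
        self.deployment_url = deployment_url  # full chat-completions URL of the deployment
        self.token = token                    # bearer token from the AI Core OAuth client
        self.resource_group = resource_group

    def generate(self, messages, stop_sequences=None, **kwargs) -> ChatMessage:
        # smolagents may hand over dicts or ChatMessage objects, and content may be
        # a list of {"type": "text", "text": ...} chunks; flatten both into plain text.
        payload_messages = []
        for m in messages:
            role = m["role"] if isinstance(m, dict) else m.role
            content = m["content"] if isinstance(m, dict) else m.content
            if isinstance(content, list):
                content = "".join(chunk.get("text", "") for chunk in content)
            payload_messages.append({"role": role, "content": content})

        payload = {"messages": payload_messages}
        if stop_sequences:
            payload["stop"] = stop_sequences

        # Same bearer-token POST as in the question, just wrapped inside the model class.
        response = requests.post(
            self.deployment_url,
            headers={
                "Authorization": f"Bearer {self.token}",
                "AI-Resource-Group": self.resource_group,
                "Content-Type": "application/json",
            },
            json=payload,
            timeout=120,
        )
        response.raise_for_status()
        text = response.json()["choices"][0]["message"]["content"]
        return ChatMessage(role="assistant", content=text)


# Hypothetical usage; URL and token are placeholders for your deployment.
model = SAPAICoreModel(
    deployment_url="https://<ai-core-host>/v2/inference/deployments/<id>/chat/completions?api-version=2024-02-01",
    token="<bearer token>",
)
agent = CodeAgent(tools=[], model=model)
agent.run("What is the capital of France?")
```

The exact `generate` signature and message format can differ between smolagents releases, so check the "custom model" section of the docs for the version you have installed before relying on this.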