Is there any way to increase the API proxy rate limit?
# 🤝help
b
Our bot is a fairly complicated implementation that leans on OpenAI's newest API features, such as function calling. That means we use axios to make several POST requests to both OpenAI's API and our own API. I've been running into a lot of 429 "Too Many Requests" errors, and they happen at random points between API requests, so I don't believe they are coming from the APIs I'm calling but from the API proxy that Botpress runs the requests through.
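In case it helps with diagnosing, here's a minimal sketch of the kind of client-side retry we could wrap around those axios calls to soften the 429s. It's illustrative only, not our actual code, and the function name and endpoint in the usage comment are placeholders:
```ts
// Simplified sketch, not production code: retry an axios POST with
// exponential backoff whenever the response comes back 429.
import axios, { AxiosResponse } from "axios";

async function postWithBackoff(
  url: string,
  body: unknown,
  maxRetries = 5
): Promise<AxiosResponse> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await axios.post(url, body);
    } catch (err) {
      const is429 = axios.isAxiosError(err) && err.response?.status === 429;
      if (!is429 || attempt >= maxRetries) {
        throw err;
      }
      // Back off 1s, 2s, 4s, ... plus a little jitter before retrying.
      const delayMs = 1000 * 2 ** attempt + Math.random() * 250;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}

// Usage (endpoint is just an example):
// const res = await postWithBackoff("https://api.openai.com/v1/chat/completions", payload);
```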
bump!
g
^This is making the stakeholders question using Botpress. We're only in our testing phase, and if it can't handle our test load, it won't be able to handle our production load.
I should clarify: as Frankie said, our bot is pretty complex. On every turn we are essentially calling out to Azure, OpenAI, and AWS to fulfill the request. We can't find any documentation on where the limits are. Any help in this area would be appreciated.
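Since we can't see where the proxy limit sits, one thing we may try on our end is capping how many of those outbound calls run at the same time in a given turn. A rough sketch of a tiny concurrency limiter (purely illustrative; `createLimiter` and the call names in the usage comment are made up):
```ts
// Rough sketch, purely illustrative: cap how many outbound requests
// (Azure, OpenAI, AWS, etc.) run concurrently in a single turn so they
// don't all hit the proxy at the same instant.
function createLimiter(maxConcurrent: number) {
  let active = 0;
  const queue: Array<() => void> = [];

  const release = () => {
    const resume = queue.shift();
    if (resume) {
      // Hand the freed slot straight to the next waiting task.
      resume();
    } else {
      active--;
    }
  };

  return async function limit<T>(task: () => Promise<T>): Promise<T> {
    if (active >= maxConcurrent) {
      // Wait until a running task releases its slot.
      await new Promise<void>((resolve) => queue.push(resolve));
    } else {
      active++;
    }
    try {
      return await task();
    } finally {
      release();
    }
  };
}

// Usage (call names are made up):
// const limit = createLimiter(2);
// await Promise.all([
//   limit(() => callAzure()),
//   limit(() => callOpenAI()),
//   limit(() => callAws()),
// ]);
```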
a
@colossal-egg-20510 do you know?
c
Sure. This rate limit is in place to prevent any bot from getting out of hand and to keep someone with bad intentions from DDoSing our system. Also, the screenshot you sent me is from the Studio. The rate limit in the Studio is a lot lower than for the published bot, since user interactions always come in at a lower rate. Maybe we were too aggressive with our limit 😅. Those limits can easily be increased if necessary for your use case.
g
Thanks for clarifying, especially the difference between the Studio and the published bot (we have, of course, been using the Studio a lot to troubleshoot prompts). @brainy-ram-43776 have we seen this in the published bot at all? I don't think I've encountered it.
b
I have not encountered any of the same issues with the published version of our bot. I totally understand the need to prevent misuse of the system, but I suspect the Studio rate limiting is too aggressive, especially for advanced use cases where you may make several API calls per turn of the conversation.