Hi all, I am putting together an on-prem Q&A chat solution for a school as a community service. The requirement is simple: they would like an on-prem Q&A bot for students and parents that runs inside a closed environment (the entire solution should be installed in their environment and should not connect out).
I have put together an open-source LLM endpoint which I would like to hook into Botpress running on a server. Basically, I would like to create an AI task that connects to my own LLM endpoint instead of the OpenAI endpoint used by the cloud Botpress instance. They have an AWS VPC set up at the moment.
The institution is very particular about having a closed setup and wants to use its own LLM instead of OpenAI endpoints.
Can you please guide me on how I can implement an AI task in on-prem Botpress?
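For context, this is roughly what I mean by "connecting to my own LLM endpoint": a minimal sketch of a custom action that posts to a self-hosted, OpenAI-compatible inference server inside the VPC. The endpoint URL, model name, and response shape below are assumptions about my setup, not anything Botpress provides.

```typescript
// Hypothetical sketch: call a self-hosted, OpenAI-compatible LLM endpoint
// from custom bot code. The URL, model name, and payload shape are
// assumptions -- adjust to whatever your inference server actually exposes.

const LLM_ENDPOINT = 'http://10.0.0.5:8000/v1/chat/completions' // stays inside the VPC

export async function askLocalLlm(question: string): Promise<string> {
  const response = await fetch(LLM_ENDPOINT, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'local-model', // whatever name the local server registers
      messages: [
        { role: 'system', content: 'You answer questions for students and parents of the school.' },
        { role: 'user', content: question }
      ],
      temperature: 0.2
    })
  })

  if (!response.ok) {
    throw new Error(`LLM endpoint returned ${response.status}`)
  }

  const data = await response.json()
  // OpenAI-compatible servers return the answer in choices[0].message.content
  return data.choices[0].message.content
}
```

The question is whether an AI task in Botpress can be pointed at something like this instead of OpenAI.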
acceptable-kangaroo-64719
07/05/2023, 10:32 AM
Hey @early-crayon-77735, there is no on-prem version of Botpress Cloud, nor do we have a way to integrate your own LLM.