How to run inference with an external LLM (not the i...
# 🤝help
r
Although I can make an API call to an external LLM service, I found no way to enable streaming the way your internal AI agent does. This would be very beneficial instead of waiting for the whole API call to resolve, which can sometimes take more than a minute. Streaming this data would be great; in this case you'd need to create a kind of special component that can show results as they stream in.
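For context, this is roughly the pattern the poster is describing on the calling side: a minimal sketch of consuming a streamed completion, assuming an OpenAI-compatible endpoint. The URL, model name, and `LLM_API_KEY` variable are placeholders, not Botpress APIs.

```typescript
// Hypothetical sketch: stream tokens from an OpenAI-compatible endpoint
// instead of waiting for the whole completion to resolve.
async function streamCompletion(prompt: string): Promise<void> {
  const response = await fetch('https://api.example.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.LLM_API_KEY}`,
    },
    body: JSON.stringify({
      model: 'example-model',
      stream: true, // ask the provider to send tokens as they are generated
      messages: [{ role: 'user', content: prompt }],
    }),
  })

  if (!response.ok || !response.body) {
    throw new Error(`LLM request failed: ${response.status}`)
  }

  // The body arrives incrementally as server-sent events; decode and show
  // each chunk as it comes in rather than after the full call resolves.
  const reader = response.body.getReader()
  const decoder = new TextDecoder()
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    process.stdout.write(decoder.decode(value, { stream: true }))
  }
}
```

A real consumer would additionally parse the `data:` lines of the SSE payload into individual token deltas; the point here is only that output can be displayed as it arrives.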
f
Hey there! Streaming is currently not possible in Botpress.
r
Thanks for the clarification. I would love to see that feature: it would make for a much better user experience and allow offloading complexity to other layers, such as an external LLM wrapper, which is what I'm doing. PS: where can I vote for or request features?
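For readers landing on this thread: the wrapper layer the poster mentions can be a thin service that the bot calls with a single awaited request, keeping prompt handling and provider details out of the bot itself. Below is a minimal sketch, assuming Express and an OpenAI-style response shape; the endpoint, model name, and field names are placeholders.

```typescript
import express from 'express'

// Hypothetical LLM wrapper service: Botpress awaits one plain HTTP call,
// while provider-specific complexity lives in this layer instead.
const app = express()
app.use(express.json())

app.post('/complete', async (req, res) => {
  const response = await fetch('https://api.example.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.LLM_API_KEY}`,
    },
    body: JSON.stringify({
      model: 'example-model',
      stream: false, // the bot must wait for the full answer anyway
      messages: [{ role: 'user', content: req.body.prompt }],
    }),
  })
  const data = await response.json()
  // Return only what the bot needs; retries, prompt templates, and
  // provider switching can all be handled here rather than in the bot.
  res.json({ text: data.choices[0].message.content })
})

app.listen(3000)
```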